A/B Testing

Running two variants of a page element simultaneously to measure which performs better, using statistical significance to determine the winner.


A/B Testing explained

An A/B test shows version A to half of your visitors and version B to the other half, then measures which converts better. It's the gold standard for data-driven optimization, but only when you have enough traffic to reach statistical significance.
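The 50/50 split is typically done deterministically, by hashing a visitor ID, so the same person always sees the same variant across visits. A minimal sketch (the function and experiment names here are illustrative, not any particular tool's API):

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "headline-test") -> str:
    """Deterministically bucket a visitor into A or B (50/50 split).

    Hashing (experiment, visitor_id) keeps assignments stable across
    visits and independent across different experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

Hashing instead of calling a random generator means returning visitors get a consistent experience without the server storing any assignment state.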

The traffic threshold most teams ignore: to detect a 10% relative improvement with 95% confidence, you typically need 1,000+ conversions per variant. If your page gets 500 visitors/month and converts at 3%, that's 15 conversions/month split across two variants; at that rate the test would take more than a decade to finish. Below this threshold, A/B testing is a waste of time. Use analysis tools and best practices instead.
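That threshold can be estimated with Lehr's rule of thumb (roughly 16·p(1−p)/δ² visitors per variant, for 80% power at a two-sided 95% confidence level). This is a back-of-the-envelope sketch, not an exact power calculation:

```python
import math

def visitors_per_variant(baseline_rate: float, relative_lift: float) -> int:
    """Approximate visitors needed per variant via Lehr's rule
    (80% power, two-sided alpha = 0.05)."""
    delta = baseline_rate * relative_lift  # absolute lift to detect
    return math.ceil(16 * baseline_rate * (1 - baseline_rate) / delta ** 2)

# The article's example: 3% baseline, detecting a 10% relative lift
n = visitors_per_variant(0.03, 0.10)
```

For a 3% baseline and 10% relative lift this lands around 50,000 visitors per variant, i.e. on the order of 1,500 conversions per variant, consistent with the 1,000+ figure above.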

What to test first

Test the highest-leverage elements first: headline (biggest impact), CTA copy and placement (second biggest), hero section layout, and social proof positioning. Don't start with button colors — the "red vs. green button" debate is the least interesting question in CRO. Test the words, not the colors.

One variable at a time. If you change the headline, CTA, and hero image simultaneously, you won't know which change caused the result. Multivariate testing exists but requires even more traffic.
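Once the test has collected data, a two-proportion z-test is one standard way to check whether the difference is significant. A minimal stdlib-only sketch (the function name is mine):

```python
import math

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for the difference between two conversion rates.

    Returns (z, p_value); p_value < 0.05 meets the usual 95% bar.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF, via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value
```

Note that even a jump from 30/1000 to 45/1000 conversions (a 50% relative lift) fails to reach p < 0.05, which echoes the traffic threshold above: small samples can't confirm even large-looking differences.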
