16 Effective Methods to Make the Most Out of A/B Testing

By Scott Gerber, business.com writer | Last Modified Jul 05, 2019
1. Begin with a hypothesis.
2. Have success markers.
3. Do research.
4. Use an all-encompassing approach.
5. Test for mobile.
6. Use Google Optimize.
7. Give it time to get complete results.
8. Segment your results.
9. Use visuals and insights.
10. Remain unbiased.
11. Follow the 45/45/10 rule.
12. Don't be afraid of complexity.
13. Test both variations at once.
14. Start with a clear idea.
15. Use the right software.
16. Get user feedback.

When it comes to deciding how your website or mobile app should look, or fine-tuning a marketing strategy or online advertising plan, rather than guessing what you think is the best option, you are better served conducting an A/B test. A/B testing gives businesses the opportunity to test out various options to see which resonates most with their target audience.

There are many ways to go about A/B testing and many elements that you can test for. So what kinds of approaches do you need to remember in order to make the most of the testing process? To help you understand the best ways to conduct A/B testing, members of the Young Entrepreneur Council (YEC) share what strategies they find most valuable.

1. Begin with a hypothesis.

"To make the most of A/B testing, I always start with a hypothesis. I generate a hypothesis based on a regular analysis of my site to spot potential problems. I also use qualitative polling, surveys and usability tests to better understand what customers are actually struggling with. Once I've compiled the issues, I prioritize them and then decide what solutions should be tested." ‒ Shu Saito, Fact Retriever

2. Have success markers.

"It's important to test one variable at a time, but with a reasonable measure of success. You won't completely solve the problem by changing one variable. Figure out how much improvement is a success for the variable you're testing. For each variable, depending on its importance, shift your measure of success accordingly and then test. Decide based on the results which version is performing the best." ‒ Abeer Raza, TekRevol
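The "success marker" idea above can be sketched in a few lines of Python: pick the minimum relative lift that counts as a win before the test starts, then check the observed lift against it. The threshold and conversion numbers below are illustrative, not from the article.

```python
# A minimal sketch of a predefined success marker for an A/B test.
# All figures here are made-up example data.

def relative_lift(control_rate: float, variant_rate: float) -> float:
    """Return the variant's improvement relative to the control."""
    return (variant_rate - control_rate) / control_rate

# Success marker chosen *before* the test: at least a 10% relative lift.
SUCCESS_THRESHOLD = 0.10

control_conversions, control_visitors = 120, 4000   # 3.0% conversion
variant_conversions, variant_visitors = 150, 4000   # 3.75% conversion

lift = relative_lift(control_conversions / control_visitors,
                     variant_conversions / variant_visitors)

print(f"Relative lift: {lift:.1%}")
print("Success" if lift >= SUCCESS_THRESHOLD else "Not a clear win")
```

Deciding the threshold up front keeps the call objective: the result either clears the bar you committed to or it doesn't.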

3. Do research.

"Before we implement any A/B testing on our site, we like to research sites that are similar to ours or companies that have tried the same testing metrics we want to use. This additional research helps us determine and prioritize our A/B tests so that we are not going down a path that has already been explored and yielded lackluster results." ‒ David Henzel, LTVPlus

4. Use an all-encompassing approach.

"The best comprehensive method for A/B testing is to clearly understand the demographic you're trying to reach, create specific goals for your tests and come up with an accurate, on-point hypothesis to test. Then analyze the results and put them into action. This works because it's basically all-encompassing." ‒ Andrew Schrage, Money Crashers Personal Finance

5. Test for mobile.

"Never forget to run A/B tests for mobile components of your website, campaign or landing page. Many users will scour your content via their mobile phones or other devices, so it's important to make sure that those are always optimized for mobile. Otherwise, it could cost you conversions, subscribers and customers." ‒ Chris Christoff, MonsterInsights

6. Use Google Optimize.

"Out of all the solutions for conducting A/B testing, I find Google Optimize to be the best. It is free and already integrated with Google Analytics. There is little to no learning curve to set up your tests, and you can measure your A/B tests using your current analytics goals. It is the best tool because you get a complete picture of the results to make the best decision." ‒ Brian Greenberg, True Blue Life Insurance

7. Give it time to get complete results.

"Marketers love looking at data from just a couple of days and assuming they have the answer regarding split tests. The truth is that you need much more time to ensure you're looking at complete results. We like to let our A/B tests run for months sometimes so we can better gauge what our customer base wants from our product or service." ‒ Syed Balkhi, WPBeginner
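A rough sample-size calculation shows why a couple of days is rarely enough. The sketch below uses the standard two-proportion approximation (80% power, 5% two-sided significance); the baseline rate and lift are illustrative assumptions, not figures from the article.

```python
# Approximate visitors needed per variation to detect a given relative
# lift. Baseline rate and minimum lift below are illustrative.
import math

def required_sample_size(base_rate: float, min_lift: float,
                         z_alpha: float = 1.96,   # 5% two-sided significance
                         z_beta: float = 0.84) -> int:  # 80% power
    """Approximate sample size per arm for a two-proportion test."""
    p1 = base_rate
    p2 = base_rate * (1 + min_lift)
    p_bar = (p1 + p2) / 2
    delta = abs(p2 - p1)
    n = ((z_alpha + z_beta) ** 2) * 2 * p_bar * (1 - p_bar) / delta ** 2
    return math.ceil(n)

# Detecting a 10% relative lift on a 3% baseline takes serious traffic:
n = required_sample_size(base_rate=0.03, min_lift=0.10)
print(f"~{n} visitors per variation")
```

At a few hundred visitors a day, a result like this takes months to reach, which is exactly why tests called early so often mislead.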

8. Segment your results.

"Many times the mistake is made of looking at A/B test data and calling it in favor of the winning arm without digging deeper. When we look at test data, we segment it by other attributes or events. For example, we've found some test arms win on desktop, but lose on mobile. Take the time to segment the data to pull out additional insights and maximize conversion rates." ‒ Colton Gardner, Neighbor
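Segmenting a test's results can be sketched in plain Python. Here each record is assumed to carry an arm, a device type and a conversion flag; the events are made-up sample data, and a real test would have far more of them.

```python
# A minimal sketch of segmenting A/B results by device.
# Each event: (arm, device, converted). Data here is illustrative.
from collections import defaultdict

events = [
    ("A", "desktop", 1), ("A", "desktop", 0), ("A", "mobile", 0),
    ("B", "desktop", 0), ("B", "mobile", 1), ("B", "mobile", 1),
]

totals = defaultdict(lambda: [0, 0])  # (arm, device) -> [conversions, visits]
for arm, device, converted in events:
    totals[(arm, device)][0] += converted
    totals[(arm, device)][1] += 1

for (arm, device), (conv, visits) in sorted(totals.items()):
    print(f"{arm}/{device}: {conv}/{visits} = {conv / visits:.0%}")
```

Reading the per-segment rates side by side is what surfaces the "wins on desktop, loses on mobile" pattern that a single aggregate number hides.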

9. Use visuals and insights.

"While data is incredibly useful, it can be misleading at times, too. Whenever we're running A/B tests, we use a heatmap tool such as Crazy Egg and insights from people with some familiarity with our industry, which we source using Upwork or UserTesting. This eliminates a lot of the inherent biases we naturally have and provides useful insights for us to optimize our landing pages." ‒ Brandon Pindulic, OpGen Media

10. Remain unbiased.

"The purpose of A/B testing is to find what happens with an average, neutral user. If you create the test with a bias, you're likely to skew the results by creating an option that overly points the user toward your preferred outcome, instead of finding a completely unbiased and truthful outcome. Be sure that you're not designing your study to find the answer that you want, but rather what works." ‒ Anthony Saladino, Kitchen Cabinet Kings

11. Follow the 45/45/10 rule.

"For web A/B testing, there are myriad people with disabled cookies, slow connections and other technical details that will skew your results with a traditional 50/50 test. You want to show 10% of visitors the original page, send 45% to test A and 45% to test B. The people who cannot load the test will automatically be shown the original copy (the 10% segment), and your data will be more accurate." ‒ Karl Kangur, MRR Media
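The traffic-split half of the 45/45/10 rule can be sketched as a weighted bucket assignment. This is only the allocation logic; the fallback behavior for visitors who fail to load the test would live in the testing tool itself.

```python
# A sketch of a 45/45/10 traffic split: 45% to each variation,
# 10% deliberately held on the original page.
import random

def assign_bucket(rng: random.Random) -> str:
    r = rng.random()
    if r < 0.45:
        return "A"
    if r < 0.90:
        return "B"
    return "original"  # the 10% safety segment sees the untouched page

rng = random.Random(42)  # seeded so the demo is reproducible
counts = {"A": 0, "B": 0, "original": 0}
for _ in range(10_000):
    counts[assign_bucket(rng)] += 1

print(counts)  # roughly 4,500 / 4,500 / 1,000
```

Visitors whose browsers can't run the test script simply fall through to the original page, so they land in the same bucket as the deliberate 10% rather than contaminating either test arm.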

12. Don't be afraid of complexity.

"When running A/B tests with anything like landing pages or ads, I originally ran sequential tests, only varying one factor at a time (i.e., color, font, placement of a web button), but quickly realized multivariate testing is more accurate. By adding complexity, you can understand how combinations of factors work together to drive the preferred option. Start slow, then build up to multivariate testing." ‒ Jared Weitz, United Capital Source Inc.

13. Test both variations at once.

"When A/B testing, if you run the first variation during the first two weeks of the month and the second variation during the last two weeks, those are different periods and will give you different results. The month, the day of the week and the time all matter. So make sure you're testing both variations simultaneously to get accurate results." ‒ John Turner, SeedProd LLC
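One common way to guarantee both variations run at the same time is to hash each visitor into an arm on arrival, rather than scheduling the arms back to back. The sketch below assumes every visitor has a stable ID (a cookie or account ID); the IDs here are hypothetical.

```python
# A sketch of simultaneous assignment: hashing a stable visitor ID
# splits traffic 50/50 and keeps each visitor in the same arm on
# every return visit. User IDs below are made up.
import hashlib

def assign_arm(user_id: str) -> str:
    """Deterministically split visitors 50/50, stable across visits."""
    digest = hashlib.sha256(user_id.encode()).digest()
    return "A" if digest[0] < 128 else "B"

arms = [assign_arm(f"user-{i}") for i in range(1000)]
print("A:", arms.count("A"), "B:", arms.count("B"))  # roughly even
```

Because assignment depends only on the visitor, not the calendar, both arms see the same mix of weekdays, weekends and times of day, which removes the timing bias the quote warns about.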

14. Start with a clear idea.

"It's important to start with a good control version of your ad or landing page, then create multiple new variations to test against that. These variations have to perform better than the baseline results of your control. Test anything from your messaging, call to action, button color and the location of the call to action to see if it drives better results than your control." ‒ Andy Karuza, FenSens

15. Use the right software.

"For our business, we use OptinMonster to perform A/B tests on all our campaigns, which I believe is a huge reason why they're so successful. Using the right tools for your company can make or break its progress, so it is important to invest in tools and software that will give you the results you're looking for, whether that's now or further down the road when the company has grown more." ‒ Jared Atchison, WPForms

16. Get user feedback.

"While A/B testing will show you what works and what doesn't, it doesn't tell you the reason why. Along with A/B testing, get feedback from real users as well. Getting feedback from your customers through an online survey will allow you to get insights into why something didn't work for them. The results from your A/B testing and customer feedback surveys will give you the full picture." ‒ Stephanie Wells, Formidable Forms

Scott Gerber
Scott Gerber is the founder of Young Entrepreneur Council (YEC), an invite-only organization comprised of the world’s most promising young entrepreneurs. In partnership with Citi, YEC recently launched BusinessCollective, a free virtual mentorship program that helps millions of entrepreneurs start and grow businesses. Gerber is also a serial entrepreneur, regular TV commentator and author of the book Never Get a “Real” Job.