The ABCs of A/B Testing


BEST PRACTICES SERIES

HOW A/B TESTING HELPS PUBLISHERS

A/B testing isn't just for ecommerce giants and tech titans. Electronic publishers and content providers everywhere should be using A/B testing tactics to improve their online offerings, says Sagar Patel, digital director for Santy. "This can include title content, how much information to include about authors, implementation of comment engines, the sweet spot for the amount of body copy, and the use of complementary images: anything that will allow publishers to receive more eyeballs on their content," he says.

David Nelson, digital analytics practice lead for eClerx, says modern A/B testing tools allow for testing of multipage experiences and can be targeted to specific segments of customers. "Digital publishers have an objective that is easy to understand, yet difficult to execute: How do I make it simple for visitors to find and consume the content that interests them the most?" says Nelson. "Content providers should test everything that could impact the ability of visitors to find content they are looking for, including content length, tone and title, search algorithms, content positioning, organizational filters, and behavior targeting."

Epublishers can post content, track visitors' statistics, and be pleased with the results, but simply tracking these statistics does little to show how to improve their efforts, says Simon Slade, CEO and co-founder of Doubledot Media. "A/B testing is necessary for comparing your current tactics with new ideas," Slade adds. "Only by testing one against the other can you determine which is the most successful. If you fail to test new ideas, you could be missing out on serious income potential."

Carlos Abisambra, CEO and founder of VORTICE SERVICES, agrees that regular A/B testing is a must for online content providers nowadays. "You are no longer competing on content alone, but also on how that content is presented," says Abisambra. "By not making the flow of information on your site more attractive to what your customers are expecting, you are at risk of losing viewership to a competitor that learned more about your customers via A/B testing."

GETTING STARTED

For those who are new to A/B testing, the best advice is to start small and graduate slowly to more complex testing. Many experts recommend following this simple A/B testing formula for best initial success:

  1. Choose a metric you want to improve, and form a hypothesis (e.g., "Bigger but shorter headlines will increase length of visit on the page").
  2. Determine two designs/pages you want to test: the original page and the revised page.
  3. Aim for test results that are statistically significant (at least a 95% confidence level).
  4. Select a tool or a service that specializes in A/B testing to conduct the test.
  5. Run the test for at least 7 days so that your sample covers a full weekly traffic cycle and enough visitors.
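The significance check in step 3 can be sketched as a standard two-proportion z-test. The function name and the sample figures below are illustrative assumptions, not taken from any particular testing tool:

```python
import math

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF (built from math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 200 conversions from 10,000 visitors vs. 260 from 10,000 (illustrative numbers)
z, p = ab_significance(200, 10_000, 260, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}, significant at 95%: {p < 0.05}")
```

A p-value below 0.05 corresponds to the 95% confidence level mentioned in step 3; most commercial tools run an equivalent calculation for you behind the scenes.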

Fortunately, most A/B testing tools can be easily deployed, often by adding a single line of code to a page. Thanks to services such as Optimizely, Maxymiser, and Google Analytics' free Content Experiments, A/B testing has become much easier because there's a system in place to help you create that code, determine how much traffic to send to each version of a given page, collect the data, and compare the results.
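Under the hood, splitting traffic between versions usually means bucketing each visitor into a variant. One common approach, sketched below with a hypothetical helper, hashes a stable visitor ID so that returning visitors always see the same version:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str, split: float = 0.5) -> str:
    """Deterministically bucket a visitor into variant 'A' or 'B'.
    Hashing the visitor ID keeps the assignment stable across page
    views, so a returning visitor never flips between versions."""
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # map hash to a uniform value in [0, 1]
    return "A" if bucket < split else "B"

print(assign_variant("visitor-42", "headline-test"))
```

Adjusting `split` changes how much traffic each version receives, which is the same dial the hosted tools expose in their dashboards.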

Tyler Roehmholdt, senior manager of marketing technology at Campaign Monitor, says the most common elements to test are subject line, section titles, article length, calls to action, and header images. You could even test two totally different designs to learn which one garners the most clicks, says Roehmholdt. For instance, he says that you can attempt the following:

  • Test two different topics on the subject line to determine what content gets the most attention from subscribers.
  • Apply personalization to identical subject lines to find out if a first name greeting elicits a more desirable response.
  • Gauge what type of promotion is more popular: "free shipping" versus "20% off," for example.

"Start with a simple test that modifies some content on a single webpage," Nelson says. "To be safe, avoid testing important pages that carry the greatest risk, if errors were to occur. The focus of the first several tests is to learn about the testing process and the capabilities of optimization tools. Once the testing process has been ironed out, complex tests that impact user experience beyond a single page can be launched."

Drew Burns, principal product marketing manager for Adobe, recommends focusing first on your site's high-traffic, high-impact locations, such as a home- or landing page, where companies often fall victim to over-cluttering with carousels of banners. Another high-impact area to test is near your point of conversion, including the checkout funnel or the "subscribe now" button or page. "You have an engaged audience at this point in the customer journey, so optimizing or even personalizing the experience can be immensely impactful in terms of your conversion goal and other metrics that are most important to your business," Burns adds.

BE PREPARED FOR BUMPS IN THE ROAD

Be forewarned: Conducting A/B testing isn't easy or foolproof. If your experiment isn't set up carefully, the results can be frustrating and inaccurate, and when it comes to making changes to your site, success isn't guaranteed. For example, poor data analytics can be a major challenge. "Very few marketers have taken any advanced statistics classes, and the math behind A/B testing is not as simple as determining which conversion number is larger," says Abisambra. "Fortunately, there are many tools readily available online that will do the math and spit out answers for you."
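To see why the math goes beyond comparing two conversion counts, consider how many visitors a test needs before a difference is trustworthy. The sketch below is a standard power calculation; the helper name and the default z-scores (95% confidence, roughly 80% power) are illustrative assumptions, not any specific tool's formula:

```python
import math

def sample_size(base_rate: float, lift: float,
                z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Estimate visitors needed per variant to detect a relative
    `lift` over `base_rate` at ~95% confidence and ~80% power."""
    p1 = base_rate
    p2 = base_rate * (1 + lift)
    # variance of both proportions under their own rates
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2
    return math.ceil(n)

# detecting a 10% relative lift on a 2% conversion rate takes tens of
# thousands of visitors per variant, far more than intuition suggests
print(sample_size(0.02, 0.10))
```

The smaller the lift you hope to detect, the more visitors you need, which is why declaring a winner after a day of traffic is usually premature.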

Being overzealous is another stumbling block. "We see it frequently with companies that run a couple of experiments, see great results, and then want to throw everything at the [site]. When the 'everything-in-the-kitchen-sink' experiment is completed, they are not able to attribute the lift, or lack thereof, to any single change that they made," Abisambra says. "Companies need to learn to walk before they run, so start by testing one item at a time."

Another problem is getting a sufficiently wide-angle view of your target audience. "A/B testing gives a limited view of the customer, as it only tests the average online visitor rather than a specific individual or robust segment," says Cory Munchbach, director of product marketing for BlueConic. "Instead, defining the audience by segmenting them into a number of complex categories gives marketers a more comprehensive understanding of the customer." If you test and optimize in real time instead of using A/B and multivariate testing, you can immediately start sending the majority of visitors to the better-performing variants while still testing various options in order to get optimal results, Munchbach says.

While you may be excited about testing and changing your site, don't expect the same enthusiasm from users, who can be resistant to change. "Longtime users who receive the new experience will take time to familiarize themselves with the new layout. Be patient to see how returning visitors react to the new site over time," Nelson cautions.
