Born with astigmatism that continually worsened throughout my childhood, I quickly became acquainted with my local optometrist through frequent checkups to update my prescription. In retrospect, narrowing down and pinpointing an eye prescription must have been a tedious task, especially when the test subject is an easily distracted six-year-old. Nevertheless, my eye doctor took the time to go over small quirks in my vision. He would isolate my vision to just one eye, administer lens samples of slightly differing strength, and use my "yes" or "no" feedback to prepare another adjustment, eventually refining the results into the correct prescription. This style of experimentation, simple variation with immediate feedback, can be employed in a multitude of contexts, but it is arguably most applicable in the field of digital marketing.
Indeed, the beauty of A/B testing is its effectiveness in answering the simple questions digital marketers ask on a daily basis. If a marketer wonders whether a slight change in a webpage (wording, color, etc.) will positively influence a visitor's perception, A/B testing allows them to isolate the variable and test it against a control.
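In practice, isolating a variable against a control starts with splitting visitors consistently between the two versions. Here is a minimal sketch of one common approach, hash-based bucketing; the 50/50 split, bucket names, and visitor IDs are illustrative assumptions, not any particular vendor's implementation.

```python
import hashlib

def assign_bucket(visitor_id: str, split: float = 0.5) -> str:
    """Deterministically assign a visitor to 'control' or 'variation'.

    Hashing the visitor ID keeps the assignment stable across page loads,
    so a returning visitor always sees the same version of the page.
    """
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    fraction = int(digest, 16) / 16 ** len(digest)  # map hash to [0, 1)
    return "variation" if fraction < split else "control"

print(assign_bucket("visitor-123"))  # same ID always yields the same bucket
```

Because the split is a pure function of the visitor ID, no server-side session state is needed to keep the experience consistent.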
Optimizely is a leader in A/B testing, bringing in big-name clients and providing educational content for prospective digital marketers. The first part of their post explains the basics: visitors can react positively or negatively to a subtle variation on an original webpage, like a change in button color or the wording of a banner ad. Although such changes may seem minute, they incrementally effect real change in how consumers interact with the site. Logically, if one performed A/B testing on every facet of a webpage, the website would reap considerable benefit from the cumulative result of all the changes stacked together.

When embarking on an A/B testing effort, higher rewards are in store for those who target tests efficiently and follow a measured methodology. When selecting which aspect of your marketing to test, the most efficient starting point is usually the ads that show up as search engine results. Ads are the most prominent communicators with consumers, acting as an initial point of contact and directing searchers toward the brand website for more information. Although optimizing ads is the primary driver of improved results, marketers must also allocate resources to perfecting landing pages. Increasing the relevance of a landing page to a prospective customer's search drastically increases the chance of converting that visitor.
The structured process Optimizely advises for A/B testing resembles the classic scientific method. The first step is to collect and evaluate data, determining which parts of the website drive the most traffic or have the most potential for improvement. The next step is identifying goals: defining which metrics will be measured to judge whether a change succeeded. Then marketers formulate a hypothesis, estimating and justifying which variables need to change. Next, they create variations, designing the experiment and selecting precisely which variable will be manipulated as well as the nature of the manipulation (e.g., a red versus a blue button). The fifth step is to run the experiment, during which portions of the site's typical traffic are allocated to either the variation or the control while data is collected on both. The final step is to analyze the results, drawing conclusions from the data about whether the hypothesis was correct. By moving through this process in a disciplined way, the marketer is assured valid results, whether the variations prove beneficial or not.
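The final "analyze results" step usually comes down to a significance test on conversion counts. A common choice is a two-proportion z-test, sketched below; the 1.96 cutoff (roughly 95% confidence) and the sample numbers are illustrative assumptions, not figures from Optimizely's post.

```python
import math

def z_score(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)       # combined conversion rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical outcome: control converts 200 of 5,000 visitors,
# the variation converts 260 of 5,000.
z = z_score(200, 5000, 260, 5000)
print(f"z = {z:.2f}, significant = {abs(z) > 1.96}")
```

If the z score clears the threshold, the marketer can declare the variation a winner; if not, the honest conclusion is that the test was inconclusive, not that the variation "almost" worked.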
Since Google is the most common source of website traffic, its word on A/B testing best practices is widely heeded. Optimizely relays a few of its tips in the blog post. First, Google does not tolerate cloaking. Cloaking entails a firm presenting search engines with different content than human visitors see, under the guise of a page "variation." If you are running a test with multiple URLs, Google recommends using the rel="canonical" attribute to distinguish variations and point back to the original. When redirecting the original URL to a variation URL, a 302 redirect rather than a 301 designates the change as temporary rather than permanent. Finally, Google requires that tests run only as long as necessary for marketers to obtain statistically significant findings; an unreasonably long test is perceived as an attempt to deceive the search engine. Any breach of these guidelines can result in a demotion of your site or the complete removal of your content from the search engine.
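The redirect and canonical-tag advice above can be sketched as a tiny routing function. The paths and page body here are hypothetical, and a real test would run behind a web server or a testing tool rather than a bare function like this.

```python
ORIGINAL = "/landing"
VARIATION = "/landing-b"

def respond(path: str):
    """Return (status, headers, body) for a request path during the test."""
    if path == ORIGINAL:
        # 302: the redirect to the variation is temporary, not permanent (301)
        return 302, {"Location": VARIATION}, ""
    if path == VARIATION:
        # rel="canonical" points the variation back at the original URL,
        # so search engines index the original rather than the test page
        body = (
            f'<html><head><link rel="canonical" href="{ORIGINAL}"></head>'
            "<body>Variation B</body></html>"
        )
        return 200, {"Content-Type": "text/html"}, body
    return 404, {}, ""
```

Using 302 plus rel="canonical" signals to Google that the variation URL is a temporary experiment, which is exactly the distinction the guidelines draw.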
Let's take a look at how some big industry names used A/B testing to optimize web results:
ComScore
Evaluating the impact of a testimonial bar's characteristics on the webpage, ComScore tracked several goals, such as page views, clicks, and engagement. They tried three variations: keeping the vertical text box and adding the testimonial company's logo, and implementing a horizontal text box with and without the logo. Combining their Digital Analytix tool with Optimizely's A/B testing, ComScore was able to filter results across different clients and discern that the first variation was by far the best.
Sony
To test the effectiveness of a promotional banner ad, Sony constructed an A/B/C-type experiment with two variations from the original. The original banner ad informed the reader of the current promotional offering on laptops, emphasizing personalization of the product. The winner was a larger banner ad focused entirely on the laptop's customizability, with a call-to-action button added to get the customer started on the purchase process.
Soccerloco
This test analyzed customer preferences for one of the most crucial pieces of the website: the checkout menu. Soccerloco tested a variation with a simplified checkout and larger call-to-action buttons against a control of a detailed, wordy menu. The larger call-to-action design won out in testing, with Soccerloco projecting a 26% increase in revenue upon implementation.
What do all of these companies have in common? They were all seeking to optimize their web pages through subtle variations in pursuit of tangible results. And tangible results they got: each was successful in its endeavor to produce value for the company.
How will A/B Testing affect the digital landscape in the coming years?
![Digital Landscape]()

