Putting Users First with A/B Testing

via Kinsta

Have you ever noticed one of your apps sporting a new look, seemingly out of the blue, without ever hitting the update button? Chances are, you’ve stumbled into the intriguing realm of A/B testing.

What is A/B Testing?

Per the Harvard Business Review article, A Refresher on A/B Testing, at its most basic, A/B testing is a way to compare two versions of something to figure out which performs better. Through a randomized experiment, designers roll out two versions of a product, testing a control (A) against an alternative (B). The goal is to determine which version is most successful among users based on your predetermined metrics.

What Can You Test?

This testing method isn’t just practiced on mobile apps; it’s one of the most popular forms of live experimentation used by website managers, digital marketers, and user experience (UX) designers to evaluate everything from website layout to search functionality to the size of imagery.

A/B testing can begin once you decide what you want to test. Something as simple as the color of the shopping cart icon on a retail website is testable. Whatever the subject, A/B testing serves to determine which iterations of your product make it more user-friendly.

How Does It Work?

Figuring out what to test should begin with an educated guess. As detailed in UXPin’s article, A/B Testing in UX Design: When and Why It’s Worth It, your goal should be to define users’ pain points, or to figure out what prevents them from taking desired actions (e.g., I want my online retail site to convert more browsers into buyers). From there, you can form a hypothesis (e.g., I think that changing the color of the “Add to Cart” button from red to green will entice users to make more purchases) and establish two versions of a variable (e.g., red button and green button). In this case, the green button is the altered variable (B), and the red button is the control (A) it’s compared against.
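Before any traffic is split, that hypothesis can be written down as a simple, testable plan. Below is a minimal Python sketch (not from the UXPin article; the class names and wording are purely illustrative) of capturing the control, the variant, and the success metric for the button example:

```python
# A minimal sketch of writing down an A/B hypothesis before splitting traffic.
# All names and descriptions here are illustrative, not from any real platform.

from dataclasses import dataclass

@dataclass
class Variant:
    name: str          # "A" (control) or "B" (alternative)
    description: str   # what actually changes for the user

@dataclass
class Experiment:
    hypothesis: str    # the educated guess being tested
    metric: str        # the predetermined success metric
    control: Variant
    variant: Variant

button_test = Experiment(
    hypothesis="Changing 'Add to Cart' from red to green will increase purchases",
    metric="purchase conversion rate",
    control=Variant("A", "red 'Add to Cart' button (current design)"),
    variant=Variant("B", "green 'Add to Cart' button"),
)

print(f"Testing: {button_test.hypothesis}")
print(f"Success measured by: {button_test.metric}")
```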

via Optimizely

You might be thinking, “What does this have to do with the apps I use?” The experimentation process for A/B testing directly involves ordinary users like you and me.

Now that variations are established, test monitors will show one set of users version A while the other receives B. Of course, users don’t know which version they’re interacting with. The test groups are set randomly and are designed to achieve unbiased results through live data collection, thus giving users an indirect seat at the decision-making table.
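In practice, the random split is often implemented by hashing each user’s ID into a bucket, so the assignment looks random, stays unbiased, and remains stable: the same user always sees the same version without ever being asked. Here’s a rough sketch of that idea in Python; the 50/50 split, experiment name, and user IDs are invented for illustration, and real testing platforms handle this behind the scenes:

```python
# A sketch of splitting users into test groups without their involvement:
# hash each user ID into a bucket so the split is effectively random,
# unbiased, and stable (the same user always sees the same version).
# The 50/50 split and the IDs below are illustrative.

import hashlib

def assign_version(user_id: str, experiment_name: str) -> str:
    """Deterministically assign a user to version A or B."""
    key = f"{experiment_name}:{user_id}".encode("utf-8")
    bucket = int(hashlib.sha256(key).hexdigest(), 16) % 100
    return "A" if bucket < 50 else "B"  # 50% control, 50% variant

for user in ["user-101", "user-102", "user-103", "user-104"]:
    print(user, "sees version", assign_version(user, "green-button-test"))
```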

Tips for Executing a Successful A/B Test:

  • Select the extent of your changes based on website traffic: high-traffic sites can keep changes minimal and still detect a difference, while low-traffic sites can test broader changes
  • Run the test long enough to collect statistically meaningful insights; the more data collected, the more dependable the results (a sketch of this check follows the list)
  • Inconclusive results mean neither design clearly outperformed the other; both are strong enough to implement, and now you’re free to choose!
  • Test as many hypotheses as you want, just be sure that they’re prioritized based on consumer-related research
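What “statistically meaningful” looks like in practice is usually a significance test comparing the two conversion rates, as mentioned in the list above. The sketch below uses a standard two-proportion z-test in plain Python; the visitor and conversion counts are made up purely to show the mechanics, and real testing tools run equivalent (often more sophisticated) checks for you:

```python
# A rough sketch of a "statistically meaningful" check: a two-proportion
# z-test comparing the conversion rates of versions A and B.
# The visitor and conversion counts below are invented for illustration.

from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return the z statistic and two-sided p-value for rate(B) vs. rate(A)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))      # two-sided test
    return z, p_value

z, p = two_proportion_z_test(conv_a=200, n_a=10_000, conv_b=245, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
print("Meaningful difference" if p < 0.05 else "Inconclusive: keep collecting data")
```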

Testing Tools

There are many reliable online tools to guide designers through the A/B testing process. Of those options, VWO recommends the following:

  1. Optimizely – Optimizely provides services ranging from website and feature experimentation to content personalization. Their server-side testing collects precise customer data that informs companies how to better their product.
  2. Google Optimize – Google Optimize offers extensive multivariate testing, allowing companies to test over 100 variables at once. Their A/B experiments are behavior-focused and cover a lot of ground when it comes to user preferences.
  3. AB Tasty – AB Tasty uses AI to power their experience optimization functions. They support companies in equipping their products with the right features and functionality for their users’ needs.

Putting A/B Testing to the… Test!

A/B testing is increasingly used to center optimization efforts on user data, improving ease of use on the user’s end while serving business goals on the company’s end. Businesses across many industries are realizing its potential.

SWISSGEAR Product Pages

via LinkedIn

SWISSGEAR is a retail company that’s best known for the durability, functionality, and style of their products. According to Willem Drijver’s LinkedIn article, 7 Successful A/B Test Experiments, the company wanted to increase conversions on their product pages. Their control (A) displayed black and red fonts that didn’t emphasize any particular piece of information, so they created a variable (B) that applied red text to the “Special Price” label and the “Add to Cart” function. After A/B testing was completed, the variable produced a 52% increase in purchase conversions, a lift that jumped to 132% during the holiday season.
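To make that 52% figure concrete: a lift like this is a relative improvement of the variant’s conversion rate over the control’s, not a conversion rate on its own. The snippet below shows the arithmetic with invented rates chosen only so the math comes out to roughly 52%:

```python
# Illustrative arithmetic for a conversion lift like SWISSGEAR's reported 52%.
# The conversion rates below are invented solely to demonstrate the calculation.

def relative_lift(rate_a: float, rate_b: float) -> float:
    """Percentage improvement of the variant (B) over the control (A)."""
    return (rate_b - rate_a) / rate_a * 100

control_rate = 0.025   # e.g., 2.5% of visitors to page A make a purchase
variant_rate = 0.038   # e.g., 3.8% of visitors to page B make a purchase

print(f"Lift: {relative_lift(control_rate, variant_rate):.0f}%")  # Lift: 52%
```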

HubSpot Email Subscriptions

The control (A) and the variable (B) email layouts – via HubSpot

HubSpot, a platform that offers software resources to businesses with a customer-forward approach, wanted to engage more subscribers via email. In their blog post, 11 A/B Testing Examples From Real Businesses, the company explained that to achieve this, they would test two versions of text alignment in emails to see which would receive a higher click rate. They set their variable (B) as left-aligned text and compared it to their control (A), which was centered text. The results showed that emails with left-aligned text got fewer clicks than the control, with fewer than 25% of them receiving more clicks than the centered version.

To Sum It All Up

It’s become more important than ever for companies to pay closer attention to the people who actually use their products. A/B testing makes this task easier and benefits both the business and the users through live, reliable data. Now, when you see sudden modifications while scrolling, you can trust that you’re in good hands, because that company might just be in the process of making your experience better for you!

Works Cited

A/B testing in UX design: When and why it’s worth it. (2021, December 17). Studio by UXPin. https://www.uxpin.com/studio/blog/ab-testing-in-ux-design-when-and-why/

Drijver, W. (2022, April 7). 7 successful a/b test experiments. LinkedIn. https://www.linkedin.com/pulse/7-successful-ab-test-experiments-case-studies-willem-drijver/

Gallo, A. (2017, June 28). A refresher on a/b testing. Harvard Business Review. https://hbr.org/2017/06/a-refresher-on-ab-testing

Guha, P. (2021, January 4). Top 13 best a/b testing tools, platforms & softwares in 2024. VWO Blog. https://vwo.com/blog/ab-testing-tools/

Riserbato, R. (2023, April 21). 9 a/b testing examples from real businesses. HubSpot Blog. https://blog.hubspot.com/marketing/a-b-testing-experiments-examples
