A/B testing is often used to test variations of web pages and applications in an effort to increase user engagement. But A/B testing can also be used to good effect in the world of email, and the same concepts apply.
A/B testing (also known as split testing) is a randomized experiment with two variants (A and B) to determine which produces better results. A/B testing is one of the easiest and most effective ways to optimize your email engagement. The goal of A/B testing email is to improve open and click-through rates by identifying the phrases, formats, or techniques subscribers are most likely to respond to. Once identified, they can be combined to significantly improve the success and profitability of your email campaigns.
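To see how "determining which variant produces better results" works in practice, here is a minimal sketch that compares the open rates of two variants using a standard two-proportion z-test. The counts are hypothetical, and nothing here is SparkPost-specific; it simply shows how to check whether an observed difference is likely to be real rather than noise.

```python
import math

def z_test(opens_a, n_a, opens_b, n_b):
    """Two-proportion z-test: is the difference in open rates significant?"""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    pooled = (opens_a + opens_b) / (n_a + n_b)   # pooled open rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical campaign results: variant A got 180 opens out of 1,000
# deliveries, variant B got 215 out of 1,000.
z, p = z_test(180, 1000, 215, 1000)
winner = "B" if z < 0 else "A"
```

A p-value below 0.05 is the conventional threshold for treating the winning variant as a genuine improvement rather than random variation.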
When A/B testing bulk email, choose one element to test at a time, such as:
- Subject line – Try variations of the same message.
- From name – Determine whether subscribers are more likely to open your email if it comes from a person versus the company or organization.
- Personalization – Address emails using the subscriber’s first name versus a title and last name (“Mrs. Jones”).
- Delivery dates/times – Send emails at different dates/times to determine when subscribers are most likely to open your email.
- Call to action – Vary the language used to entice readers to click.
- Hyperlinks – Determine whether subscribers are more likely to click on a linked image or linked text.
- Email design – Change the layout of your emails, testing one versus two columns, for example.
When deciding what to test first, consider where you are having the most difficulty. It doesn’t make sense to focus on click-through rates if your open rate is low. If that’s the case, then testing subject lines, ‘from’ name, and delivery dates/times can help you determine how to increase open rates. Regardless of which element you choose, be careful to conduct one test at a time. For example, don’t A/B test a landing page at the same time you’re A/B testing an email that directs to that landing page. You won’t know which element drove an increase (or decrease) in clicks.
It is also important, whenever possible, to test your entire list. This, along with splitting the list randomly and equally, will help ensure accurate results. Both variations must also run at the same time; otherwise you are introducing another element (day/time) that can influence your results.
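The random, equal split described above can be sketched in a few lines of Python. The `split_list` helper and the example addresses are hypothetical illustrations, not part of any SparkPost API; shuffling before splitting is what keeps each group a random sample of the whole list.

```python
import random

def split_list(subscribers, seed=None):
    """Randomly split a subscriber list into two equal-sized test groups."""
    rng = random.Random(seed)        # seeding makes the split reproducible
    shuffled = list(subscribers)     # copy so the original list is untouched
    rng.shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

# Hypothetical list of 1,000 subscriber addresses
subscribers = [f"user{i}@example.com" for i in range(1000)]
group_a, group_b = split_list(subscribers, seed=42)
```

Each subscriber lands in exactly one group, and both variants can then be sent at the same time so that day and time do not skew the comparison.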
To learn more about SparkPost’s real-time analytics, which you can use to support A/B testing email, see https://blog.sparkpost.com/real-time-analytics-digging-into-email-metrics/.