9. A/B Test of an Email Campaign

A/B testing, in general, is the concept of creating and testing two variations. This concept is used in many parts of running a business, as it can provide valuable information about your products and brand simply by analyzing the feedback of two target groups. The process of A/B testing can be as simple or as complex as you need it to be. It is a completely personalized method of comparison, and as such it provides actionable results.

What is A/B testing?

In email marketing, A/B testing refers to the process of dividing subscribers into two groups and sending each group a different variation of the email. The purpose of A/B testing is to detect strong and weak aspects and to analyze the results of both campaigns. This helps you understand the potential of your business and your subscribers, which your business will benefit from.

To begin with, you will need a tool. Email marketing platforms frequently offer A/B testing as one of their features. The differences may be in the dashboard layout and some functionalities, but when a platform includes A/B testing, you will find all the features you need to design, run and monitor this kind of campaign. The choice of tool is entirely up to you, based on your personal preferences, budget, etc.

The next step is setting up goals and choosing the specific aspects of the email campaign you want to test.

What to test in an email campaign?

This is the first question you will need to answer in the process of A/B testing. Although it may seem easier and more productive to test multiple things at once, doing so can produce misleading results. To get accurate and tangible results, make sure you test only one thing. That way, you focus on a single part of your email and explore ways to improve it to achieve better performance.

Subject line

A subject line has a great influence on the open rate, so it is worth investing time in finding out which words or phrases in the subject line generate the highest open rate. Besides the open rate, you could analyze other metrics as well, such as the click-through and conversion rates, and how these metrics change when you alter the subject line.

Different aspects of the subject line you could test include:

Word count

It is debatable how the length of the subject line affects email metrics such as the open rate. While some argue that a shorter subject line is more effective, for some industries a longer, more descriptive one works better. The best way to make a choice is to analyze your own business through A/B testing.

Word choice

Some words simply convert better than others, and although there are general suggestions on the best-converting words across different industries, this choice should be based on your own experience. Try testing synonyms as well, because “20% off” might give better results than “discount”.

Word order

Word order is the final thing that can be tested within a subject line. Having decided on the right length and word selection, all that is left to do is to experiment with word order and see if there are noticeable differences. For example, there might be differences in engagement between “Get your free ebook” and “Free ebook: get yours now”.

Call to action

The call to action, as the most prominent part of an email, surely deserves testing. Its main purpose is to invite users to do something, so it is worth exploring different options to see what you can do to improve the number of conversions.

Here is what you can test with CTAs:

  • Size
  • Color
  • Font
  • Font color
  • Text
  • Position in the email

The information about these elements will help you define how you want your CTA to look and what you want it to say. For example, using a different color for the CTA button during testing might show that a certain color is more engaging than another. It is not the same if the text on the CTA button is “Get the coupon” or “I want the coupon”. The differences may be subtle at first glance, but the results might surprise you.

Body text

Testing content also offers insights into the way you should communicate with your subscribers. Unlike a CTA or a subject line, where subtle differences are often the subject of A/B testing, with body text the variations are more obvious. Here are some ideas for testing the email body:

Text formatting

Formatting the text is a very important part of improving email performance, but it is also an area with many possible variations. General rules are to break the text into paragraphs, to have a heading or a title, and to use different colors, bold or italic text, links, etc. to highlight the most important parts. However, since there are so many ways to optimize an email, it is best to test different options and see how each of them performs.

Text length

It is commonly recommended that an email message be short and concise. However, this does not necessarily hold true for your business. With this sort of testing, you send two emails with the same subject line, the same CTA and essentially the same message (an announcement, a blog update, etc.), but the way you share that message differs. In one email, use fewer words and get straight to the point. In the second version, be more descriptive and talkative.

Visualization

Visual elements in the email design include:

  • The use of images
  • The implementation of a video
  • Columns

When people open an email, they scan it for a couple of seconds before deciding to either keep reading or go back to the inbox. In this case, even if the email is opened, your message might never reach the users, because they will disregard it too quickly without reading it. To prevent this from happening, and to increase engagement among the email recipients, you could use visual elements. With this kind of testing, all of the email elements stay the same, but the presentation differs. One email could feature the text in one column, while the other version has two or three columns. The same goes for the use of images and videos.

Offer

Announcing offers such as discounts, free resources, etc. might be a part of your online business. If that is the case, you will need to find out how these offers affect your business. And if you are still only exploring the use of offers, you will benefit greatly from A/B testing. Here is what you can test when it comes to offers:

Type of the offer

The first thing to test here is the type of offer. There are many things you can offer to your subscribers: free downloads, images, video courses, discounts, coupons, templates, tickets, etc. are all valid offers, and hopefully they will be of some interest to your subscribers. Making an offer available to subscribers only can make it feel even more exclusive. With A/B testing, you choose between these types and examine the metrics to see how the type of offer affects campaign performance. For example, you might notice that free images are more interesting to your subscribers than free templates. These kinds of insights will help you create better content in the future, because you will focus on the content and offers your subscribers value the most.

Time limitations

Some offers are valid for only a short time, usually a couple of days, while others may be available indefinitely. This can affect the engagement rate, because a time limit can urge recipients to complete the action sooner than they had planned. This is why an offer with a time limit is worth testing. For example, limiting the time for redeeming a coupon code might result in a higher conversion rate than a coupon code with no time limit, as subscribers must hurry if they want to take advantage of the offer.

Personalization

There is no doubt that personalization is important for better email deliverability, but to what extent and how you personalize an email can also be a subject of A/B testing. For example, using a subscriber’s name from the database may affect open rates, and since this part of the text is shown in the preview of the email, try testing it to see how it affects performance. You could also test different writing styles, formal versus informal language, etc.

Localization

Localization can help you establish a better relationship with local customers, drive sales, etc. In fact, “56.2 percent of consumers say that the ability to obtain information in their own language is more important than price” (Source).

In terms of localization, A/B testing means testing the performance of email variations with different localization elements, including a different language, announcements of local events, prices in the local currency, etc. With this approach, your goal is to test how emails with and without localization elements perform, and whether they bring more clicks, conversions, etc.

How to conduct A/B testing?

Even when you have decided what you want to test, you might still be unsure how to start and how to carry out the entire process. You have your mailing list, and in most cases you will test it entirely, meaning the subscribers will be divided into two groups and each group will receive a different version of the email.
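The split itself is simple to sketch. Here is a minimal example, assuming subscribers are stored as a plain list of email addresses (the list contents and function name are hypothetical, not tied to any particular platform):

```python
import random

def split_for_ab_test(subscribers, seed=42):
    """Randomly divide a subscriber list into two near-equal groups."""
    shuffled = subscribers[:]              # copy, so the original list is untouched
    random.Random(seed).shuffle(shuffled)  # seeded shuffle for a reproducible split
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

# Hypothetical subscriber list
subscribers = [f"user{i}@example.com" for i in range(10)]
group_a, group_b = split_for_ab_test(subscribers)
print(len(group_a), len(group_b))  # 5 5
```

Group A would then receive the first variation of the email and group B the second. Shuffling before splitting matters: taking the first half of the list as-is could bias the groups, for example if the list is ordered by signup date.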

If you are in the process of testing an idea or a limited-time offer, you might want to reach out to a certain number of subscribers only. For example, when testing a beta version of a new feature, there is no need to send an email to the entire list of subscribers, because it is in your interest to get feedback from a small group of people.

The success of A/B testing directly depends on what success means for your business. Altering email elements to create two variations allows you to examine how each of them affects email marketing metrics. To make this process efficient and measurable, start by setting goals. Do you want to increase the open rate with different subject lines? Do you want to increase the click-through rate by testing different types of CTA?

Also, make sure you keep an eye on the numbers. Create a file where you track the comparison with quantifiable measures. Mark the metrics you want to experiment with and add their values. For example, the open rate was 10%, the click-through rate was 8%, etc. Then add the results of A/B testing to see how these change and what helped you improve the rates.
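These rates are straightforward to compute yourself from the raw counts your platform reports. A minimal sketch, with hypothetical numbers for the two variations (variant A matches the 10% open rate and 8% click-through rate mentioned above):

```python
def rate(events, recipients):
    """Express a count of events (opens, clicks) as a percentage of recipients."""
    return round(100 * events / recipients, 2)

# Hypothetical raw counts for the two variations
variant_a = {"recipients": 500, "opens": 50, "clicks": 40}
variant_b = {"recipients": 500, "opens": 80, "clicks": 44}

for name, v in [("A", variant_a), ("B", variant_b)]:
    open_rate = rate(v["opens"], v["recipients"])
    ctr = rate(v["clicks"], v["recipients"])
    print(f"Variant {name}: open rate {open_rate}%, click-through rate {ctr}%")
# Variant A: open rate 10.0%, click-through rate 8.0%
# Variant B: open rate 16.0%, click-through rate 8.8%
```

Recording the raw counts alongside the percentages is worthwhile, because a rate on its own hides the sample size it came from.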

Other things to have in mind with A/B testing:

  • Test simultaneously – Running two versions at the same time will help you get relevant results and avoid time-based discrepancies that would make your analysis less effective.
  • Test large when possible – Reaching more subscribers means you will get a larger sample to analyze, which can lead to more accurate results than testing a small group. When possible, always go for a larger group.
  • Test one thing at a time – The A/B testing results can only be accurate if you test one email element at a time; otherwise, you can never be completely sure which element and variation resulted in the increased rates. Testing several things at once makes it more difficult to determine the winning combination and the elements that generate the best-performing email marketing campaigns.
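On sample size: a quick way to judge whether a difference between two variants is likely real rather than random noise is a standard two-proportion z-test. This is a general statistical technique, not something the testing tools above require; a minimal sketch using only the standard library, with hypothetical counts:

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """z statistic for the difference between two rates
    (e.g. opens out of emails sent for each variant)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

def p_value(z):
    """Two-sided p-value from the standard normal distribution."""
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical: variant A opened by 50 of 500 recipients, variant B by 80 of 500
z = two_proportion_z(50, 500, 80, 500)
print(round(z, 2), round(p_value(z), 4))
```

A small p-value (conventionally below 0.05) suggests the difference is unlikely to be chance; with tiny groups, even a large-looking gap in open rates can fail this check, which is the statistical reason behind "test large when possible".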

Finally, keep one thing in mind: A/B testing is not always going to lead to an increase in rates. In fact, sometimes there might not be any difference at all, or the rates might simply stand still. This is completely normal, because A/B testing starts with the idea of exploring the efficiency of different variations of email elements. Occasionally, both variations will have the same effect. Sometimes, the new version, which you had considered an improvement, might even cause the rates to drop. The important thing is to learn how email elements affect the performance of the campaign, using the testing options available to you.

You should always create a report with the initial data and the results after A/B testing, because this will allow you to draw conclusions, and it will also serve as a good starting point for any email marketing testing you do in the future.