Definition of A/B test
An A/B test evaluates the performance of an email campaign by sending two versions of an email to two representative samples of a recipient database. Once the statistics have been compared, the best-performing version is sent to the rest of the contacts. An A/B test lets you compare different elements: the subject line, the sender field, or the content of the message.
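As a purely illustrative sketch of this mechanism (hypothetical code, not part of Mailify Sunrise), the overall flow could be summarized as follows; the `send` and `open_rate` helpers are made-up stand-ins for what the emailing platform does behind the scenes:

```python
import random

def run_ab_test(contacts, version_a, version_b, send, open_rate):
    """Illustrative A/B flow: send two versions to random samples, then the winner to the rest.

    `send(version, recipients)` and `open_rate(recipients)` are hypothetical callables
    standing in for the sending and tracking handled by the emailing platform.
    """
    sample = random.sample(contacts, k=len(contacts) // 5)  # e.g. 20% of the base
    half = len(sample) // 2
    group_a, group_b = sample[:half], sample[half:]
    sample_set = set(sample)
    remainder = [c for c in contacts if c not in sample_set]

    send(version_a, group_a)
    send(version_b, group_b)

    # After the test period, the best-performing version goes to everyone else.
    winner = version_a if open_rate(group_a) >= open_rate(group_b) else version_b
    send(winner, remainder)
```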
How to create an A/B test
To create an A/B test, go to the Mailify Sunrise home screen, click on the "+" button and choose "Email".
You are then asked to choose the type of email campaign you want to create. Select "A/B test".
Then set your campaign's properties: give it a name, define its sharing settings, and add tags (optional).
Then import your list of recipients and carry out any adjustments and analyses you want on your database.
Set up the A/B test
Once you have imported your contact database, you can proceed to the setup step of your A/B test.
This is where you will define all the details for your test, and decide which elements you want to test.
This step is divided into three parts:
What would you like to test?
This is where you tell Mailify which element(s) of your email you want to test in order to compare the results.
Simply check the item that interests you. Depending on what you choose to test, the message creation step will be slightly different:
- Sender field: allows you to compare two names and two sender addresses
- Subject line: test two different subject lines and find out which is the most convincing
- Email (sender + subject + content): compare the performance of two emails that differ in every respect. You can test the sender field, the subject line, and the entire content of your message at the same time.
If you choose this option, the message creation step is split in two: you will go through step 3.1 to create your first email (message A) and step 3.2 to create the second email (message B).
Define groups A and B
Here you can choose the proportion of your recipient database that will be used as a representative sample of your contacts. You can also set the size of the two groups of contacts, each of which will receive one version of your message. The recipients included in the test groups are selected at random.
For the sake of representativeness, it is advisable to have at least 100 people per test group to avoid skewing the test.
To change the proportion of contacts in the samples, move the slider to the left or right to increase or decrease that percentage.
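To illustrate the logic behind this split (a hypothetical sketch, not Mailify's implementation), a contact list could be divided into two random test groups and a remainder like this, with the 100-contact minimum applied to each group:

```python
import random

MIN_GROUP_SIZE = 100  # recommended minimum per test group to avoid skewing the test

def split_contacts(contacts, sample_ratio=0.2, seed=None):
    """Randomly split a contact list into group A, group B and the remainder."""
    rng = random.Random(seed)
    shuffled = list(contacts)
    rng.shuffle(shuffled)

    sample_size = int(len(shuffled) * sample_ratio)
    group_size = sample_size // 2
    if group_size < MIN_GROUP_SIZE:
        raise ValueError("each test group should contain at least 100 contacts")

    group_a = shuffled[:group_size]
    group_b = shuffled[group_size:2 * group_size]
    remainder = shuffled[2 * group_size:]
    return group_a, group_b, remainder

# Example: 20% of a 5,000-contact base -> two groups of 500, 4,000 kept for the winning version
group_a, group_b, remainder = split_contacts([f"contact{i}@example.com" for i in range(5000)])
```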
Trigger sending to the rest of your list
Here you can define the conditions for determining the winning message and the conditions for sending it to the rest of your database.
First, check the condition that will determine which version of your campaign will be set as the winner at the end of the test:
- Best open rate: The message with the highest open rate will be considered the winner of the test
- Best click rate: The message with the most clicks will be considered the winner
- Manually: you select the winner of the test yourself, based on the behavioral statistics presented to you as they accumulate, and you choose when the A/B test should stop
If you chose "Best Open Rate" or "Best Click Rate", you must choose the duration of the test using the selector. It's possible to define a number of hours or days in which the test will be automatically canceled and the statistics analyzed by Mailify to send in the stride the winning message to your entire base.
Creating the message
During an A/B test campaign, the creation modes of the message(s) are the same as for a standard campaign.
Similarly, the steps specific to creating your email campaign are also the same, apart from the elements specific to the test, as presented in the "What would you like to test?" section above. These steps are detailed in the guide to creating an email campaign.
Test and send an A/B test
This last step differs slightly from the "Test and send" step of a standard email campaign.
In an A/B test, you get two different campaign summaries. You can switch from one to the other by clicking on "Message A" and "Message B" to see the details of either version.
Similarly, you can send separate test emails to check the layout of both messages.
By clicking on "Finalize campaign", you launch your A/B test on the contact samples that you have defined previously.
If you have chosen an automatic decision (best open rate or best click rate), you don't need to do anything else: Mailify Sunrise will send your two versions, analyze the results, and send the final campaign.
If you have chosen a manual trigger, you can check the performance of your two versions at any time from the "Campaigns" menu on the Mailify Sunrise home screen. Click on the statistics icon next to the name of your A/B test campaign. You can then trigger the sending manually whenever you like.