In Kit, you can test two subject lines for a Broadcast to determine which version your recipients find more appealing.
Here's how it works:
We'll send two versions of your Broadcast, each with a different subject line, to a separate 15% of your recipients (i.e., 30% of your total recipients combined).
After the testing period, we'll determine the winning subject line based on which one resulted in the higher open rate.
We'll then automatically send the winning variant to the remaining 70% of your recipients (i.e., those not involved in the initial test).
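If it helps to see the numbers, here's a minimal sketch of that split (the percentages come from this article; the function name and layout are illustrative only, not Kit's actual code):

```python
# Illustrative only: how a Broadcast's recipients are divided for an A/B test.
def ab_test_split(total_recipients: int) -> dict:
    variant_a = round(total_recipients * 0.15)   # 15% receive subject line A
    variant_b = round(total_recipients * 0.15)   # 15% receive subject line B
    winner_send = total_recipients - variant_a - variant_b  # remaining 70% receive the winner
    return {"variant_a": variant_a, "variant_b": variant_b, "winner_send": winner_send}

print(ab_test_split(10_000))
# {'variant_a': 1500, 'variant_b': 1500, 'winner_send': 7000}
```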
Setting up an A/B test
To set up an A/B test for a Broadcast, click the A/B icon next to the subject line in the Broadcast editor:
Alternatively, turn on the Run an A/B Test toggle in the right sidebar.
Fill out your two subject line variants in the window that pops up:
Then, use the Settings tab to set your A/B test duration. Your test can run between 30 and 240 minutes, with 240 minutes being the default. Click the curved arrow icon to use this default duration.
If you're unsure how long your test should be, see the "How long should your A/B test be?" section below for guidance.
Hit Save to save your A/B test settings.
You can then send the Broadcast as normal and the test will happen automatically; no further action is required on your part (except checking back to learn the winner if you're curious!).
If you need to edit your A/B test settings, click the A/B icon next to the subject line or the Edit A/B test button in the right sidebar.
Alternatively, disable the Run an A/B Test toggle if you change your mind about running an A/B test.
Who should (and shouldn't) A/B test?
We recommend A/B testing emails going out to 1,000 recipients or more. A test run on a smaller number will not give actionable data as the results will not be statistically significant.
For example: if you send your Broadcast to 100 recipients, each subject line variant would go out to only 15 people (15% of recipients) for the initial test. Even with a high open rate of ~50%, that means only seven or eight subscribers are likely to open each variant.
The result? That "winning" subject line could be determined by the whims of a single person. One individual's decision to open an email (or not) will not provide meaningful insights you can apply to your wider list.
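If you'd like to see why, here's a small simulation (illustrative only, not Kit's testing logic): both subject lines are given the exact same true open rate, yet on 15-person samples one of them "wins" most of the time purely by chance.

```python
import random

# Illustrative only: with identical subject lines and 15 test recipients each,
# a "winner" still emerges most of the time, and that winner is pure noise.
TRUE_OPEN_RATE = 0.50
TEST_SIZE = 15        # 15% of a 100-recipient Broadcast, per variant
TRIALS = 10_000

decided_by_chance = 0
for _ in range(TRIALS):
    opens_a = sum(random.random() < TRUE_OPEN_RATE for _ in range(TEST_SIZE))
    opens_b = sum(random.random() < TRUE_OPEN_RATE for _ in range(TEST_SIZE))
    if opens_a != opens_b:
        decided_by_chance += 1  # one variant "won" despite being identical

print(f"A 'winner' emerged in {decided_by_chance / TRIALS:.0%} of simulated tests,")
print("even though both subject lines performed identically.")
```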
How long should your A/B test be?
Our suggested A/B test duration is 240 minutes, which you can use by clicking the curved arrow icon in the duration setting. But these pointers may be useful if you want to set your own duration:
If you're sending to a smaller list, we suggest setting a longer A/B test duration to give more recipients time to open your email. This, in turn, will lead to a more meaningful test result.
That said, if your email is more time-sensitive, set a shorter A/B test duration so that all recipients receive it as close to the initial send time as possible.
Experiment with different test durations to learn what works best for you.
A/B test reporting
The Analytics page of your Broadcast's reports will show the A/B test results for each variant. Each variant's stats will continuously update as your test goes on.
After the test is complete, the winning variant will be highlighted in green and display a star icon! ⭐
The aggregate stats for your Broadcast—as sent to all recipients—will also become available on the Overview page at the end of the test.
FAQs
Where in Kit is A/B testing available?
A/B testing is currently available only for Broadcast subject lines.
Can I end an A/B test early?
Yes! To do so, go to the Broadcast's Analytics report.
Click the three vertical dots on the right, followed by Cancel A/B test.
What happens if I cancel my A/B test?
If you cancel your A/B test before it completes, your Broadcast will have been sent only to your test recipients (i.e., 30% of the recipient list).
To send the Broadcast to the remaining 70% of the recipient list, first go to the Broadcast's Recipients page to view the 30% of subscribers who have already received it.
Select all of these subscribers, and then add a temporary Tag (e.g., a Tag called "Cancelled A/B test") to them.
Then, duplicate the Broadcast and send it to your initial recipient list (i.e., all 100% of them), excluding the subscribers who have the temporary Tag you just created.
Feel free to delete the temporary Tag after that.
Why does the winner have a lower open rate?
At the end of the A/B test, our system will automatically send out the subject line variant with the higher open rate at that time to the remaining recipients.
However, the losing variant can end up overtaking the winner later on, as your original test recipients continue to open the two test variants after the testing period ends.
We continue to update the isolated stats for each test variant even after the test is over. This makes it possible for the winner of the A/B test to ultimately have a lower open rate after the other variant overtakes it.
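As a purely illustrative example (made-up numbers, not taken from a real report), suppose each variant went to 1,500 test recipients:

```python
TEST_RECIPIENTS_PER_VARIANT = 1_500   # made-up figure for illustration

# Opens when the test ends: variant A is declared the winner.
opens_a_at_test_end, opens_b_at_test_end = 330, 300   # 22% vs. 20%

# Opens a few days later, after late openers among the test recipients trickle in.
opens_a_final, opens_b_final = 360, 390               # 24% vs. 26%

print(f"Winner (A) final open rate: {opens_a_final / TEST_RECIPIENTS_PER_VARIANT:.0%}")
print(f"Loser  (B) final open rate: {opens_b_final / TEST_RECIPIENTS_PER_VARIANT:.0%}")
# The variant that lost the test now shows the higher open rate in your reports.
```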
Why does the winner have a lower click rate?
Only open rates factor into A/B testing, not click rates.
We display the click rate for each variant for your reference, but it does not affect which subject line wins the test.