How to A/B test subject lines

Learn how to A/B test your Broadcast subject lines, read the reports provided, and find answers to frequently asked questions.


In Kit, you can test up to five Broadcast subject lines to determine which version your recipients find most appealing.

Testing your subject lines and sending the winning variant is a great way to increase your emails' open rates and engagement. Before setting up your A/B tests, we recommend understanding the factors covered below, such as recipient count and test duration, so you get results you can be confident in.

NOTE: Testing more than two subject lines is only available on the Creator Pro Plan.

How to A/B test your subject lines

To set up a subject line A/B test for your Broadcast, click the flask icon next to the subject line in the Broadcast editor.

Alternatively, turn on the Run an A/B Test toggle in the right sidebar.

Fill out your subject line variants in the window that pops up.

Fields for two subject line variants are provided by default. If you're on the Creator Pro Plan, you can test up to five subject line variants by clicking + Add another subject line.

Click Continue to go to the Settings tab and set your A/B test duration.

Your test can run between 30 and 240 minutes, with 240 minutes being the default. Click the curved arrow icon to use this default duration.

Otherwise, use the slider to adjust your test duration in 30-minute intervals, or type a custom test duration into the minutes field on the right.
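If you want to sanity-check a duration value against these constraints, here's a tiny helper of our own (not part of Kit or its API) that clamps a requested duration to the allowed 30-240 minute range and optionally snaps it to the slider's 30-minute steps:

```python
def clamp_test_duration(minutes: int, snap_to_slider: bool = False) -> int:
    """Clamp a requested A/B test duration to the allowed 30-240 minute range.

    With snap_to_slider=True, also round to the slider's 30-minute steps
    (the minutes field accepts custom typed values).
    """
    minutes = max(30, min(240, minutes))
    if snap_to_slider:
        minutes = round(minutes / 30) * 30
    return minutes

print(clamp_test_duration(300))                       # 240 (clamped to the maximum)
print(clamp_test_duration(100, snap_to_slider=True))  # 90 (nearest slider step)
```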

Hit Save to save your A/B test settings. You can then send or schedule the Broadcast, and the test will run automatically.

To edit your A/B test settings, click the flask icon next to the subject line or the Edit A/B test button in the right sidebar.

If you change your mind about running an A/B test, you can turn it off by disabling the Run an A/B Test toggle.

How does the A/B test work?

If you're testing two subject lines:

  • We'll send two versions of your Broadcast with different subject lines to two 15% segments of your recipients (i.e., 30% of your total recipients).

  • After the testing period, we'll determine the subject line with the higher open rate.

  • We'll then automatically send this winning variant to the remaining 70% of your recipients (i.e., those not involved in the initial test).

If you're testing three to five subject lines:

  • We'll send three, four, or five versions of your Broadcast with different subject lines (as relevant), with each version going out to approximately 7.7% of your recipients.

  • After the testing period, we'll determine the subject line with the highest open rate.

  • We'll then automatically send this winning variant to the remaining recipients not involved in the initial test.
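As a rough illustration of the mechanics above, here's a minimal Python sketch of the split-and-send logic. This is not Kit's actual implementation; the function names and the exact rounding of segment sizes are assumptions made for this example.

```python
import math
from dataclasses import dataclass

@dataclass
class Variant:
    subject: str
    sends: int = 0
    opens: int = 0

    @property
    def open_rate(self) -> float:
        return self.opens / self.sends if self.sends else 0.0

def plan_test_segments(total_recipients: int, num_variants: int) -> tuple[int, int]:
    """Return (recipients per test variant, recipients held back for the winner).

    Two variants get 15% of recipients each; three to five variants get
    roughly 7.7% each, per the percentages described above.
    """
    share = 0.15 if num_variants == 2 else 0.077
    per_variant = math.floor(total_recipients * share)
    held_back = total_recipients - per_variant * num_variants
    return per_variant, held_back

def pick_winner(variants: list[Variant]) -> Variant:
    """After the test window closes, the variant with the highest open rate wins."""
    return max(variants, key=lambda v: v.open_rate)

# For 10,000 recipients and two variants: 1,500 per variant, 7,000 held back.
print(plan_test_segments(10_000, 2))  # (1500, 7000)

winner = pick_winner([Variant("Subject A", 1500, 300), Variant("Subject B", 1500, 285)])
print(winner.subject)  # Subject A (a 20% open rate beats 19%)
```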

How many recipients should you have for an A/B test?

If you want to run an A/B test, we recommend having the following number of recipients:

  • 2 subject lines: at least 500 recipients

  • 3 subject lines: at least 750 recipients

  • 4 subject lines: at least 1,000 recipients

  • 5 subject lines: at least 1,250 recipients

Tests run with fewer recipients than these will not give actionable data, as the results will not be statistically significant.

For example, if you're A/B testing two subject lines for a Broadcast sent to 100 recipients, each subject line variant would go out to only 15 people (15% of recipients) for the initial test. Even if your email has a high open rate of ~50%, this still means only seven or eight subscribers might open each variant.

The result? That "winning" subject line could be determined by the whims of a single person. One individual's decision to open an email (or not) will not provide meaningful insights you can apply to your wider list.
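You can sanity-check this yourself with a standard binomial margin-of-error calculation. The sketch below is our own back-of-the-envelope math (a normal approximation), not Kit's methodology:

```python
import math

def open_rate_margin(sample_size: float, open_rate: float, z: float = 1.96) -> float:
    """Approximate 95% margin of error for an observed open rate
    (normal approximation to the binomial distribution)."""
    return z * math.sqrt(open_rate * (1 - open_rate) / sample_size)

# 15 test recipients per variant (the 100-recipient example above):
print(round(open_rate_margin(15, 0.50), 2))   # ~0.25 -> +/- 25 percentage points
# 75 per variant (15% of the recommended 500-recipient minimum):
print(round(open_rate_margin(75, 0.50), 2))   # ~0.11 -> +/- 11 percentage points
```

With only 15 recipients per variant, the observed open rate can swing roughly 25 points either way, so a "win" of a few percentage points is indistinguishable from chance.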

How long should your A/B test be?

Our suggested A/B test duration is 240 minutes, which you can use by clicking the curved arrow icon in the duration setting. But these pointers may be useful if you want to set your own duration:

  • If you're sending to a smaller list, we suggest setting a longer A/B test duration to give more recipients time to open your email. You will get a more meaningful test result this way.

  • That said, if your email is more time-sensitive, set a shorter A/B test duration. This is so that all recipients receive your email as close to the initial send time as possible.

Experiment with different test durations to learn what works best for you.

A/B test reporting

The Analytics page of your Broadcast's reports will show the A/B test results for each variant. Each variant's stats will continuously update as your test goes on.

After the test is complete, the winning variant will be highlighted in green and display a trophy icon.

The aggregate stats for your Broadcast—as sent to all recipients—will also become available on the Overview page at the end of the test.

FAQs

Where in Kit is A/B testing available?

A/B testing is available for Broadcast subject lines at this time.

Can I end an A/B test early?

Yes. To do this, go to the Broadcast's Analytics report.

Click the three vertical dots on the right, followed by Cancel A/B test.

What happens if I cancel my A/B test?

If you cancel your A/B test before it completes, your Broadcast will have been sent only to your test recipients.

To send the Broadcast to your remaining recipients, first go to the Recipients page to view the subscribers who have already received the Broadcast.

Select all of these subscribers, and then add a temporary Tag (e.g., a Tag called "Cancelled A/B test") to them.

Duplicate the Broadcast, then send it to your full original recipient list, excluding the subscribers who have the temporary Tag you just created.

You can delete the temporary Tag after that.
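Conceptually, the re-send audience is just a set difference: everyone in the original recipient list minus the tagged test recipients. Here's a tiny runnable illustration with made-up addresses:

```python
# Made-up addresses for illustration only.
all_recipients = {"a@example.com", "b@example.com", "c@example.com", "d@example.com"}
cancelled_ab_test = {"a@example.com", "b@example.com"}  # subscribers with the temporary Tag

# The duplicated Broadcast goes to everyone except the tagged test recipients:
resend_to = all_recipients - cancelled_ab_test
print(sorted(resend_to))  # ['c@example.com', 'd@example.com']
```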

Why does the winner have a lower open rate?

At the end of the A/B test, our system will automatically send out the subject line variant with the highest open rate at that time to the remaining recipients.

However, a losing variant can overtake the winner later on, as your test recipients may continue opening the test emails after the test period has ended.

We continue updating the isolated stats for each test variant even after the test is over. As a result, it's possible for the A/B test winner to ultimately have a lower open rate after the other variant(s) overtake it.
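Here's a made-up numeric example of how a losing variant can overtake the winner after the test window closes:

```python
sends = 1500  # hypothetical test sends per variant

# Open counts when the test window closed:
opens_a, opens_b = 300, 285  # A: 20.0% beats B: 19.0% -> A goes to everyone else

# Test recipients keep opening afterwards, and the per-variant stats keep updating:
opens_a, opens_b = 330, 360  # A: 22.0% vs. B: 24.0% -> the "loser" now looks better

print(opens_a / sends, opens_b / sends)  # 0.22 0.24
```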

Why does the winner have a lower click rate?

Click rates do not factor into A/B testing—only open rates.

We display the click rate per variant for your reference. However, it will not factor into which subject line wins the test.
