Split Testing (A/B Testing)
  • 03 Apr 2024
  • 1 minute read


Article Summary

Split testing in Deliver allows for the creation and testing of two different versions of a message to see which results in more opens or clicks. Split testing is available only for outreach mailings; it is not available for system emails customized in Deliver.

To set up a Deliver mailing for split testing:

  1. Click Deliver in the top navigation bar.

  2. Click New Mailing.

  3. After setting up a recipient list, click Edit Message, and craft the first version of the message to be tested. Select "Version 1" in the dropdown next to the subject field. Save the message.

  4. Next, select "Version 2" from the dropdown. The subject field and message body will be cleared. Craft the second version of the message to be tested and save the message.

  5. Once both versions of the message are configured, an icon will appear next to the message preview modes. Use this icon to toggle between the two versions.

  6. Click the "Send Mailing" button to configure send settings for the mailing. Split Testing settings are visible only when two versions of the mailing have been created:

    • Split Testing: Choose to send a split test sample, Version 1 only, or Version 2 only of the mailing.

    • Sample Size: Set a sample size for the mailing test.

    • Sample Period: Define how long you want the mailing test to run, from a few hours to multiple days. 

    • Action after sample period: Choose whether to deactivate the mailing after the sample period ends. If the mailing is deactivated, the statistics area of the Deliver mailing shows which version received the higher unique click rate, to help decide which version to send to the remaining recipients.
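The comparison described above boils down to a simple calculation on the statistics Slate reports. As an illustration only (the function name and the counts below are hypothetical placeholders, not part of Slate), the winning version is the one with the higher unique click rate:

```python
def unique_click_rate(unique_clicks, recipients):
    """Unique click rate = recipients who clicked at least once / recipients."""
    return unique_clicks / recipients

# Hypothetical sample results for the two test versions.
v1 = unique_click_rate(unique_clicks=48, recipients=500)   # 0.096 → 9.6%
v2 = unique_click_rate(unique_clicks=63, recipients=500)   # 0.126 → 12.6%

winner = "Version 1" if v1 > v2 else "Version 2"
print(f"Version 1: {v1:.1%}  Version 2: {v2:.1%}  -> send {winner}")
```

With these example numbers, Version 2 wins and would be sent to the remaining recipients. Note that small samples can produce noisy rates, which is one reason to size the sample and sample period generously.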

Is split testing only available for campaign messages?

No, split testing may be used for both campaigns and one-off or individual Deliver mailings.

Tips

Split testing in Slate is not multivariate testing, so limit the differences between the two versions to a single variable. Varying one element, such as the subject line, makes it easier to pinpoint which part of the message is driving higher clicks or opens, and to work out what is or is not working for a specific communication.

