
How to Run an A/B Test in Outreachly

A detailed article on A/B testing using campaigns in Outreachly

Written by Joudy | Outreachly
Updated over 3 months ago

A/B testing is a great way to optimize your outreach campaigns, ensuring you use the best-performing messaging to increase engagement and conversions.

This guide will show you how to run an A/B test manually by duplicating a campaign, introducing a variable, and moving prospects.

Steps to Run a Manual A/B Test in Outreachly

1. Duplicate the Campaign

  • In your Outreachly dashboard, find the campaign you want to test.

  • Click Duplicate to create an exact copy of the campaign.

  • Name the two campaigns clearly, for example "Campaign A - Subject Line 1" and "Campaign B - Subject Line 2."

2. Add Your Variable

  • Decide which single element you want to test. This could be:
    ✅ Subject line
    ✅ Email body content
    ✅ Call-to-action
    ✅ Sending time
    ✅ Personalization style

  • Update only one variable in Campaign B to accurately measure its impact.

3. Move Prospects Between Campaigns

  • Add your data to one campaign, then split your prospects evenly and randomly between Campaign A and Campaign B (a short sketch for splitting a CSV export this way follows after this step).

  • For a fair comparison, ensure each campaign has a similar mix of lead types, industries, or engagement history.

  • Follow this Guide on Moving Prospects.
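
Outreachly's prospect-moving tools handle the transfer itself. If you prefer to prepare your list outside the platform first, for example from a CSV export, a random, even split might look like the minimal sketch below. The file name and column layout are assumptions for illustration, not an Outreachly requirement.

```python
import csv
import random

# Hypothetical file name -- adjust to match your own prospect export.
SOURCE_FILE = "prospects.csv"

with open(SOURCE_FILE, newline="") as f:
    reader = csv.DictReader(f)
    prospects = list(reader)
    fieldnames = reader.fieldnames

random.shuffle(prospects)       # randomise the order so the split is unbiased
midpoint = len(prospects) // 2  # even halves (B gets the extra prospect if the count is odd)

for filename, group in [("campaign_a.csv", prospects[:midpoint]),
                        ("campaign_b.csv", prospects[midpoint:])]:
    with open(filename, "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(group)
```

Because the shuffle is random, each half should end up with a similar mix of lead types and industries, which keeps the comparison fair.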

4. Launch the Campaigns

  • Start both campaigns at the same time to maintain consistency in results.

  • Monitor key metrics such as open rates, reply rates, and conversion rates over a set period.

5. Analyze the Results

  • After a set period (e.g., one week), compare Campaign A and Campaign B on the metrics below (a short sketch for calculating these rates from raw counts follows this list):

    • Open rates (which subject line performed better?)

    • Reply rates (which email sparked more responses?)

    • Click-through rates (if links were included)

    • Conversion rates (did one version lead to more booked meetings or sales?)

  • You can use the winning version for future campaigns to improve outreach effectiveness.
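
Outreachly reports these metrics in the campaign dashboard. If you want to lay the raw counts side by side yourself, a minimal sketch might look like this; the numbers below are placeholders, not real results.

```python
# Placeholder counts -- replace with the figures from your own campaign reports.
campaigns = {
    "Campaign A": {"sent": 500, "opened": 210, "replied": 45, "clicked": 60, "converted": 12},
    "Campaign B": {"sent": 500, "opened": 260, "replied": 58, "clicked": 74, "converted": 19},
}

for name, c in campaigns.items():
    sent = c["sent"]
    print(f"{name}: "
          f"open rate {c['opened'] / sent:.1%}, "
          f"reply rate {c['replied'] / sent:.1%}, "
          f"click-through rate {c['clicked'] / sent:.1%}, "
          f"conversion rate {c['converted'] / sent:.1%}")
```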

🚀 Success! By running manual A/B tests, you can refine your outreach strategy, boost engagement, and maximize results in every campaign. Try it today and start optimizing like a pro!


Frequently Asked Questions (FAQ):

Q: Can I test multiple variables at once?
A: For accurate results, test one variable at a time. It’s hard to know which change made the difference if you test more than one.

Q: How long should I run the test?
A: A good rule of thumb is to keep the test running until you have a statistically significant number of responses. This is likely more than you think! See the sketch below for one way to check.
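
One common way to check is a two-proportion z-test on the reply (or open) counts: if the p-value is above roughly 0.05, the difference you are seeing could easily be random noise, so keep the test running. The sketch below uses only Python's standard library, and the counts are placeholders.

```python
import math

def two_proportion_p_value(successes_a, total_a, successes_b, total_b):
    """Two-sided p-value for the difference between two proportions (pooled z-test)."""
    p_a = successes_a / total_a
    p_b = successes_b / total_b
    pooled = (successes_a + successes_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    return math.erfc(abs(z) / math.sqrt(2))  # 2 * (1 - standard normal CDF of |z|)

# Placeholder reply counts: 45/500 replies for A vs 58/500 for B.
p = two_proportion_p_value(45, 500, 58, 500)
print(f"p-value: {p:.3f}")  # below ~0.05 suggests a real difference; above it, keep testing
```

With these example counts the p-value comes out around 0.18, so even a gap that looks meaningful on the dashboard would not yet be a reliable winner.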

Q: What if both versions perform the same?
A: If there’s no clear winner, test a new variable or try a different approach. Minor tweaks can make a big difference over time!

Q: What does the A/B Test Condition do in the Smart Campaigns?
A: This condition only splits prospects into two different routes, and reporting on the outcome is limited, so we recommend duplicating the campaign as the best way to conduct an A/B test.
