Your email list is only as valuable as your ability to convert customers from it.
Research abounds about email’s ROI potential. Every time a consultancy or agency publishes an ROI study on marketing channels, businesses consistently rank email high on the list.
How are marketers extracting so much value from this channel? It’s not like email is cutting-edge technology. The medium has been around for over 40 years, and in 2020 you still can’t reliably embed a video into an email.
But you can test each and every element of email marketing with relative ease. In fact, that’s the secret: if you want to turn your email marketing into a gold mine, you need to be obsessively running A/B tests.
That’s what all those high-performing marketers are reporting in those yearly channel surveys. And it’s what you should be doing too.
In this article, we’ll cover three high-impact A/B tests you should start running as soon as humanly possible.
How to think about A/B testing
Not all A/B tests carry the same impact potential.
Not every hypothesis is bold enough to generate a significant difference in results. It’s all too easy to deploy tests that never produce pivotal insights, the kind that change how you do email marketing.
You don’t want that, because it multiplies the number of tests you’ll need to run before you arrive at truly valuable conclusions.
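The cost of a timid hypothesis can be quantified. The smaller the lift you expect a test to produce, the more recipients you need before the result means anything. Here’s a minimal sketch of the standard sample-size formula for comparing two proportions, using only Python’s standard library; the open rates in the example are hypothetical:

```python
from math import ceil, sqrt
from statistics import NormalDist


def sample_size_per_variant(p1, p2, alpha=0.05, power=0.8):
    """Recipients needed in EACH variant to reliably detect a lift
    from baseline rate p1 to expected rate p2 (two-sided z-test)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)


# A bold hypothesis (lifting open rate from 20% to 25%)
# needs roughly 1,100 recipients per variant.
bold = sample_size_per_variant(0.20, 0.25)

# A timid one (20% to 21%) needs roughly 25,000 per variant.
timid = sample_size_per_variant(0.20, 0.21)
```

A hypothesis that only moves the needle by a point forces you to send to a list more than twenty times larger before you learn anything, which is exactly why weak tests pile up.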
How do you get to the good stuff? Ask yourself these two questions every time you’re formulating your next test:
- Will this test unearth insights about our audience that we can apply to other scenarios?
- Will this test deliver an increase in revenue, demo requests, or lead volume?
If the answer is no to both of those questions, then you need to shelve that idea and think of something better.
With those criteria in mind, let’s move on to the three tests you need to work on this week.
3 A/B tests to run right now
1) Transactional email CTAs
Transactional emails are those messages we all receive when we do something important on the web — like place an ecommerce order, schedule a flight, or download an eBook.
(If you’re not in B2C, you probably call these “thank you” emails.)
They’re the most common type of email because people crave confirmation after they take action online. Transactional emails close the loop by saying “Thanks for doing that. Here’s proof that it happened.”
A delicious example of a transactional email via Really Good Emails
The problem is most marketers aren’t making the most of transactional emails. They send the standard confirmation and leave it at that. Huge mistake.
People almost always pay attention to transactional emails, and your audience is already thinking about your brand because they just converted somewhere on your site.
That means it’s a prime opportunity to include a new call to action. What should that call to action be? That’s exactly what you’ll be testing.
To figure out what CTA to include, think about what the most logical next step would be based on the actions the person just took.
This varies based on your industry. B2B marketers typically need to nurture leads for a lot longer, so their CTAs will reflect that.
When someone downloads a HubSpot eBook, HubSpot doesn’t ask them to sign up for an account right away. Instead, it offers a content upgrade that shows how HubSpot can help customers achieve the results outlined in the eBook.
Meanwhile, Casper includes a referral CTA at the bottom of their transactional email that encourages customers to refer their friends and get some money back.
To run this test, think about what type of thank you / transactional emails you send most often. Then isolate what action makes the most sense for those recipients to take next. Now you’re ready to put this test into action.
2) Short copy vs long copy
A lot of marketers who aren’t copywriters are certain that short copy will win over long copy every time.
The problem is that this logic doesn’t always hold up, and there are case studies to prove it. Copyhackers’ case study about 3.5xing Wistia’s paid conversions from email is a perfect example. When Joanna first showed the test version to Wistia, they immediately objected with “that copy is longer than we’re used to.”
And look what happened.
Sometimes people don’t act because they aren’t adequately persuaded. That calls for longer, more detailed copy.
The question is what should you include in that extra copy? Yep, that’s what you’ll be testing.
Common examples of additional copy include:
- Customer testimonials
- A longer list of all the features or benefits associated with your product or service
- Stories that explain, in detail, how your product or service helped people overcome a problem or achieve their goal
- Stats and data that reinforce the need for your product or service
- Frequently asked questions (with answers)
Here’s an example from The Nue Co., an ecommerce brand in an industry that’s prone to short copy. Check out what this company does instead:
Rather than just having a well-designed image and a brief description of the product, The Nue Co. adds some extra copy to clarify the problem their products are solving.
If you’re not sure what to add with your extra copy, do a quick survey at the end of the nurture campaign you want to optimize. Make it one question, and ask “What’s stopping you from doing x?” (In this context, “x” is whatever action you want people to take.)
Use the answers from that survey to uncover what’s holding people up, and then test some additional copy and see if your results improve.
3) Subject lines
Subject lines determine whether your emails get opened at all. That makes testing them one of the most important experiments you can run as an email marketer.
But how, exactly, should you go about testing those single lines of copy? Here are a few places to start.
A. Short vs medium vs long subject lines
Copywriters have been known to agonize over subject line length. The current evidence suggests both short and long subject lines work better than average length ones (average being about 44 characters).
Which type does your audience prefer? You should probably test it.
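When you run that test, don’t just eyeball the open rates; check whether the difference is statistically significant before declaring a winner. A minimal sketch of a two-proportion z-test, using only Python’s standard library (the send and open counts below are made up for illustration):

```python
from math import sqrt
from statistics import NormalDist


def open_rate_test(opens_a, sends_a, opens_b, sends_b, alpha=0.05):
    """Two-sided z-test for a difference between two open rates.
    Returns (p_value, is_significant)."""
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    # Pool the two variants to estimate the standard error
    # under the null hypothesis of no real difference.
    p_pool = (opens_a + opens_b) / (sends_a + sends_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_value, p_value < alpha


# Hypothetical results: the short subject line was opened 520 of
# 2,000 times, the long one 450 of 2,000 times.
p_value, significant = open_rate_test(520, 2000, 450, 2000)
```

With those numbers the difference clears the 5% significance bar. A gap of only 20 opens on the same list size would not, so let the test keep running until you have enough volume to call it.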
B. Specificity vs curiosity
There’s a smorgasbord of subject line formulas to choose from. However, you should focus on two in the beginning: specific subject lines versus curiosity-driven subject lines.
In a way, this is a version of the short versus long subject line test above.
Specific subject lines describe the contents of the email in detail. These are usually classic, value-driven pieces of copy.
Meanwhile, curiosity-driven subject lines are typically shorter and less specific. As people have grown accustomed to classic marketing headlines that tout “5 ways to [insert promise here],” marketers have switched to brief headlines that pique the recipient’s curiosity.
The copywriters at The Hustle are masters of this technique.
Lines like “Big trouble at Tupperware,” “Elevators as a service,” and “Tag, you’re rich” do just enough to pique readers’ curiosity.
Wishpond ran a version of this test where they compared “Halloween costume content” with “How to get 1000s of new leads this Halloween.”
The shorter, less specific headline won by a landslide.
C. Personalization
Personalization works in email marketing. There’s plenty of evidence to suggest that using someone’s first name or the name of their business in a subject line will increase open rates.
You just have to figure out what type of personalization will work best for your audience. Is using someone’s first name as good as it gets? Or perhaps you can get other information about them — like their past purchases on your site — and send them an entire email series about other related products.
Either way, make sure you’re using personalization in some form or fashion. It works.
A/B testing in email marketing is a low-risk, high-reward scenario, which isn’t something you come across that often as a marketer.
There’s a veritable library of information about A/B testing, but working through all that data will just slow you down in the beginning.
Your best bet is to start with the three tests outlined in this article, get some insight and quick wins, and then expand your testing program to incorporate more complex experiments.