I’m not a fan of how many email marketers do subject line testing.
For many, it’s almost an afterthought: the easiest way to say you’re testing without spending much time, effort, or thought.
But when you put in the time, effort, and thought that a valuable test deserves, you can get some big bottom-line returns. This case study is an example of that.
A lot is written about short versus long subject lines; I don’t look at subject lines that way. I focus on the content, because in my experience it’s the content, and its placement, which makes a successful subject line, not the length.
Our starting point was the previous year’s subject line and preheader text, also known as our ‘control.’
Not bad, right? But could they be better?
One of the things I always look at is the first 25 characters of the subject line and preheader text, because this is all you can guarantee your recipients will see. You want to make sure that the copy here is strong enough to engage people, to get them not just to open, but to take action on the content in the email.
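That 25-character cutoff is easy to eyeball in code. Here’s a minimal sketch in Python; the subject line and preheader strings below are hypothetical placeholders in the spirit of the campaign, not the actual copy:

```python
# Preview what recipients are guaranteed to see: only the first
# 25 characters of the subject line and preheader text.
# (These strings are hypothetical placeholders, not the real campaign copy.)

GUARANTEED_CHARS = 25

def visible_preview(text: str, limit: int = GUARANTEED_CHARS) -> str:
    """Return the portion of the copy guaranteed to be displayed."""
    return text[:limit]

subject = "Our new Gift Sets are here, plus Value Packs"
preheader = "Save time & money with budget-friendly options"

print(visible_preview(subject))    # → "Our new Gift Sets are her"
print(visible_preview(preheader))  # → "Save time & money with bu"
```

Notice how “Value Packs” never makes it into the guaranteed window of the placeholder subject line – exactly the problem described below.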
The first 25 characters of the control subject line and preheader text are highlighted below.
Once again, not bad.
But the subject line is focused on the sender – it begins with “Our” – not the recipient. Subject lines should speak to the recipient; they should tell them what’s in it for them if they open this email.
And the preheader text is a bit wordy. While “save time & money” isn’t bad, the “budget-friendly” phrase toward the end is probably a more relevant one to use when talking about gift-giving.
Also, the “value packs” get a raw deal in both the subject line and the preheader text – they fall outside the first 25 characters in both, and as a result they’re unlikely to be seen.
So, what did we write to test against these controls? See below.
For the subject line I moved the products to the beginning – “Gift Sets & Value Packs” – to be sure they would both be seen. I also removed the focus on the sender – “Our new” – and instead focused on the giving – “Holiday Giving” – since that’s what the recipient will be doing.
Since we focused on what the products are in the subject line, I used the preheader text to present the value proposition. I shortened it to “Spend less, give more” because that seemed easier to skim and I wanted to focus, once again, on the idea of giving.
Here are the test and control with the first 25 characters highlighted:
I thought the test would beat the control; I projected a 10% lift in revenue-per-email (RPE). Actually, for this client, we used revenue-per-thousand-emails (RPME) to make the numbers larger and easier to eyeball.
Why just 10%?
Subject lines don’t tend to deliver very large lifts in performance. They are the start of the interaction with the email; there are many additional steps that the recipient needs to take before they convert. Testing elements closer to the conversion, like landing pages, tends to deliver larger lifts in RPE/RPME.
Why did I use RPME as a KPI, instead of open rate?
Because we’re looking for more than an open here. We want the recipient to open, click, then purchase. Just getting them to open isn’t enough. You should always use a business metric, one that reflects your bottom-line goal, when you do A/B split testing. Otherwise, you aren’t really optimizing what matters.
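For anyone who hasn’t used it, RPME is simply revenue normalized per thousand emails sent. A quick sketch – the function name and the figures are illustrative, not the client’s actual numbers:

```python
def rpme(total_revenue: float, emails_sent: int) -> float:
    """Revenue per thousand emails sent (RPME)."""
    return total_revenue / emails_sent * 1000

# Illustrative figures only, not the client's actual data:
# $5,000 in revenue from a 100,000-email send.
print(rpme(5000.00, 100_000))  # → 50.0
```

Because revenue sits at the end of the open → click → purchase chain, RPME captures the whole interaction in a way open rate can’t.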
Here are the final results:
As you can see, the test lagged behind the control in open rate (-3%; this was before Apple’s MPP was launched), click-through rate (-13%), and click-to-open rate (-10%) – all diagnostic metrics, which don’t go directly to the bottom-line success of the campaign.
But the test bested the control when it came to RPME and conversion rate. The test generated more than $71 for each 1,000 emails sent, compared to the control which generated just under $50 for each thousand emails sent. That’s a boost of 43%; it’s not insignificant (this was a 50/50 split of the list).
You can also see what caused the lift in RPME by looking at the data. The test converted 66% more recipients to purchasers. This strong showing made up for the fact that the average order value (AOV) from the test version was a little less (-15%) than the control.
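The arithmetic here is worth sanity-checking. A short sketch, using approximations of the round figures above (the exact values aren’t published here):

```python
def lift(test: float, control: float) -> float:
    """Percentage lift of test over control."""
    return (test - control) / control * 100

# Approximate RPME figures: "more than $71" vs. "just under $50".
print(round(lift(71.43, 50.00)))  # → 43 (% RPME lift)

# RPME decomposes as conversions-per-thousand × average order value,
# so a large conversion lift can outweigh a modest AOV drop:
# a 66% conversion lift with a 15% lower AOV nets out to roughly
# 1.66 × 0.85 ≈ 1.41 – consistent with the ~43% overall lift.
print(round(1.66 * 0.85, 2))  # → 1.41
```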
But the benefit of this test does not stop with this send. Our test subject line and preheader text were based on two hypotheses which can be translated into formulas and used on future sends. They are:
By using these formulas with future sends, we are able to expand the value of the test and improve pretty much every email we send going forward.
How do you approach subject line testing? If it’s not like this, give it a try and let me know how it goes.
Be safe, stay well,
Meet the Author
Jeanne Jennings is a recognized expert in the email marketing industry and a sought-after consultant, speaker, trainer and author specializing in email marketing strategy, tactics, creative direction, and optimization. She helps organizations make their email marketing programs more effective and more profitable.
Jeanne is Founder and Chief Strategist at Email Optimization Shop, a boutique consultancy focused on optimizing bottom-line email marketing performance with strategic testing. She is also General Manager of the Only Influencers community of email marketing professionals, Chair of the annual Email Innovations Summit conference, and an Adjunct Professor in the graduate program at Georgetown University.
Her direct response approach has helped B2B, B2C, government and non-profit clients including AARP, Capital One, Consumer Reports, Hasbro, National Education Association, Network Solutions, New York Times, PayPal, Scholastic, UPS, US General Services Administration (GSA), Verizon, Vocus (now Cision), and the World Bank.
Jeanne is based in Washington, DC. She earned her MBA from Georgetown University (Hoya Saxa!), and she is an avid hockey fan (Let’s Go Caps!).