What if I told you that you could improve the open rate and engagement of your nonprofit emails and newsletters?
Would you want to?
Of course you would.
A/B testing, or the more advanced version, multivariate testing, allows you to answer questions about your emails like:
- Are my subscribers more likely to open an email with subject line A, or subject line B?
- Would my subscribers respond better to images in the email or just text?
- What call to action will inspire the most clicks, donations, or sign ups?
When you have the answers to these questions you can consistently increase the opens, the engagement, and ultimately the donations and impact your organization can have.
What is A/B Testing for Nonprofit Emails?
A/B testing is the science of sending 2 or more almost-identical emails, differing in only 1 element, to see which performs better. Once you have the results, you know the winning variable and can pit that winner against another alternative.
Each time you find the best option you’ve just learned something new about your readers and how you can get more value from your nonprofit email campaigns. Testing over and over with different elements can make a big impact over time.
How is Multivariate Testing Different Than A/B Testing?
An A/B test varies a single element, but you might have 2, 3, or 4 options for that element, which would actually make it an A/B/C/D test.
Testing only 1 element at a time can be a slow process, though, and if you change 2 elements in a single A/B test you will get misleading results. For example:
- Email 1 had Subject Line A and Image A
- Email 2 had Subject Line B and Image B
If the second email did better, you won't know whether it was the subject line or the image that caused the increase in engagement.
Multivariate testing allows you to create enough variations that every element change gets a conclusive test – even within the same campaign. So the proper way to run the example above would be to send 4 emails instead of 2:
- Email 1: Subject Line A and Image A
- Email 2: Subject Line B and Image A
- Email 3: Subject Line A and Image B
- Email 4: Subject Line B and Image B
Now looking at the results of these 4 variations you can determine the impact of the subject line change and the image change – and see directly the winning combination. This speeds up the process because you can test multiple elements at the same time without having to wait for your next campaign to test the next element.
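For those who like to see the combinatorics, the full-factorial expansion above can be sketched in a few lines of Python. The element names and variant labels here are illustrative, not tied to any email platform:

```python
from itertools import product

# Hypothetical elements to test: each key is an email element,
# each value is the list of variants for that element.
elements = {
    "subject": ["Subject Line A", "Subject Line B"],
    "image": ["Image A", "Image B"],
}

# Build every combination of variants (the full factorial design).
names, options = zip(*elements.items())
variations = [dict(zip(names, combo)) for combo in product(*options)]

for i, v in enumerate(variations, start=1):
    print(f"Email {i}: {v['subject']} / {v['image']}")
```

Two elements with 2 options each gives 4 emails; adding a third 2-option element doubles that to 8. That explosion is exactly why a platform that manages multivariate tests for you becomes essential.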
However, it only saves time if you have a platform that can manage all those variations for you, because that's a lot of emails. MailChimp does a good job at both simple A/B testing and multivariate testing.
4 Elements to A/B Test in Nonprofit Email Campaigns
1. Sender Name and Address
Your open rates depend on 4 things: your sender name and address, your subject line, the day/time sent, and the expectation/brand you've built up in your reader's mind. The brand perception portion is hard to control, and that's a larger discussion about strategy and marketing, so the most obvious place to start is the sender information.
You might think, "We just send it from 'firstname.lastname@example.org' and that's it." But you might be better off sending it from your ED's personal email, from a staff member, or from "OrgName News". You won't know until you try.
People judge a book by its cover, and everything counts. If you do decide to send the email from a person, their actual name might even play a role in the perception of the reader.
2. Subject Line
Step 2 in affecting your open rates is the subject line. Obviously, it has to relate to the content and context of what you’re emailing about, but there is so much to think about with subject lines.
You can test a short and vague subject line like "Update from this month" against something longer and descriptive like "How your donations helped us support 110 people last month". The options are endless – which is both a good thing and a bad thing.
3. Email Content
This is a broad category. It encompasses everything in your email from salutations to image selections, content, call to action, button text or colours, and much more. It’s all fair game to test. One thing to keep in mind is that this content will not affect your open rates at all (unless you’re sending a series of emails that are connected in some way, providing anticipation for the next). The goal of content optimization is to encourage better engagement, which would be clicks, then ultimately sign-ups, downloads, volunteers etc. after the click.
4. Sending Time
There have been many studies about the best email sending time, and many of them agree that Tuesdays and Thursdays between about 8 am and 10 am are the best. But this DOES NOT mean it's the best time for your readers. You might have busy parents reading your emails who can't get to them until after their kids are in bed. There could be a ton of reasons why weekends, or any other time, would work better for your audience. Testing is key.
How to A/B Test Your Next Nonprofit Email
Thankfully, this work doesn’t need to be done manually anymore. Any reputable email sending program will have some form of A/B testing built in. If you know me, you know I’m all about MailChimp, and this is another area where they excel.
The process is incredibly simple, and the results easy to read. Just follow the instructions outlined by MailChimp.
As part of this process, most platforms let you select how much of your list to use for the test. The platform then automatically picks the best-performing email based on a metric you choose and sends it to the remainder of the list. This is great for getting the most from your campaigns.
The other option, for smaller email lists, is using your entire list for the test and using the insights to improve your next campaign.
Is A/B Testing for Your Nonprofit?
Unfortunately, A/B testing loses its effectiveness with smaller email lists. MailChimp suggests that each email in a test be sent to over 5,000 addresses, so a two-variation A/B test would require a list of 10,000. For many of you, this will be far outside your range. Not to worry: you can still run test campaigns, but you may want to send the campaign to the entire list split 50/50 instead of using a "test" portion with the winner sent automatically to the remainder.
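As a rough sketch of what that full-list 50/50 split looks like under the hood (the subscriber addresses below are made up, and any real email platform handles this step for you):

```python
import random

def split_list(subscribers, seed=42):
    """Randomly split a subscriber list into two roughly equal halves
    for a full-list A/B test. A sketch only; platforms do this for you."""
    shuffled = subscribers[:]              # copy so the original list is untouched
    random.Random(seed).shuffle(shuffled)  # seeded shuffle for reproducibility
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

# Made-up list of 1,000 subscriber addresses
group_a, group_b = split_list([f"subscriber{i}@example.org" for i in range(1000)])
```

Randomizing before splitting matters: if you split your list in its stored order (say, by sign-up date), each half could behave differently for reasons that have nothing to do with your test.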
If you have a list of over 1,000 subscribers, A/B testing can still provide some decent insights. Below that, it gets harder to trust the results.
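If you want a quick sanity check on whether an observed open-rate difference is big enough to trust, a standard two-proportion z-test is one common approach. Here is a minimal stdlib-only Python sketch; the open and send counts are made up for illustration:

```python
from math import sqrt, erf

def two_proportion_z(opens_a, sent_a, opens_b, sent_b):
    """Return the z statistic and two-sided p-value for the difference
    between two open rates (standard two-proportion z-test)."""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Made-up results: 180 of 500 opens for version A vs 140 of 500 for version B
z, p = two_proportion_z(180, 500, 140, 500)
```

With these made-up numbers the test gives a z statistic of about 2.7 and a p-value well under 0.05, so the difference would be worth trusting. With a much smaller list, the same percentage gap would produce a larger p-value, which is exactly why small-list results are harder to trust.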
Either way, A/B testing is quite simple and can yield great results. There's no reason not to give it a try, especially for something as simple as a subject line.
Have you done A/B testing on your nonprofit emails? What did you learn from it?