Mining Un-engaged Prospects using a Multivariate A/B Approach


What are your testing strategies for getting your unopened, unengaged marketing emails opened?

Testing your subject line and content with multivariate A/B tests before a major deployment is necessary on many levels. It’s part of the tactical advantage you want to give a campaign before fully committing the remainder of your targeted delivery.

Before you seek that upper hand, consider what kind of data you’re targeting, exactly. A/B testing to a group of engaged openers (or people who’ve been historically known to participate in your calls to action) is inherently easier than testing to those who have yet to open an email.

Your engaged users have already established a reputable connection with your sending domain. It’s also likely that their click traffic and other downstream engagement data have been aggregated into a basic CRM profile after converting (you’re aggregating recipient interest profiles into a CRM database, right?).

So with that in mind, and for the purpose of this post, I want to focus on the prospects that can hold big potential: the non-open prospect emails in your email universe.

Don’t ignore your unengaged email prospects

This is not meant as a go-to resource that’s able to solve everyone’s prospecting issues. Instead, it is intended to highlight the slipshod approach that is oftentimes associated with unengaged email prospects, and to make a few recommendations on how to get those recipients to open up for the first time. Hopefully, this post can help serve up some variable adjustments to your multivariate A/B testing methods (or inspire you to begin testing), and help get you a better return on those unengaged assets.

Don’t get tunnel vision on your subject lines

Throughout the years I have attempted delivery to billions of non-open email prospects. I recommend a healthy testing phase prior to delivering to them. I’m always curious why clients regularly shoot this approach down.

The typical attitude is, “Well, if the subject line and content are good enough to get a response from the regularly engaged, then they should be good enough for the unengaged, too.”

Such an approach always makes me smile (while shaking my head, of course). And even when I finally convince someone to commit to an A/B test on their unengaged prospects, they tend to reject the multivariate approach, choosing only a strict A/B test, on only the subject line.

Many senders, quite frankly, disregard how their email’s content may hurt prospecting efforts, and instead believe the subject line is the sole reason a user chooses to open the email or not. Most of the time, marketers will just look at the open rates a couple of hours after sampling, as bad as they may be, and decide, “Yep, that one beat the other. Let’s use it.” What they should also be considering is how their HTML content and the segments tested are affecting their inbox placement.

When testing your subject line

In many ways, yes, I agree that an enticing subject line can make or break your open rate; however, so too can inbox visibility. Your subject line is irrelevant if your emails aren’t hitting the inbox. Aside from this, if you’re going to A/B test the subject line only, then my best recommendations to anyone prospecting non-open emails are to:

  • Pose a question in the subject line. Try it on the next A/B test: send one test sample with a subject line that makes a statement, and one with a subject line that asks a question. In almost every scenario the question wins, because you’re engaging the recipient by encouraging dialogue. Use a single question mark.
  • Personalize with localization. Merging the user’s name into the subject line doesn’t carry the impact you may think. Try merging their city, county or state name, instead.
  • Don’t start a subject line with Fwd: or RE:. This strategy is known as deceptive familiarity, and the idea that it helps get a foot in the door on open rates has been proven false time and time again. Sending emails with such verbiage implies deception, so ISPs view them negatively and will feed your email to the spam folder more readily than if you had not used these signifiers. Also, people don’t like being deceived; this behavior increases the likelihood of churn through spam complaints and unsubscribes.
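If you do run a subject-line-only split, the mechanics are simple. Below is a minimal sketch in plain Python (the variant texts and addresses are made-up placeholders) that hashes each address to assign it deterministically to variant A (statement) or B (question), so the same recipient always lands in the same cell across runs:

```python
import hashlib

# Hypothetical subject-line variants: a statement vs. a question.
VARIANTS = {
    "A": "Save 20% on your next order",
    "B": "Ready to save 20% on your next order?",
}

def assign_variant(email: str) -> str:
    """Deterministically assign a recipient to variant A or B by
    hashing the address, so repeat runs reproduce the same split."""
    digest = hashlib.md5(email.lower().encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Placeholder recipient list for illustration.
recipients = ["pat@example.com", "lee@example.com", "sam@example.com"]
split = {email: assign_variant(email) for email in recipients}
```

A hash-based split beats random assignment here because it keeps cell membership stable if you re-run the test or resume an interrupted deployment.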

Still, no matter what you do, some unengaged prospects will continue to be unengaged. No matter how many times you send to them, they will not open. Others might simply need to see your message land in their inbox; once that happens, they will sometimes open regardless of the subject line. When they do finally open and click, that engagement is recorded with the recipient’s ISP. Both your sending domain and the recipient’s email provider can then connect the dots. This “digital handshake” will greatly increase your chance of inbox placement on the next send, many times over.

Other variables to test

Try changing up some other variables in the form of slightly different versions of your HTML content. This may give you an edge on hitting the inbox. A couple of great ideas for creating these multivariate test samples:

  • Adjust the text-to-image ratio of the HTML content. Try letting the text weigh more heavily and relying on fewer images in one of the test samples. Nothing kills a campaign faster than a client who wants to send a single-image infographic email (one image containing all the text, with no stand-alone text) to recipients who’ve never opened. Nearly every one of those users will have images disabled at this stage of engagement, so if the only text in the email is an unsubscribe link, that is all they will see upon opening. Try putting some persuasive text in the content alongside your infographic to offer an incentive for enabling the images.
  • Defuse any strong language that would normally be distributed to your regular openers or activists. Save that stronger rhetoric for the landing pages, or plant it inside an image so the filters can’t read it. The sophisticated algorithms that major filters and ISPs use nowadays can be very touchy about the tone of your email text.
  • Create a segment based on historical delivery success to the non-open files, and use that segmentation as part of your multivariate environment. After all, if the non-open users are stale, they have very low potential for being delivered to successfully, and worrying about subject lines and HTML content becomes the least of your concerns.
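For the first idea, you can sanity-check a test sample’s text-to-image balance before it goes out. This minimal sketch uses Python’s standard-library HTMLParser to tally visible text characters against <img> tags; the sample markup is hypothetical:

```python
from html.parser import HTMLParser

class TextImageRatio(HTMLParser):
    """Rough text-vs-image tally for an HTML email body."""
    def __init__(self):
        super().__init__()
        self.text_chars = 0   # count of visible text characters
        self.image_count = 0  # count of <img> tags

    def handle_data(self, data):
        self.text_chars += len(data.strip())

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.image_count += 1

# Hypothetical single-image email with one line of persuasive text.
sample = ("<html><body><img src='infographic.png'>"
          "<p>Enable images to see our full offer.</p></body></html>")
parser = TextImageRatio()
parser.feed(sample)
```

If `text_chars` comes back near zero while `image_count` is nonzero, a never-opened recipient with images disabled will see essentially nothing.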

That segmentation tactic is an old trick I’ve been using for years. The point is to segment and test the non-open prospects based on who received their last 3/3, 2/3 or 1/3 emails. It’s also really interesting to analyze the data after this testing. Doing so would likely involve tabulating the data and creating a process for tracking delivery attempts.
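As a sketch of that trick, assuming you log delivery success per send, the following Python buckets non-openers by how many of their last three sends were delivered (the log structure and addresses here are hypothetical):

```python
# Hypothetical delivery log: for each address, whether each of the
# last three sends was successfully delivered (True/False).
delivery_log = {
    "a@example.com": [True, True, True],
    "b@example.com": [True, False, True],
    "c@example.com": [False, False, True],
    "d@example.com": [False, False, False],
}

def segment_by_delivery(log):
    """Bucket non-openers into 3/3, 2/3, 1/3 and 0/3 segments
    based on delivery success over their last three sends."""
    segments = {"3/3": [], "2/3": [], "1/3": [], "0/3": []}
    for email, history in log.items():
        segments[f"{sum(history)}/3"].append(email)
    return segments

segments = segment_by_delivery(delivery_log)
```

The 3/3 bucket is your safest test cell; the 0/3 bucket is a candidate for suppression rather than further testing.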

On this note, you may also consider permanently purging from your list anyone who hasn’t been successfully delivered to over the last year or so. If they haven’t received an email after a year’s worth of targeting, they are probably doing more harm than good; letting them hang out on your list and bounce on every send will trash your deliverability rates. Depending on your email marketing goals and your sales cycle, you may want to shorten that timetable to six or eight months, and/or suppress addresses after some number of soft bounces.
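That purge rule is easy to automate if you track the date of each address’s last successful delivery. The sketch below (standard-library Python; the cutoff and records are illustrative assumptions) splits a list into keep and purge sets on a 365-day threshold:

```python
from datetime import date, timedelta

# Assumed suppression rule: drop anyone with no successful delivery
# in the past 365 days; tighten toward ~180 days for shorter cycles.
PURGE_AFTER = timedelta(days=365)

# Hypothetical records: address -> date of last successful delivery.
last_delivered = {
    "keep@example.com": date.today() - timedelta(days=90),
    "purge@example.com": date.today() - timedelta(days=400),
}

def split_list(records, cutoff=PURGE_AFTER):
    """Return (keep, purge) lists based on last delivery date."""
    keep, purge = [], []
    for email, last_ok in records.items():
        (purge if date.today() - last_ok > cutoff else keep).append(email)
    return keep, purge

keep, purge = split_list(last_delivered)
```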

Lastly, give your test samples a little more time to marinate:

After the test samples have gone out, let the results trickle in for longer than usual before finalizing your strategy. Instead of waiting a few hours, come back the next morning and then evaluate the core metrics. Some recipients will not open within the first couple of hours of receiving, so give them more time.
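Once the results have settled, it’s worth checking that the winner actually won. A two-proportion z-test on open rates (standard-library Python; the counts below are purely illustrative) gives a rough read on whether the difference is real or just noise:

```python
import math

def two_proportion_z(open_a, sent_a, open_b, sent_b):
    """Two-proportion z-test on open rates.
    Returns (z, two-sided p-value) for the difference B - A."""
    p_a, p_b = open_a / sent_a, open_b / sent_b
    pooled = (open_a + open_b) / (sent_a + sent_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    # Normal CDF via math.erf; p-value is the two-sided tail mass.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative counts: 120/5000 opens for A vs. 165/5000 for B.
z, p = two_proportion_z(120, 5000, 165, 5000)
```

A small p-value (conventionally below 0.05) suggests the lift is unlikely to be chance; with the low open rates typical of non-open files, small samples often won’t reach significance at all, which is itself a reason to wait for more data.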

Final thoughts:

Who’s helping you run the multivariate A/B tests and deploy your email campaigns? Are you staffed totally in-house, or do you contract with an advertiser who controls the distribution of your emails based on the content you give them?

Either scenario can be both a blessing and a curse. The outcome is totally bent around the capability and experience of the team pulling the trigger and managing your account’s marketing strategy.

If your team is totally in-house, then the onus is on you to formulate creative solutions for the A/B testing. This means keeping an open mind and testing new approaches to multivariate A/B deployment, like those mentioned above. A curious mind will serve you well when prospecting to a non-opens list. So will some research, education and application of what you’ve learned through trial and error.

If you’re going to take anything away from this post, it should be due diligence. Formulating a test solution that works for you is incredibly important, but it can sometimes require an out-of-the-box frame of mind. A thoughtful, patient approach to multivariate A/B testing is a challenge to build, but it will definitely help when you’re mining those non-open prospects in an effort to harvest more conversions.


Act-On Marketing Action Blog