Response rate doesn’t help you optimize sequences. (Here’s what does.)

Posted February 16, 2021

By Meghan Donovan

Sales Sequence Specialist at Outreach

This article is part of our Outreach on Outreach content series, in which we showcase our own revenue team’s use of the Outreach Sales Engagement Platform to help you drive success at your own company. We share workflows and strategies, backed by original research and data from the results of our own experiments and customer base.

Looking to improve your email open and response rates? I have the perfect subject line for you to try on your next Outreach sequence:

Look out your window

Or, how about:

I just emptied your bank account

It’d certainly grab a prospect’s attention. You might even get a reply, too.

But you might also wind up on the receiving end of a restraining order. And you can forget about ever booking a sales meeting.

The point of this purposefully dramatic example is: on its own, your response rate is all but meaningless.

Sure, it may show whether your prospects are hitting the reply button — but it doesn’t tell you what they’re saying. Are they begging you for a contract or are they telling you they’ve called the police?

Armed with only your response rate, you can’t tell. That leads sales reps to the wrong conclusions and, ultimately, the wrong actions.

Our data science team found that when sales reps select email variants based on response rates, they choose underperforming options 40% of the time. In other words, your main metric for outbound email leads you to the wrong decision two times out of five.

Add a new layer to your measurement

If you want to drive next-level results, the most important thing isn’t how many replies drop into your inbox. It’s the sentiment of those replies, with a clear focus on your positive response rate.

If you’re new to sentiment, here’s a quick recap of a few key response categories:

  • Positive: It could be “Sure” or the magical reply that goes something like: “Funny you should email me. I was just thinking there has to be a better solution than what I’m doing today. When can we meet?”
  • Unsubscribe: Your prospect doesn’t want to hear from you. You’re completely out of the running.
  • Objection: Your prospect replied to your email only to let you know that they don’t have budget. Ideal? No. But with the right strategy, you can turn this into a deeper conversation.

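To make the distinction concrete, here’s a minimal sketch (in Python, with made-up reply data) of how reply rate and positive reply rate can diverge for the same batch of emails:

```python
from collections import Counter

# Hypothetical reply sentiments logged for one sequence variant.
replies = ["positive", "objection", "positive", "unsubscribe",
           "objection", "positive", "other"]
emails_sent = 50  # assumed number of delivered emails

counts = Counter(replies)
reply_rate = len(replies) / emails_sent
positive_reply_rate = counts["positive"] / emails_sent

print(f"reply rate:          {reply_rate:.0%}")           # 14%
print(f"positive reply rate: {positive_reply_rate:.0%}")  # 6%
```

Seven replies out of 50 emails looks like a healthy 14% reply rate, but only three of those replies are positive: a 6% positive reply rate. Two variants with identical reply rates can hide very different sentiment mixes.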
By prioritizing sequences based on positive reply rate instead of reply rate, you’ll book 14% more meetings.

What could your numbers look like with 14% more meetings on your reps’ calendars tomorrow?

Want to see how it works?

Read on.

Double down on what converts

We recently put sentiment to the test in a sequence targeting marketing leaders.

The sequence generated objections in 47% of replies — significantly higher than the 26% average across our group. We knew something was going wrong, so we dug into the details.

Scanning down the steps, one jumped out: the automated email on day four. More than half of all replies to that email were objections, and we had an A/B test running on it.

The two variants received roughly the same number of replies. If we were using response rate as our sole metric, we’d probably keep running the test, and we’d lose business.

With our more in-depth engagement metrics, we could see that the first template, where the rep personalized the email with COVID-19 context, generated nearly double the objections and half the positive responses of the second variant (a more standard follow-up email).

We didn’t need any more proof. We called the A/B test early, switched off the underperforming variant, and the sequence performance spiked immediately.

And this sequence wasn’t an outlier.

Our data scientists recently analyzed 6.5 million sequences and discovered that optimizing for positive replies has 33% higher correlation with booked meetings than response rate on its own.

How we run effective A/B tests

We’re starting to see a tide change in the sales industry. Gut feel and intuition are on the way out. Data-driven decisions are becoming more common.

Gartner predicts that three-fifths of sales orgs will have switched to adaptive data-driven selling by 2025. Those that don’t are going to get stuck in a downward spiral.

Harnessing data can be tricky. Outreach gives you sentiment data — but you still have to work out how to use the insights to improve your sales process.

Usually, that takes the form of an A/B test. You create two variants for a sequence step and see which performs better.

Here are the best practices we follow to run reliable experiments:

  • Start with a hypothesis: Blindly testing random variations is (almost certainly) a waste of time. Before they set up A/B tests, we encourage our sales managers to think about why a variant might work better. For example, in our marketing leader sequence A/B test, the manager thought working in more COVID-19 personalization would increase the urgency. While we ultimately proved that hypothesis false, it was well worth testing.
  • Test small: If we ran A/B tests on every single part of a sequence at once, it’d be impossible to know what changes were driving improvements. We focus on individual steps and test them in isolation. That way, we can tie any shift in our outcomes directly to the experiment.
  • Apples to apples: Different sequences have different purposes, so you can’t always compare sentiment breakdowns. For example, if you’re deep in contract negotiations, a high objection rate isn’t necessarily a bad thing. Your prospect is getting into the details and wants to be sure they’re making the right call. But in a cold outreach email, you’re just trying to start a conversation. If your prospects are immediately hitting back with objections, you might adjust your messaging to get proactive in addressing those concerns. Wherever possible, we compare emails, sequences, and snippets with the same purpose.
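If you want to put a number behind “which variant performs better,” one common approach is a two-proportion z-test on positive replies. This is an illustrative sketch with invented counts, not Outreach’s internal method:

```python
from math import sqrt

def positive_rate_diff(pos_a, n_a, pos_b, n_b):
    """Two-proportion z-score for positive-reply rates of variants A and B.

    pos_*: count of positive replies; n_*: emails delivered per variant.
    Hypothetical numbers only, not real test data.
    """
    p_a, p_b = pos_a / n_a, pos_b / n_b
    p_pool = (pos_a + pos_b) / (n_a + n_b)           # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return p_a - p_b, (p_a - p_b) / se

diff, z = positive_rate_diff(pos_a=12, n_a=200, pos_b=26, n_b=200)
print(f"rate difference: {diff:+.1%}, z = {z:.2f}")
```

With these made-up counts, variant B’s positive-reply rate is 7 points higher and |z| is above 1.96, so the gap would clear a conventional 95% significance bar. The same two variants could easily have near-identical raw reply rates, which is exactly why testing on sentiment rather than replies matters.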

Using sentiment in our A/B tests has led us to some unique conclusions. For example, we discovered our go-to messaging — aggressive value proposition messaging — fell flat in Europe. By testing new strategies we created a new sequence optimized for EMEA. We doubled down on rapport building and nixed our aggressive messaging entirely.

Sure, it felt uncomfortable to move away from the language that had driven so much success — but that’s what the data told us to do.

Finally: Good sequences need great reps

All of our sequences include some element of personalization. In the marketing leadership example I mentioned before, our reps researched how the prospect’s business was coping with COVID-19 and built out a personalized hook around that.

Personalization typically only covers a few lines in each email — but it can make or break a conversation.

Sentiment can help here, too.

We can look at any of our sequences and immediately see our baseline. Here’s the average performance in our marketing leadership sequence:

  • Positive: 47%
  • Objection: 26%
  • Referral: 6%
  • Unsubscribe: 4%
  • Other: 17%

Because we have that baseline, we can stay alert for underperforming reps.

If one of our sales managers notices a rep has a spike in their unsubscribe requests, they’ll dig into it. Perhaps the rep is coming in too hot and prospects are putting up their guard. It’s possible the rep hasn't taken the time to personalize each message. Or maybe they’ve been too soft and haven’t proven the value to buyers. Whatever the specifics, sentiment helps us improve our sequences and our reps.
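One simple way to operationalize that kind of monitoring, sketched here with invented rep counts and an arbitrary “double the baseline” threshold, is to compare each rep’s sentiment mix against the sequence baseline above:

```python
# Baseline sentiment mix for the sequence (from the breakdown above) vs. one
# rep's recent replies (made-up counts); flag categories that spike past
# double the baseline rate.
baseline = {"positive": 0.47, "objection": 0.26, "referral": 0.06,
            "unsubscribe": 0.04, "other": 0.17}

rep_replies = {"positive": 30, "objection": 22, "referral": 3,
               "unsubscribe": 9, "other": 11}
total = sum(rep_replies.values())

for category, expected in baseline.items():
    actual = rep_replies[category] / total
    if actual > expected * 2:  # simple "spike" rule; tune to taste
        print(f"{category}: {actual:.0%} vs baseline {expected:.0%} -> review")
```

Here the rep’s 12% unsubscribe rate trips the alert against the 4% baseline, which is the cue for a manager to dig into the underlying messages.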

But coaching is a story for another day.

A new yardstick for sales performance

The difference between a reply and a positive reply seems small — but it isn’t. As you’ve seen, optimizing for response rate can — and does — lead you astray. You start selecting underperforming sequences, missing out on deals, and leaving money on the table.

To take your sales performance to the next level, sentiment analysis is a must. It should form the cornerstone of your performance analysis and optimization.

We all know that a metric can’t drive progress on its own, but you have to start somewhere. You need some structure around your testing and experimentation. That’s where quality A/B testing comes in. Start with a hypothesis, test small, and compare apples to apples. Get your basics right and your performance will skyrocket.

Sentiment isn’t just about the nuts and bolts of sales strategy. It can help turn good sales reps into deal-closing machines.

---

Check back for our fourth installment of Outreach on Outreach, where our Head of Strategic Engagement, Scott Barker, lifts the lid on our referral selling workflow!

Don’t forget to follow us on LinkedIn for first dibs on new and exclusive Outreach on Outreach content.

