"Just checking in" - does it actually work?

Posted June 4, 2019

By Pavel Dmitriev

VP of Data Science at Outreach

You’ve probably heard the saying, “trust your gut,” and whether or not they’re willing to acknowledge it, this is how many sales leaders make decisions.

But this approach is antiquated and dangerous. Sales remains one of the few disciplines that often relies on intuition, word-of-mouth strategies, guesstimates, and even superstition to close deals.

I say: trust the data. Intuition is helpful, but we should treat our ideas as hypotheses, not as absolute truth, and then use data to test whether these hypotheses are true.

Successful sellers use workflows, playbooks, and tests to close deals, which is why we rolled out Amplify Guided A/B Testing. Using machine learning, Guided A/B Testing helps sales managers set up accurate A/B tests that avoid the most common pitfalls (running a test too long, ending it too early, or making changes mid-test), so the results are trustworthy and actionable, helping you improve your sales team’s performance and drive more pipeline.

In other words, Guided A/B Testing makes advanced statistical analysis understandable to those of us who don’t have a PhD in computer science (not that there’s anything wrong with that). Now sellers can use A/B testing to experiment, learn what works, and make decisions with data. After all, guts can be wrong, but data never lies.

A/B Testing “Just Checking in”

At Outreach, we not only provide guided A/B testing capabilities to our customers, we also use A/B testing ourselves. One of the more interesting tests we ran centered on the phrase “just checking in.” We’ve all said it when talking to prospects (“Hey Will, just checking in”) or even used “Just Checking In” as an email subject line.

It’s a polarizing phrase. Some sellers say it’s harmless while others believe it’s detrimental to your deals.

In fact, once upon a time, many Outreach sales managers said the same thing: don’t do it! It turned out, however, that there was one, just one, email template at Outreach that opened with “just checking in.” Everyone was shocked when that template had a 13% reply rate, an unexpectedly high number!

We decided to create a scientifically valid A/B test to determine whether “just checking in” actually increased, decreased, or resulted in no impact to email reply rates.

The Method

First, we determined the scenario we wanted to test: bumper emails, which are follow-ups sent on the same thread, with the same subject line, as the initial cold email. We deliberately scoped the test to this one scenario because “just checking in” might help in some situations and hurt in others.

We then added the phrase “just checking in” at the start of the email body, right after the greeting (“Hi {{Name}},”). The control email was identical except that it omitted the “just checking in” copy.

We ran the test for 4 weeks, accumulating more than 2,000 emails in each group.
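The post doesn’t share the power calculation behind that sample size, but a standard two-proportion power analysis gives a feel for why roughly 2,000 emails per group is a reasonable target. Below is a minimal sketch in Python; the baseline reply rate (4%) and the lift worth detecting (to 6%) are illustrative assumptions, not numbers from the actual test.

```python
from statistics import NormalDist

def sample_size_per_group(p1: float, p2: float,
                          alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate per-group sample size for a two-proportion z-test:
    n = (z_{1-alpha/2} + z_{power})^2 * (p1*q1 + p2*q2) / (p2 - p1)^2
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_power = NormalDist().inv_cdf(power)          # quantile for desired power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_alpha + z_power) ** 2 * variance / (p2 - p1) ** 2
    return int(n) + 1  # round up

# Illustrative only: detect a jump from a 4% to a 6% reply rate.
print(sample_size_per_group(0.04, 0.06))  # ~1,861 emails per group
```

Under those assumed rates, the answer lands in the same ballpark as the 2,000-plus emails per group the test accumulated.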

As we waited for the findings, Outreach sales leaders shared their perspectives:

“I don't think that emails that start with ‘just checking in’ will have any effect on reply rates, positive reply rates, unsubscribes, or objections.”
MARK KOSOGLOW, VP OF SALES

“I think when in a deal cycle, a ‘just checking in’ email matters not. In a cold prospecting email, I think it might. I would be the test subject prospect who would be annoyed by that if I don't know you.”
STEVE ROSS, SENIOR DIRECTOR OF SALES DEVELOPMENT & INSIDE SALES

Were our own sales leaders correct?

The Findings

We found that the template with “just checking in” received an 86% higher reply rate, a statistically significant result.

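The post doesn’t publish the underlying counts, so here is a minimal sketch of how a result like this can be checked with a two-proportion z-test. The reply counts below are hypothetical, chosen only so that the variant’s rate is roughly 86% higher than the control’s at about 2,000 emails per group.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_ztest(replies_a: int, n_a: int,
                         replies_b: int, n_b: int) -> tuple[float, float]:
    """Two-sided z-test for a difference between two reply rates."""
    p_a, p_b = replies_a / n_a, replies_b / n_b
    p_pool = (replies_a + replies_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical counts: control 80/2,000 (4.0%) vs. variant 149/2,000 (~7.45%),
# roughly an 86% relative lift.
z, p = two_proportion_ztest(80, 2000, 149, 2000)
print(f"z = {z:.2f}, p = {p:.2g}")  # z ≈ 4.70, p far below 0.05
```

Counts like these would clear the conventional 0.05 significance threshold by a wide margin.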

I wasn’t exaggerating when I said the results are eye-opening. Even our own sales leaders were surprised! As a result of this test, we now use “just checking in” in a number of Outreach sequences.

Here’s a table that shows our findings:

[Table: reply rates for the control and “just checking in” variants]

Summary

Will “just checking in” work for your sales team? It depends. The point of this experiment isn’t to arrive at a universal truth or silver bullet—if you read anything that presents findings that way, run!

The point is to show how Outreach uses data science to combat the “gut feeling” approach many teams take, and to uncover sales best practices for our own team using Outreach’s built-in features.

Want to get started on your own A/B tests? Check out our post for our criteria and tips on how to create a solid A/B test.

What other sales myths should we test? Leave a comment to let us know what you’d like to see.

