Sales Best Practices

Word Play: Leveraging Machine Learning to Understand Email Sentiment


Sunny Bjerk

Product Storyteller

Understanding human communication is difficult. Languages have nuances, slang, and regional dialects, and while humans can learn these subtleties, they are surprisingly hard for machines to grasp. The science is not all the way there yet, but Outreach is making progress.

Our Data Science team continues to push the boundaries of what’s possible in the field of Natural Language Processing (NLP). Last week, Yong Liu, our Principal Data Scientist, was at the High Performance Machine Learning (HPML) Workshop in Cyprus to share our latest achievements in understanding the sentiment of sales emails.

The workshop brings together researchers and practitioners from universities, supercomputing centers, national labs, and industry across the globe to share the latest techniques, implementations, benchmarks, and applications for Artificial Intelligence (AI) and Machine Learning. This year, our Data Science team's paper, "An Evaluation of Transfer Learning for Classifying Sales Engagement Emails at Large Scale," co-authored by Yong, Pavel Dmitriev, Yifei Huang, Andrew Brooks, and Li Dong, was the only one accepted from industry.

I sat down with Yong to get his thoughts on Machine Learning and NLP, and the ways we’re using science to enhance the art of sales.

Q: Why is understanding email intent or sentiment important?

Yong: One key goal of a sales leader is to continuously improve the sales process, which requires visibility into every step to assess its effectiveness. For example, email is the dominant form of sales communication in most companies, but how would they know if their emails are working?

The traditional answer is looking at reply rates. But, as we previously discovered, reply rates are often misleading. Most replies are objections and unsubscribe requests, so looking at the overall reply count doesn’t tell you much about how good the email template really is.

This is where automated email understanding comes in. If we could understand the intent or sentiment behind prospects’ replies, we could help sales leaders understand the effectiveness of the sales process much better and enable them to take actions to improve it.

It turns out, however, that doing this is not easy.

Q: Why is understanding email sentiment so challenging?

Yong: (Laughing) There are several reasons!

  • Outreach is used by SDRs, AEs, CSMs, and other roles, and each role supports the customer at a different stage of their journey. How they talk to each customer varies as the deal progresses, and each person also has an individual speaking style and use of language, such as slang or colloquialisms.
  • Outreach is used by customers of all sizes, across industries ranging from tech and financial services to sports. Each industry has specific business needs and use cases, which means it also has specific jargon and language, and that specificity is difficult for a computer to navigate and understand.
  • While it’s common to outsource email annotation to third parties like Amazon Mechanical Turk, this isn’t possible in sales. At Outreach we don’t share our customers’ data, which makes it much more expensive to annotate emails for training machine learning models, and limits the amount of annotation we can do for each customer.

Q: How are we solving this problem at Outreach?

Yong: At Outreach, we are exploring a technology called Transfer Learning, which is considered one of the next driving forces and frontiers for AI and Machine Learning. The idea is to take a model trained on public datasets such as Wikipedia, and then use the knowledge it has learned to quickly learn to classify email sentiment in sales communications. Transfer learning is really meant to mimic how humans gain knowledge cumulatively. For example, we learn the alphabet so that we can learn to read and write, and we then use those literacy skills to write blogs on the internet. By retaining the knowledge we've acquired before, we are capable of greater intellectual feats.
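The idea Yong describes can be sketched in miniature. This is a toy illustration, not Outreach's actual system: the `pretrained_embed` function below is a self-contained stand-in for a real pretrained language model (which would be trained on public text like Wikipedia), and the emails and labels are invented. The transfer-learning part is the pattern itself: the encoder stays frozen, and only a small classifier head is trained on the handful of annotated emails.

```python
import numpy as np

# Toy stand-in for a pretrained sentence encoder. A real system would
# use a language model trained on public corpora; here we just hash
# words into a fixed-size vector so the example runs anywhere.
def pretrained_embed(text, dim=16):
    vec = np.zeros(dim)
    words = text.lower().split()
    for word in words:
        vec[hash(word) % dim] += 1.0
    return vec / max(len(words), 1)

# A tiny, invented set of annotated sales-email replies.
emails = [
    ("Thanks, this looks great, let's schedule a call", 1),  # positive
    ("Please remove me from your list", 0),                  # objection
    ("Sounds great, send over the details", 1),
    ("Not interested, stop emailing me", 0),
]

# Transfer step: the encoder is frozen; we fit only a lightweight
# logistic-regression head on the few labeled examples we have.
X = np.stack([pretrained_embed(text) for text, _ in emails])
y = np.array([label for _, label in emails])

w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):                      # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    grad = p - y
    w -= 0.5 * (X.T @ grad) / len(y)
    b -= 0.5 * grad.mean()

def classify(text):
    p = 1.0 / (1.0 + np.exp(-(pretrained_embed(text) @ w + b)))
    return "positive" if p > 0.5 else "objection"
```

Because the heavy lifting of representation learning is done once on public data, the per-customer step reduces to fitting the small head, which is exactly what makes the approach workable when annotation budgets are limited.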

Q: What did the HPML crowd think of the paper and presentation?

Yong: It was well-received! A lot of interesting discussion arose around the number of training emails required and the growing number of email sentiment categories. It was such a great brainstorming session, and it encourages us to explore this direction further.

Stay tuned for more updates on our Data Science team's adventures. Next up: a visit to Montreal to present three key checklists for running A/B tests correctly at the International Conference on Software Engineering, followed by a stop at Snowflake Summit in San Francisco to share the mechanics of our cost-effective and trustworthy A/B testing platform!