How Centerfield is sifting insights from millions of phone calls

Echo AI's technology is helping Centerfield turn a jumble of unstructured verbiage into actionable customer data.

“Echo AI came onto our radar screen because we were looking for AI-driven consumer insight platforms that can help marketing, sales and customer service teams optimize and become more efficient. What we’ve seen are the traditional mediums through which marketing campaigns are pushed out — emails, websites, social media. What we haven’t seen as much is how you take these personalized interactions that have emerged, particularly with chatbots, and use that data to optimize your marketing campaigns and messaging. That’s what Echo AI is able to do.”

That was a brief introduction to Echo AI offered by Camille Manson of innovation advisory firm Silicon Foundry back in May.

“Listening to customers is pretty critical for any business,” Alex Kvamme, CEO of Echo AI, told us. “The paradox of success is that if you’re good at listening to customers you get more customers — and it’s harder to listen to them. You have thousands or millions of customers telling you exactly what they want and what’s happening in the market; the problem has been that it’s very hard to get at those insights.”

Solutions have included manual review of customer conversations and, later, pre-deep-learning natural language processing models. “You could do sentiment and keyword tracking, but it was not very good, it was not configurable, it was surface level.” While it was possible to run search queries against transcripts, customers have many different ways of saying things. Also: “You can only search for what you know to search for.”

Improvements in deep learning technology and the development of generative AI have enabled a deeper dive. Audio files are easily transcribed, but: “You have this unstructured jumble of billions of words, being added to every day, and there was just no way to comprehensively get in there until the invention of LLMs that are unstructured word processing machines. It was the perfect problem for LLMs.” Echo AI, incidentally, is model-agnostic, drawing on a range of available models, including OpenAI’s.
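
To make that concrete, here is a minimal, purely illustrative sketch of how an LLM can turn a raw call transcript into structured insight. The prompt, the output fields and the model choice are assumptions made for the example, not Echo AI’s actual pipeline; the only real dependency is the OpenAI Python SDK and an API key.

```python
# Illustrative sketch only: extract a few structured fields from one call transcript.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def extract_insights(transcript: str) -> dict:
    """Ask the model for a handful of structured fields describing one call."""
    prompt = (
        "Read this customer call transcript and return JSON with the keys "
        "'summary', 'sentiment', 'purchase_intent' (true/false), "
        "'churn_risk' (true/false) and 'topics' (list of strings).\n\n"
        f"Transcript:\n{transcript}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder choice; Echo AI is model-agnostic
        messages=[{"role": "user", "content": prompt}],
        response_format={"type": "json_object"},  # request machine-readable output
    )
    return json.loads(response.choices[0].message.content)

if __name__ == "__main__":
    sample = (
        "Agent: Thanks for calling. How can I help?\n"
        "Customer: I want to switch my internet plan. Do you also do home security?"
    )
    print(extract_insights(sample))
```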

Customer acquisition company Centerfield had not so much a jumble as a mountain of unstructured words from calls to its call centers. Echo AI was able to shine a light on those conversations.

Supercharging customer acquisition

“In a nutshell, we supercharge customer acquisition for our clients,” explained Aniketh Parmar, EVP technology at Centerfield. “We do that using a wide variety of mechanisms especially around our technology platform called Dugout.” In other words, Centerfield acts as an extension of the marketing teams, typically at Fortune 500 companies. Also: “We don’t get paid until we acquire a customer,” said Parmar, which indicates some confidence in the platform’s capabilities.

How does Centerfield’s acquisition strategy work? First, there is a portfolio of more than a dozen digital brands like business.com and security.org that redirect traffic to Centerfield’s clients. The brands attract something like 170 million users throughout the year. Centerfield also drives significant traffic from search keywords to landing pages through partnerships with Google and Bing.

The majority of connections, however, come through Centerfield’s globally distributed call centers. “Using our technology platform we can track everything end-to-end, so we know what ad you clicked on or what you saw on the landing page; and when you make a phone call we know that you’re coming from so-and-so location. We use that information to find the best available agent to convert you as a customer.” The agent will have all the information available to them when taking the call.

After making the first sale, Centerfield hands off the customer to its client. This goes a step beyond lead generation: the customer Centerfield hands off is more than a lead, it’s someone who has already converted.

Mining millions of phone calls

In the call center space, Centerfield has faced a number of challenges including improving the performance of agents. “It’s difficult to identify the best-performing agents if you try to do it manually,” said Parmar, “so we wanted to use some kind of technology. Prior to Echo, we weren’t doing this at scale; we were only doing this on certain clients and certain subsets of calls. We could not mine every single call. Today we’re mining pretty much every single call.”

The relationship with Echo AI began with a proof of concept exercise on a small number of calls and then scaled up. What kind of scale are we talking about? “Six or seven million calls [by the end of the year]. That’s the kind of volume we’re talking about here.”

It’s one thing to have millions of calls mined, another thing to ingest the results. “It’s a two-step journey,” Parmar explained. “First, they mine the calls; then you can put questions to the platform, like ‘Find the traits of agents that are really good at converting customers.’ The platform can identify themes and then report the information back to us. We can then ingest the information in our platform, Dugout, that has all the performance data.”
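
As a rough illustration of that two-step flow, the sketch below poses a question to a conversation-intelligence API and pushes the returned themes into an internal performance platform. The endpoints, payloads and field names are invented placeholders; neither Echo AI’s nor Dugout’s real interfaces are documented here.

```python
# Hypothetical sketch of the "ask a question, then ingest the answer" step.
import requests

# Placeholder endpoints: not Echo AI's or Dugout's real APIs.
INSIGHTS_API = "https://insights.example.com/v1/questions"
DUGOUT_API = "https://dugout.example.com/v1/agent-traits"

def fetch_top_converter_traits(api_key: str) -> list:
    """Put a natural-language question to the mined calls and get themed results back."""
    resp = requests.post(
        INSIGHTS_API,
        headers={"Authorization": f"Bearer {api_key}"},
        json={"question": "Find the traits of agents that are really good at converting customers."},
        timeout=30,
    )
    resp.raise_for_status()
    # Hypothetical response shape: [{"agent_id": "A17", "trait": "asks open questions"}, ...]
    return resp.json()["themes"]

def ingest_into_dugout(themes: list, api_key: str) -> None:
    """Attach the mined traits to the agent records that hold performance data."""
    for theme in themes:
        requests.post(
            DUGOUT_API,
            headers={"Authorization": f"Bearer {api_key}"},
            json=theme,
            timeout=30,
        ).raise_for_status()
```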

This is Centerfield’s first year using Echo AI. The POC exercise showed an improvement in conversions, though Parmar did not yet have exact figures for the improvement seen since Echo AI was scaled across the company.

The micro and the macro

We asked Kvamme to go into more detail about how insights from millions of phone calls can be reported in digestible form. Kvamme distinguishes between micro-insights and macro-insights. Specific conversations can reveal micro-insights such as purchase intent (including for products or services that are not the main subject of the call) and intent to churn. Echo AI can detect and flag these insights and push them via APIs into the organization’s other systems (the CRM, for example), where they can surface and be acted on.
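
A minimal sketch of what acting on such a flag might look like, assuming insights shaped like the extract_insights() output above: a churn-risk or purchase-intent flag becomes a follow-up record in a CRM. The CRM endpoint and record fields here are hypothetical.

```python
# Hypothetical routing of a per-call micro-insight into a downstream CRM.
import requests

CRM_TASKS_URL = "https://crm.example.com/api/tasks"  # placeholder endpoint

def route_micro_insight(call_id: str, insight: dict, api_key: str) -> None:
    """Turn per-call flags into follow-up records in a downstream system."""
    headers = {"Authorization": f"Bearer {api_key}"}
    if insight.get("churn_risk"):
        requests.post(CRM_TASKS_URL, headers=headers, timeout=30, json={
            "type": "retention_followup",  # hypothetical record type
            "call_id": call_id,
            "note": insight.get("summary", ""),
        }).raise_for_status()
    if insight.get("purchase_intent"):
        requests.post(CRM_TASKS_URL, headers=headers, timeout=30, json={
            "type": "cross_sell_lead",  # e.g. interest in a product outside the call's main subject
            "call_id": call_id,
            "topics": insight.get("topics", []),
        }).raise_for_status()
```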

Macro-insights identify trends, opportunities and risks across the business. “We’re surfacing that in our application where you can view those trends over time; and you can take the high-level trend representing millions of conversations and dig, dig down into the singular conversations — the atoms that make the molecule.”

Does Echo AI’s technology serve channels other than this call center example? “Our vision is that every word that comes out of your customer should feed back to the business,” said Kvamme. “Conversations are the richest medium but you also have reviews, NPS surveys, social media — there are many ways your customers can communicate to you or about you. We are working on analyzing some of those channels.”

About the author

Kim Davis
Staff
Kim Davis is currently editor at large at MarTech. Born in London, but a New Yorker for almost three decades, Kim started covering enterprise software ten years ago. His experience encompasses SaaS for the enterprise, digital-ad data-driven urban planning, and applications of SaaS, digital technology, and data in the marketing space. He first wrote about marketing technology as editor of Haymarket’s The Hub, a dedicated marketing tech website, which subsequently became a channel on the established direct marketing brand DMN. Kim joined DMN proper in 2016, as a senior editor, becoming Executive Editor, then Editor-in-Chief, a position he held until January 2020. Shortly thereafter he joined Third Door Media as Editorial Director at MarTech.

Kim was Associate Editor at a New York Times hyper-local news site, The Local: East Village, and has previously worked as an editor of an academic publication, and as a music journalist. He has written hundreds of New York restaurant reviews for a personal blog, and has been an occasional guest contributor to Eater.
