The story of data, Part 2: Where are we now?
Hyperlocal, across all channels, everywhere you are: We’re currently knee-deep in the era of personalization.
Now that we’ve learned a little history about how we got to our present state — in Part 1 of our story of data — let’s assess where exactly we are. And that’s in an era of personalization that started with Amazon recommendations and has blossomed into elaborate journey mapping that can predict where, when and how you might interact with a message or a prompt.
Big Data has given way to Deep Data
Once companies realized that the data that they collected for their own entities could provide value to other companies, we saw a rise in the commodification of data. We’ve moved beyond Big Data as a concept and into an age of “Deep Data.” (I naively thought that I came up with that term until Google set me straight.)
Abhi Yadav, founder and CEO of customer data platform ZyloTech, puts it simply:
“We were drunk on Big Data,” Yadav said. “Enterprises were tracking, quantifying and tagging everything with the implicit assumption that one day we’d be able to use all that data. It hasn’t panned out that way.”
Solutions aplenty
Technology companies pop up every day with new solutions, tools and platforms, all designed to make the most of the available data. We’ve seen identity graphs, predictive modeling and any number of specialized algorithms that are meant to deliver a perfect picture of a perfect customer to marketers and advertisers.
Customer data platforms (CDPs) have been proliferating for the past few years — unified databases that draw data from a variety of integrated sources. Their holy grail is to present marketers with a perfect customer journey (the fabled 360-degree customer view), one that uses artificial intelligence and machine learning to predict where, when and how a certain user acts.
Scott Anderson, chief marketing officer at content management system Sitecore, says that personalized customer experiences drive ROI.
“According to recent research, up to 86 percent of customers see personalization as a critical factor when making a purchase, yet a third of brands admit they lack the skills and tools needed to properly provide the experience customers desire,” Anderson said.
“These pressures and demands will only continue to grow stronger. Research also finds that consumers see value in providing personal information so that marketers and brands can use it to interact with them when and where they want. By establishing a closer relationship via all the digital channels at our customers’ fingertips, changing our mindset and processes, and implementing the right technology and tools to build those relationships, brands will quickly find themselves in a better position to provide exceptional, personalized customer experiences,” Anderson said.
We’re inundated with costly bad data
In December 2017, the Winterberry Group released The State of Data, showing that companies have steadily increased their investment in preparing data for analysis — a process that includes collection, cleansing, organizing, analyzing and then, and only then, doing something with it.
From the report:
“Reflecting the inherent complexity associated with collecting, managing, cleansing and deploying a dynamic set of data assets, users will invest a majority of their data budgets — some $10.1BB — on services and technology offered by third‐party providers to support the activation of audience data (rather than on data itself).”
According to Gartner’s Magic Quadrant for Data Quality Tools report, bad data cost organizations $15M annually on average in 2017, up from $9.6M in 2016.
The need for “good” data, essentially usable data, has inspired a hiring boom of data scientists. According to a recent study by Salesforce and Deloitte, brand leaders will employ nearly 50 percent more data scientists than they do now.
Chaitanya Chandrasekar, co-founder and chief executive officer at data platform QuanticMind, says that data scientists are a must for marketers.
“By making use of emerging techniques like data science, which extrapolates actionable insights from large data pools, marketers can finally get a handle on their data and put it to work for them,” Chandrasekar said. “Retailers that use data science see +26 percent ROI, lead generation businesses that use data science see +31 percent profit margins and financial services advertisers that use data science see +30 percent revenue, on average.”
Chandrasekar said that companies would benefit by focusing their data.
“In 2018, the marketing technology industry must be laser-focused on finally doing something about this proliferation of tons of disparate marketing data from different sources (online, mobile, CRM, ecommerce and social, among others) and multiple devices (including desktop and mobile) … all things that are segregated, siloed and don’t talk to each other by default. It’s only when marketers combine the collective insights from all their data, from all their relevant sources, that they can paint a holistic picture and fully understand exactly how their sales funnel is working and isn’t working,” Chandrasekar said.
And with that push toward good data, we’ve also seen a shift in what types of data sets are the most desirable. Instead of collecting as much data as possible, the goal has become to compile as much data as possible per person or persona.
Hyperpersonalization is the name of the game
Karl van den Bergh, chief marketing officer at translytical data platform DataStax, explains.
“Companies have until very recently looked at data over long stretches of time, focusing on ‘big data’ versus ‘real-time data’ that is actionable in the moment and absolutely essential for the success of the right-now enterprise,” van den Bergh said. “Smart marketers today are implementing new solutions for real-time data and homing in on what really matters for their customers, aggregating ‘customer 360’ views of that data, and refining what drives engagement and continued sales with an emphasis on right-now moments.”
Technological advances such as artificial intelligence and machine learning are driving new innovations, creating even deeper, more personalized experiences.
Consumers vs. privacy
Conflicted consumers are finding it difficult to balance the benefits they get from surrendering personal details against the risk involved in providing them. A recent survey from identity and access management company ForgeRock says that consumers believe they’re safeguarding their personal information even as they share more of it than ever. The report also revealed that customers don’t realize how much they’re sharing, and more than half (53 percent) worry about what they have shared.
Even more unsettling, almost half the respondents (48 percent) aren’t sure who is liable if their information is hacked.
With great data comes great responsibility… and lack thereof
With so much data, mistakes will be made. And oh boy, have there been mistakes. Last year, there was a record number of data breaches, hacks and all-out confusion around data and how it’s handled.
And right now we’re seeing data behemoths like Facebook and Google facing stronger scrutiny about how they collect, handle, use and share data. We are rapt — watching as history is being made.
Tech companies are paying closer attention. So are brands and advertisers. There’s been a proliferation of data safety solutions. But we have no idea how it will all end up (even so, we’ll make some guesses in Part 5).
Here are links to the full Story of Data series:
Part 1: How did we get here?
Part 2: Where are we now?
Part 3: Who owns it?
Part 4: Will it ever be truly secure?
Part 5: Where are we going?
Contributing authors are invited to create content for MarTech and are chosen for their expertise and contribution to the martech community. Our contributors work under the oversight of the editorial staff and contributions are checked for quality and relevance to our readers. The opinions they express are their own.