Data privacy: Picking the lock on Pandora’s box

Columnist Lewis Gersh says the development of smart technology, which can surreptitiously harvest consumer data in everything from toys to televisions, raises concerns that the ad industry can’t afford to ignore.

“This complaint concerns toys that spy.”

That’s the stark opening sentence of a Federal Trade Commission complaint that a collection of consumer watchdog groups has filed against Genesis Toys and speech-recognition technology provider Nuance Communications. The complaint alleges that Genesis and Nuance “unfairly and deceptively collect, use and disclose audio files of children’s voices without providing adequate notice or obtaining verified parental consent …”

The toys in question, including My Friend Cayla and i-Que Intelligent Robot, “converse” with kids using speech-recognition technology similar to what powers voice-activated assistants such as Siri. These conversations are recorded and converted to text. The text is then scanned for keywords that trigger, among other things, insidious product endorsements. Cayla tells kids her favorite movie is Disney’s “The Little Mermaid,” for instance.
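
The column doesn’t spell out the matching logic, but a minimal sketch of what keyword-triggered endorsements could look like, once the speech has already been transcribed, follows below. The keyword table and the respond_to_transcript function are hypothetical illustrations, not anything taken from the complaint; only the “Little Mermaid” line comes from the column itself.

```python
import re

# Hypothetical keyword -> canned response table; the entries are invented
# for illustration (the column only mentions Cayla's "Little Mermaid" line).
ENDORSEMENTS = {
    "movie": "My favorite movie is Disney's The Little Mermaid!",
}

def respond_to_transcript(transcript: str) -> list[str]:
    """Scan transcribed speech for keywords and return any matching canned replies."""
    words = set(re.findall(r"[a-z']+", transcript.lower()))
    return [reply for keyword, reply in ENDORSEMENTS.items() if keyword in words]

if __name__ == "__main__":
    # In the pipeline described above, the transcript would come from a
    # third-party speech-to-text service, not from the toy itself.
    print(respond_to_transcript("What is your favorite movie?"))
```

The privacy problem, of course, is not the matching itself but that a child’s recorded speech has to leave the home and pass through third-party servers for any of this to work.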

The complaint paints a chilling picture:

“By purpose and design, these toys record and collect the private conversations of young children without any limitations on collection, use or disclosure of this personal information. The toys subject young children to ongoing surveillance and are deployed in homes across the United States without any meaningful data protection standards. They pose an imminent and immediate threat to the safety and security of children in the United States.”

Who saw that coming? (I did.)

In my recent column offering predictions for what 2017 would bring in marketing technology, I wrote that “the odds-on favorite for a scandal sensational enough to splash down in the mainstream will be an egregious violation of user privacy.”

December was a busy month for the FTC in the privacy sphere. Along with receiving the Genesis/Nuance complaint, the commission held a workshop to address growing concerns about the surveillance and data-collection capabilities of smart TVs.

Televisions and toys represent a new frontier in harvesting consumer data. And those who try to exploit this data expose the entire ad industry to a severe consumer backlash. As Jessica Rich, director of the FTC’s Bureau of Consumer Protection, put it at the workshop, “Consumers expect some level of data collection when they use their computers … [but] many consumers have a fundamentally different relationship with their TVs than their computers.”

That’s why consumers will find the idea that their smart TVs are listening to their conversations and parsing keywords for retargeting so outrageous.

How’s my tinfoil hat look now?

When I first warned about things like speech-recognition technology six or seven years ago, 90 percent of people looked at me like I was crazy. But it’s not like this was some deranged, futuristic vision that I pulled out of the ether. I had built the largest portfolio of retargeters, and I knew what my CEOs were asked to do.

Funny thing is, now about 50 percent of the people who were pooh-poohing the idea of covert consumer surveillance six years ago are saying, “Oh, yeah, of course that’s happening.”

And that’s not all that’s happening. Ambient-listening-enabled smartphones, tablets and TVs are everywhere, including business meetings, living rooms — even the bedroom. It really just has to stop.

Really, none of this should come as a surprise. A certain segment of the industry finds a way to abuse every new development in marketing technology.

When some CMOs see diminishing returns from the new technology, they double down on efficiency at the expense of efficacy. They overuse the technology to the point of abusing the consumer, who then rebels. We saw it with the devolution of direct mail into junk mail, of email into spam and of display ads into stalking. A consumer shouldn’t need a restraining order just because they looked at a pair of socks six weeks ago.

What’s different this time is the stakes. The advent of ad exchanges changed everything. Now a bad actor who uploads ill-gotten data or hands it to a third party flings open a global Pandora’s box. Once that data is out, it becomes ubiquitous worldwide, and you can never get it back. That creates an entirely different dynamic.

The bottom line, unfortunately, is that reputable brands are going to have to spend more time, money and effort to ensure that they stay reputable. That means not only safeguarding consumers’ data internally but also being extremely rigorous about where their data goes.

A single misstep can have devastating consequences. Look at the cautionary tale going back to ValueClick, which ended up paying a $2.9 million settlement after the FTC accused it of distributing deceptive emails and failing to adequately protect its users’ data. In addition to the financial penalty, the company had to rebrand in order to rebuild its reputation.

And it seems that hardly a month goes by without news of another major retailer suffering a security breach involving its customers’ credit card data.

Even well-meaning brands can blunder into privacy catastrophes through the unintended consequences of parsing consumer data. Remember when Target inadvertently revealed a teenager’s pregnancy to her father?

Guard your data like your money

The only way to be sure that you’re protecting your business, your brand and your relationship with your consumers is through ceaseless diligence. Keep track of where your data is going at all times. That requires rigorous self-policing of access to your data. You also need to identify where you’re getting the data you use for your own retargeting.

That’s where it gets really hard. If you’re buying over exchanges, how do you know where that data came from? A good first step is to vet individual partners and use only those as your data source. But even then, how do you know where that partner is getting their data? They could be relying on a whole basket of providers from across the ethical spectrum.



At the very least, all brands should have contracts stating that they will never knowingly accept any data from a partner that harvests data through unethical practices such as ambient listening. And any company that deliberately abuses consumer privacy should be not only condemned by the entire industry but also boycotted.


Opinions expressed in this article are those of the guest author and not necessarily MarTech.


About the author

Lewis Gersh
Contributor
Lewis Gersh is founder and CEO of PebblePost, guiding corporate strategy and company vision with over 20 years of board and executive management experience. Prior to PebblePost, Lewis founded Metamorphic Ventures, one of the first seed-stage funds, and built one of the largest portfolios of companies specializing in data-driven marketing and payments/transaction processing. Portfolio companies include leading innovators such as FetchBack, Chango, Tapad, Sailthru, Movable Ink, Mass Relevance, iSocket, Nearbuy Systems, Thinknear, IndustryBrains, Madison Logic, Bombora, Tranvia, Transactis and more. Lewis received a B.A. from San Diego State University and a J.D. and a Master’s in Intellectual Property from UNH School of Law. Lewis is an accomplished endurance athlete, having competed in many Ironman triathlons, ultra-marathons and parenting.
