
Facebook’s humanless sifting of trending news suggests some AI needs supervision

As AI suits up for self-driving marketing platforms, the Facebook experience may outline some boundaries.

Barry Levine on September 9, 2016 at 3:25 pm
Rokas Tenys / Shutterstock.com

Artificial intelligence (AI) is powering marketing tools at various levels, with some products and many predictions emerging that forecast self-managed platforms.

But Facebook’s removal last month of human editors from its Trending Topics team, and the resulting errors in judgment by its AI-powered algorithm, raise the question of whether AI — at least for the foreseeable future — can be left alone without human supervision. Among other tasks, the editors had created short descriptions that accompanied the topics.

In particular, there may be some kinds of content and contexts — such as the combo of news and Facebook — where human sensibility is still required.

The human editing process had come under fire from conservatives following a story in Gizmodo alleging that conservative-oriented viewpoints were being neglected. Facebook conducted an internal review that found no evidence of bias, but decided anyway to remove the humans from the process. Human engineers remain in the unit, but to tweak the algorithm, not to oversee its choices.

Although Facebook contended that Trending Topics were always surfaced by an algorithm, the team of editors had been charged with making sure the results were what the company called “high quality and useful,” as well as linking the topics to relevant news stories. The algorithm is optimized for stories that demonstrate a large volume of mentions on Facebook, as well as for peaks of mentions in a short period of time.
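The volume-plus-velocity ranking described above can be sketched as a toy scoring function. This is only an illustration of the general approach — rewarding both total mention volume and a recent burst — and every name, weight, and threshold below is invented, not Facebook's actual algorithm:

```python
def trending_score(mention_times, now, window=1.0, horizon=24.0):
    """Toy trending score: reward both total volume of mentions over
    `horizon` hours and a burst of mentions in the last `window` hours.
    `mention_times` are timestamps in hours; all constants are invented."""
    total = sum(1 for t in mention_times if now - t <= horizon)
    recent = sum(1 for t in mention_times if now - t <= window)
    baseline = total / (horizon / window)   # expected mentions per window
    spike = recent / (baseline + 1)         # > 1 signals a burst
    return total + 10 * spike

# Same overall volume, but a burst in the last hour outranks a steady trickle.
steady = [h * 0.5 for h in range(48)]            # spread evenly over 24 hours
bursty = [23.5 + i * 0.01 for i in range(48)]    # all within the final hour
print(trending_score(bursty, 24.0) > trending_score(steady, 24.0))  # True
```

Note what such a score cannot see: nothing in it measures whether a story is true, tasteful, or important — only how fast people are talking about it, which is exactly the gap the human editors had been filling.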

Facebook Trending Topic changes

The human editors were removed on Friday, August 26, and issues with story choices emerged soon after. Over the weekend, two inappropriate stories surfaced in Trending Topics: one about a Saturday Night Live star’s obscenity-laced roast of right-wing pundit Ann Coulter, and another about a video of a man getting intimate with a McDonald’s sandwich.

And, on that Monday, there was a fake story: “Fox News Exposes Traitor Megyn Kelly, Kicks Her Out for Backing Hillary.” According to the Washington Post, the fake article remained for several hours as Facebook’s top Megyn Kelly story.

‘Not surrender completely’

The intention of Trending Topics, Facebook says, is to “surface the major conversations happening on Facebook,” including news events, but without the “suppression of political perspectives.”

In fact, Titus Capilnean, growth manager at AI-powered customer service provider DigitalGenius, told me that intention is one of at least two key human traits that AI software still needs from us carbon-based mortal units. Those two traits, he said, show why Facebook “should not surrender completely to an algorithm.”

Until we reach the point when computing systems become conscious, an intelligent system’s intention needs to be defined by humans. We still determine if a system is intended to increase sales, suggest a movie like the one we’ve just rented, or find news stories that are generating conversations.

The other human-specific trait, Capilnean said, is empathy. Algorithms have begun to detect emotions, but empathy involves one human being’s innate sense of another’s feelings.

Perhaps more comprehensive data and better training could have helped the Facebook algorithm recognize that the Megyn Kelly story was bogus. But empathy might be the determining factor in deciding whether, say, a story about a hundred people dying in a hurricane in the Philippines deserves a higher ranking in the news than Justin Bieber’s latest breakup with his girlfriend. Otherwise, Facebook is content to let click popularity do that.

Although Facebook co-founder and CEO Mark Zuckerberg has said his company is a technology platform and not a publisher, a study this year by the Pew Research Center found that an astounding 44 percent of U.S. adults get their news at least in part on Facebook. Whether the company wants to or not, it is a publisher.

So, assessing story importance and credibility is essential to any organization providing a filter for news. The question is whether the algorithm can develop that kind of sophisticated judgment.

Point of view

This kind of understanding, Capilnean said, is not going to show up in AI “anytime soon.”

“It needs guidance from a human editor,” he said. “AI shouldn’t be left alone.”

Although the removal of human editors was intended to eliminate any semblance of bias, the algorithm’s problem so far with news assessments is not that it is insufficiently objective. The problem, in part, is that it doesn’t seem to have a sufficiently realized point of view.

If you read the New York Times, you implicitly subscribe to the Times’ view of the world. You trust its perspective for making value judgments, not its neutrality.

That’s especially true for news, and, according to Chris Monberg, CTO of AI marketing firm Boomtrain, it applies to other kinds of business as well.

“All AI needs to be opinionated to meet business obligations,” he said. For example, the software might need to offer its “opinions” about who might make a good future customer and which marketing efforts would best reach them.

Facebook’s algorithm for news, like Boomtrain’s for marketing tech, needs an opinionated view of its task. It needs a way of determining validity and value, according to some standard.

Monberg told me you could create “an approximate taxonomy” of values that is comprehensive enough to know whether a story is appropriate or not.

Under construction

As an example, he said, you could introduce things like “shaming,” similar to a child’s sense of shame after being scolded for using a bad word. The same kind of training, he pointed out, could have prevented Microsoft’s chatbot Tay from being goaded by users into becoming a racist.

Tay would have had “opinions” that place values on permissible conversations, just as a news algorithm might determine whether a story about a sensitive subject has value, or whether it seems credible.
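In spirit, Monberg’s “approximate taxonomy” of values might look like a rule table mapping sensitive categories to the conditions under which content on that topic is permissible. The sketch below is purely illustrative — the categories, terms, and rules are invented for this example, not drawn from any real moderation system:

```python
# Toy "values taxonomy": each sensitive category carries a rule about
# when content touching that topic is permissible. All names invented.
TAXONOMY = {
    "slurs":    {"terms": {"slur_a", "slur_b"}, "ever_ok": False},
    "disaster": {"terms": {"hurricane", "earthquake"}, "ever_ok": True,
                 "requires": {"report", "relief", "news"}},  # news context only
}

def permitted(tokens):
    """Return False if the token set violates any category's rule."""
    words = set(tokens)
    for cat in TAXONOMY.values():
        if words & cat["terms"]:
            if not cat["ever_ok"]:
                return False  # category is never acceptable
            needed = cat.get("requires")
            if needed and not (words & needed):
                return False  # sensitive topic without required context
    return True

print(permitted(["slur_a", "joke"]))            # False: never acceptable
print(permitted(["hurricane", "relief", "news"]))  # True: news context present
print(permitted(["hurricane", "lol"]))          # False: no news context
```

Even this toy version shows why the choices are tricky: the rules encode someone’s judgment about which contexts redeem a sensitive topic, which is exactly the kind of opinionated standard Monberg argues the algorithm needs.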

But those are tricky choices, Monberg said, and AI is not yet able to match the integrity and judgment of a human news editor.

Although Facebook could introduce “more sophisticated framing models” than what it’s now using, he said, we’re still a long way from emulating a human editor.

AI may be ready for prime time when it comes to deciding how best to sell socks to the most receptive online customers, but its news judgment appears to be still under construction.


About The Author

Barry Levine
Barry Levine covers marketing technology for Third Door Media. Previously, he covered this space as a Senior Writer for VentureBeat, and he has written about these and other tech subjects for such publications as CMSWire and NewsFactor. He founded and led the web site/unit at PBS station Thirteen/WNET; worked as an online Senior Producer/writer for Viacom; created a successful interactive game, PLAY IT BY EAR: The First CD Game; founded and led an independent film showcase, CENTER SCREEN, based at Harvard and M.I.T.; and served over five years as a consultant to the M.I.T. Media Lab. You can find him at LinkedIn, and on Twitter at xBarryLevine.
