YouTube removed more than 8M videos for violent & extremist content during Q4 2017
YouTube says it's making strides in protecting viewers from abusive content, but brand safety is still not 100%.
In its first-ever community guidelines enforcement report, YouTube says it removed 8,284,039 videos for violent or extremist content between October and December of 2017. Of the more than 8 million videos taken down, 6.7 million were first flagged by machines rather than humans, and, according to YouTube, 76 percent of those machine-flagged videos were removed before they received a single view.
“Machines are allowing us to flag content for review at scale, helping us remove millions of violative videos before they are ever viewed,” writes the YouTube Team on the site’s official blog. “And our investment in machine learning to help speed up removals is paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam).”
YouTube also reported that more than 9 million videos were flagged by human reviewers for a range of suspected violations, from spam and misleading content to videos containing hateful or abusive material or promoting terrorism.
According to the report, flagged content remains on the site as long as it doesn’t violate Community Guidelines. In addition to offering up stats around flagged videos, YouTube introduced a “Reporting History Dashboard” where users can see the status of any videos they may have flagged for inappropriate content.
In 2017, Google said it was committed to hiring 10,000 employees by the end of 2018 to help identify violent content being uploaded to YouTube. At this point, the company claims a majority of the additional roles needed to “reach our contribution to meeting that goal” have been filled, but it did not give a specific number of hires. It did say that it had recruited “full-time specialists with expertise in violent extremism, counter-terrorism, and human rights.”
But YouTube’s brand safety issues are still a major concern. On April 20, CNN released an investigative report that showed a number of YouTube ads from well-known brands — including Nissan, Under Armour, Amazon, Hershey, Netflix and Hilton — ran alongside extremist content.
Both Nissan and Under Armour said they were pausing their YouTube campaigns after being made aware that their ads had appeared within extremist video content, and Hilton said it was removing ads from the site.
A YouTube spokesperson sent the following comment to Marketing Land in response to CNN’s report:
We have partnered with our advertisers to make significant changes to how we approach monetisation on YouTube with stricter policies, better controls and greater transparency. When we find that ads mistakenly ran against content that doesn’t comply with our policies, we immediately remove those ads. We know that even when videos meet our advertiser friendly guidelines, not all videos will be appropriate for all brands. But we are committed to working with our advertisers and getting this right.
The quarterly report released by YouTube today, the first of what it says will be a regular update, aims to give more transparency to how much content the site is reviewing and removing. And while more than 8 million videos being removed is a sizeable number, it doesn’t necessarily offer a lot of insight given that we don’t know how many videos in total were uploaded to YouTube during that same time period.
Last year, YouTube was plagued by a number of brand safety issues, resulting in multiple brands boycotting the site. Since then, the company has dedicated much of its effort to winning back advertisers’ confidence, but the latest report from CNN shows the investments YouTube has made to police violent and extremist content are still not enough.
Opinions expressed in this article are those of the guest author and not necessarily MarTech.