Facebook now letting users appeal posts that were removed for violating content standards
For the first time ever, Facebook has published the guidelines its internal teams use to decide what content is allowed on the site.
For the first time, Facebook has released the guidelines its internal content review teams use to decide whether to remove a post for breaking community standards. Not only is the site offering a window into how it decides what stays up and what gets taken down; it is also opening an appeals process for users to request a second opinion if they believe one of their posts has been removed by mistake.
“Our policies are only as good as the strength and accuracy of our enforcement — and our enforcement isn’t perfect,” writes Facebook’s VP of global policy management, Monika Bickert, in a blog post announcing the company’s internal enforcement guidelines.
For now, the appeals process applies only to posts removed for nudity, sexual activity, hate speech or graphic violence, but the site says it plans to "build out" the appeals process over the coming year.
If Facebook removes a photo, video or post, it will alert the user that the content was removed for violating the site's Community Standards. The notification will also include an option for the user to request an additional review. Facebook says this will trigger a review by a person on its team and will typically take 24 hours.
“If we’ve made a mistake, we will notify you, and your post, photo or video will be restored,” writes Bickert.
Facebook says its Community Standards are rooted in three specific principles: safety, voice and equity. The guidelines published on Facebook’s Community Standards site are divided into six categories:
- Violence and Criminal Behavior.
- Safety.
- Objectionable Content.
- Integrity and Authenticity.
- Respecting Intellectual Property.
- Content-Related Requests.
According to Bickert, Facebook’s Community Standards are developed by subject matter experts located in 11 offices around the world.
“Many of us have worked on the issues of expression and safety long before coming to Facebook. I worked on everything from child safety to counterterrorism during my years as a criminal prosecutor, and other team members include a former rape crisis counselor, an academic who has spent her career studying hate organizations, a human rights lawyer, and a teacher,” writes Bickert.
To enforce the guidelines, Facebook uses a combination of artificial intelligence and reports from people who have flagged posts for potential abuse. The reports are then reviewed by an operations team made up of more than 7,500 content reviewers covering over 40 languages. (Facebook notes its 7,500-person community review staff is 40 percent larger than it was this time last year.)
Bickert says the company's challenges in enforcing its community guidelines are, first, identifying posts that break the rules, and second, accurately applying its policies to the content that gets flagged.
“In some cases, we make mistakes because our policies are not sufficiently clear to our content reviewers; when that’s the case, we work to fill those gaps,” writes Bickert. “More often than not, however, we make mistakes because our processes involve people, and people are fallible.”
Facebook says it is also launching a Community Standards Forum next month that will include a series of public events in the US, Germany, France, the UK, India, Singapore and other countries to get user feedback on its content guidelines.
Opinions expressed in this article are those of the guest author and not necessarily MarTech.