Facebook has long tried to explain what is and isn’t allowed on its platform through publicly available community standards, but the company today published what it said are internal guidelines for how it enforces its own rules.
The underlying problem Facebook has wrestled with, almost since its inception, is that the lines between “free speech,” “obscenity,” “objectionable content,” “harassment,” and a host of related terms are blurred, and judgments about where one ends and another begins are inherently subjective and often hugely divisive.
For example, Facebook has long faced criticism over how it deals with photos of breastfeeding mothers, and iconic photos such as the “Napalm girl” have been censored for breaching the company’s broad nudity guidelines.
Elsewhere, Facebook has also become embroiled in debates on issues such as what constitutes “hate speech” and other similar inflammatory content. Facebook often finds itself in the unenviable position of having to play judge and jury on matters that aren’t always clear-cut, and at a scale that isn’t possible for humans to manage in a timely manner.
Takedowns
The new publicly available enforcement guidelines cover violence and criminal behavior, safety, objectionable content, integrity and authenticity, respecting intellectual property, and content-related requests. Most of the sections are fairly self-explanatory and not entirely surprising — for example, if you share copyrighted material such as a video that you do not own, don’t be surprised if it magically disappears. And Facebook isn’t overly keen on content related to terrorism, human trafficking, or organized violence either. But despite its vocal aversion to the spread of fake news, the company won’t remove misinformation because this could be construed as “stifling productive public discourse” and because there is also a “fine line between false news and satire or opinion,” the document states.
The guidelines are quite detailed in parts. For example, the company said it will comply with requests from legal guardians to remove “attacks on unintentionally famous minors.” So presumably if you’re a well-known child actor, you aren’t quite so well protected on the platform.
Ultimately, Facebook wants to be seen as more transparent about its often inconsistent content takedowns, but in all likelihood the move will do little to assuage critics. There are, of course, clear-cut cases that most people would agree have no place on a social network, but other matters carry too much nuance to keep everyone happy. One man’s freedom fighter is another’s terrorist, or so the saying goes.
“We decided to publish these internal guidelines for two reasons,” noted Facebook’s VP for global product management, Monika Bickert, in a blog post. “First, the guidelines will help people understand where we draw the line on nuanced issues. Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines — and the decisions we make — over time.”
Appeals
Related to all of this, Facebook is also rolling out a new appeals process for content that has been removed. The company has faced backlash in the past over how difficult it is to get in touch with someone to contest a takedown. You can’t argue with an algorithm, and an algorithm doesn’t have the nuanced understanding to differentiate between porn and a work of art.
The first feature to address this will be an option to request a manual review for a post that has been flagged for nudity, hate speech, or violence.

Facebook promises that a person will review the post within 24 hours to assess whether the platform’s algorithms have missed the mark.
“We are working to extend this process further by supporting more violation types, giving people the opportunity to provide more context that could help us make the right decision, and making appeals available not just for content that was taken down, but also for content that was reported and left up,” added Bickert. “We believe giving people a voice in the process is another essential component of building a fair system.”
To support its new detailed community standards, Facebook will also be launching a series of events around the world, including in the U.S., U.K., Germany, France, India, and Singapore, where it will solicit feedback from users on the ground.