Leak reveals Facebook’s rules for controversial content
It’s no secret that Facebook’s judgment calls on risky content are sometimes more than a little problematic. But just what are the rules guiding those decisions? Now we have a much clearer picture: the Guardian has obtained leaked copies of over 100 internal documents outlining Facebook’s rules for handling sensitive content, and it’s clear that the social network is struggling to walk a fine line between freedom of expression and protecting users. At least some of it is understandable, but there are areas where its decision-making might rub you the wrong way.
For instance, its general rule is to allow violent language unless it involves a “credible” threat to an individual or group. Just what that means is a bit fuzzy, however. While threatening political figures is an obvious red flag, a statement like “I’m going to kill you” is considered too generic. Many people use violent language in “facetious and unserious ways,” Facebook says, and even a disturbing tone doesn’t necessarily mean a statement is violating guidelines. But what about situations where the context turns a normally generic threat into a very specific one? That’s not clear.
There are other gray areas. Facebook will sometimes allow videos of violent deaths and self-harm so long as they’re marked as disturbing and can help raise “awareness” of issues like mental health. It also won’t automatically scrub images of non-sexual animal or child abuse, so long as the material is flagged appropriately and doesn’t celebrate or take pleasure in the act. In the case of children, the site will sometimes leave material online to help identify and rescue victims. Again, these policies are understandable, but they still raise questions about when preserving content crosses the line from informative to insensitive.
When it comes to nudity, Facebook has clearly learned some lessons from its Vietnam War photo controversy. It’ll allow some “newsworthy exceptions” to photographic nudity as well as “handmade” nude art, but other photos and digital art are off-limits.
Facebook is up front about the complexities of the issues it’s facing: in a statement to the Guardian, Global Policy Management head Monika Bickert explains that people will have “very different ideas about what is OK to share.” There will always be some ambiguity, she says. We reached out ourselves, and Bickert pointed to the company’s plan to hire 3,000 more moderators, as well as efforts to “make it simpler” to both report and review posts (you can read her full statement below). In short: Facebook knows it’s in a tricky position, and is betting that it can do a better job by devoting more resources to the task.
Whether or not you agree with the rules, Facebook is under a lot of pressure to get things right. With nearly 2 billion users, even seemingly small decisions can have far-reaching implications. And for some governments, Facebook’s current approach doesn’t go far enough: the UK has proposed fining companies that don’t quickly purge material deemed to be hate speech. The current guidelines might not be perfect, but changing them could be just as risky as leaving them alone.
“Keeping people on Facebook safe is the most important thing we do. Mark Zuckerberg recently announced that over the next year, we’ll be adding 3,000 people to our community operations team around the world, on top of the 4,500 we have today, to review the millions of reports we get every week, and improve the process for doing it quickly. In addition to investing in more people, we’re also building better tools to keep our community safe. We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.”