BBC
New York
Facebook is providing the public with more information about what material is banned on the social network.
Its revamped community standards now include a separate section on “dangerous organisations” and give more details about what types of nudity it allows to be posted. The US firm said it hoped the new guidelines would provide “clarity”.
One of its safety advisers praised the move but said it was “frustrating” that other steps had not been taken.
Facebook says about 1.4 billion people use its service at least once a month.
The new guide will replace the old one on the firm’s website, and will be sent to users who complain about others’ posts. Monika Bickert, Facebook’s global head of content policy, said the rewrite was intended to address confusion about why some takedown requests were rejected.
“We [would] send them a message saying we’re not removing it because it doesn’t violate our standards, and they would write in and say I’m confused about this, so we would certainly hear that kind of feedback,” she told the BBC.
“And people had questions about what we meant when we said we don’t allow bullying, or exactly what our policy was on terrorism.
“[For example] we now make clear that not only do we not allow terrorist organisations or their members within the Facebook community, but we also don’t permit praise or support for terror groups or their acts or their leaders, which wasn’t something that was detailed before.”
Ms Bickert stressed, however, that the policies themselves had not changed. The new version of the guidelines runs to nearly 2,500 words, about three times the length of its predecessor.
The section on nudity, in particular, is much more detailed than the vague talk of “limitations” that featured previously.
Facebook adds that the restrictions extend to digitally created content, unless posts serve an educational or satirical purpose. Likewise, text-based descriptions of sexual acts that contain “vivid detail” are forbidden.