Social media network updates its Community Standards to tackle offensive content.
Facebook has released the latest update to its Community Standards, explaining what its one billion users can and cannot share on the platform.
The company’s attempts to strike a balance between freedom of expression and the blocking of offensive content have been detailed in revised guidelines covering a variety of topics, including bullying, threats of violence, self-harm and hate speech.
Facebook has gone to war against posts involving sexual violence threats, self-harm encouragement and the support of terrorist organisations.
The company, which began as Mark Zuckerberg’s university project, will also remove hate speech that attacks people based on their race, ethnicity, national origin, religious affiliation or sexual orientation.
Facebook will monitor users’ posts and review specific items before deleting them. This is to ensure that images shared in good faith, such as those from a campaign to fight child slavery, are not deleted.
Governments around the world will also be able to request that Facebook remove content from the platform. Under the new standards, the platform will not necessarily take the content down, but may restrict access to it in the requesting country.
A Global Government Requests Report covering the second half of 2014 has also been published, revealing an 11% increase in government requests for content restrictions, from 8,774 to 9,707. Turkey and Russia led in such requests, while countries including Pakistan saw declines.
Facebook adds that it will "continue to push governments around the world to reform their surveillance practices in a way that maintains the safety and security of their people while ensuring their rights and freedoms are protected."