Facebook’s secret guidelines and rules for deciding what its users can post on the site are revealed for the first time in a Guardian investigation that will fuel the global debate about the role and ethics of the social media giant.
The Guardian has seen more than 100 internal training manuals, spreadsheets and flowcharts that give unprecedented insight into the blueprints Facebook has used to moderate issues such as hate speech, violence, terrorism, pornography, racism and self-harm.
The internal manuals reveal how the site tries to strike a balance between allowing cries for help and discouraging copycat behavior. Facebook will allow users to livestream attempts to self-harm.
There are even guidelines on cannibalism and match-fixing. The Facebook Files give the first view of the codes and rules formulated by the site, which is under huge political pressure in Europe and the US.
They illustrate the difficulties faced by executives reacting to new challenges such as “revenge porn” – and the pressure on moderators, who say they are overwhelmed by the volume of work, which often leaves them “just 10 seconds” to make a decision.
Many moderators have concerns about the inconsistency and strange nature of some of the policies. The guidelines on sexual content, for instance, are said to be the most complex and confusing.
These blueprints may alarm free speech advocates, and both sides of the debate are likely to demand greater transparency.
Videos of violent deaths can remain on the site, marked as disturbing, because they can help create awareness of issues such as mental illness.
Photos of animal abuse can be shared, with the most extreme imagery marked as “disturbing”.
Facebook will allow people to livestream attempts to self-harm as it “doesn’t want to punish or censor people in distress”.
In one of the leaked documents, Facebook acknowledges “people use violent language online to express frustration” and feel “safe to do so” on the site.