Instagram accidentally banned a post criticizing solitary confinement because Facebook had misplaced the internal guidance that allowed such criticism, according to a new Facebook Oversight Board (FOB) decision.
The semi-independent Oversight Board says the Facebook-owned site should not have removed a post about Abdullah Öcalan, a founding member of the militant Kurdistan Workers’ Party (PKK). Facebook designates Öcalan and the PKK “dangerous entities” that users cannot support on its platforms. In January, moderators applied that policy to a message criticizing Öcalan’s imprisonment and solitary confinement in a Turkish prison, a practice the United Nations has deemed a form of torture.
The user appealed to the Oversight Board, which agreed to examine the case. During that review, Facebook apparently “found that a piece of internal guidance on the Dangerous Individuals and Organizations policy was ‘inadvertently not transferred’ to a new review system in 2018.” The guidance had been developed in 2017, partly in response to the debate over Öcalan’s living conditions, and it “allows discussion on the conditions of confinement for individuals designated as dangerous.” But that guidance was never disclosed to Facebook or Instagram users, and Facebook only realized it had dropped out of the moderation guidelines altogether when the user appealed.
“Had the board not selected this case for review, the guidance would have remained unknown to content moderators, and a significant amount of expression in the public interest would have been removed,” the FOB’s decision states. Facebook declined to comment on whether it considers that assessment accurate.
“The Board is concerned that specific guidance for moderators on an important policy exception was lost for three years,” the FOB’s decision continues. While Facebook restored the post about Öcalan in April, it told the board it was “not technically feasible” to see how many other posts might have been removed because moderators weren’t aware of the exception. “The Board believes that Facebook’s mistake may have led to many other posts being wrongly removed and that Facebook’s transparency reporting is not sufficient to assess whether this type of error reflects a systemic problem.”
The FOB decision broadly pushes Facebook to make its rules more transparent. “This case demonstrates why public rules are important for users: they not only inform them of what is expected, but also empower them to point out Facebook’s mistakes,” it says. The decision says Facebook is reviewing how the guidance failed to transfer, and the FOB has offered a series of further, optional recommendations. They include checking whether any other policies were lost along the way, as well as publicly clarifying the limits of its ban on supporting “dangerous individuals and organizations.”
Social networks like Facebook and Instagram often keep parts of their policies secret, arguing that publishing fully precise moderation rules would let trolls and other bad actors game the system. But as the FOB notes, that kind of secrecy on a huge, diffuse service can also make miscommunication easier, as it apparently did in this case.