Facebook says it accidentally lost a moderation rule covering “dangerous” figures


Instagram wrongly removed a post criticizing solitary confinement because Facebook had misplaced part of its moderation policy, according to a decision by the new Facebook Oversight Board (FOB).

The semi-independent Oversight Board says the Facebook-owned site should not have removed a post about Abdullah Öcalan, a founding member of the militant Kurdistan Workers’ Party (PKK). Facebook designates Öcalan and the PKK as “dangerous entities” that users cannot support on its platforms. In January, moderators applied this policy to a post criticizing Öcalan’s imprisonment and solitary confinement in a Turkish prison, a practice the UN considers a form of torture.

The user appealed to the Oversight Board, which agreed to review the case. As it did, Facebook apparently found that internal guidance on its dangerous individuals and organizations policy was “inadvertently not transferred” to a new review system in 2018. The guidance had been developed in 2017, partly in response to debate over Öcalan’s detention conditions, and it allows discussion of the conditions of confinement of people designated as dangerous. The internal guidance was never disclosed to Facebook or Instagram users, however, and Facebook only realized it had lost the guidance entirely when the user appealed.

“Had the Board not selected this case for review, the guidance would have remained unknown to content moderators, and a significant amount of expression in the public interest would have been removed,” the FOB decision states. Facebook declined to say whether it considered that assessment accurate.

“The Board is concerned that guidance to moderators on an important policy exception was lost for three years,” the FOB decision continues. Although Facebook restored the post about Öcalan in April, it told the board it was not “technically feasible” to determine how many other posts may have been removed because moderators were unaware of the exception. “The Board believes that Facebook’s error may have led to many other posts being wrongly removed, and that Facebook’s transparency reporting is insufficient to assess whether this type of error reflects a systemic problem.”

The FOB decision broadly pushes Facebook to make its rules more transparent. “This case shows why public rules matter to users: they not only inform them of what is expected, they also allow them to point out Facebook’s mistakes,” it says. According to the decision, Facebook will review how the guidance failed to transfer, and the FOB has offered a number of other non-binding recommendations. These include conducting a review process to determine whether any other policies were lost, as well as publicly clarifying the scope of its ban on “dangerous individuals and organizations.”

Social networks like Facebook and Instagram often keep parts of their policies secret, arguing that publishing fully detailed moderation rules would let trolls and other bad actors game the system. But as the FOB points out, that kind of secrecy in a huge, decentralized operation can enable miscommunication, as it apparently did in this case.
