Facebook has not applied one of its content standards for 3 years

Facebook and Instagram have failed to apply one of their content standards since 2018: a rule that establishes an exception allowing users to discuss the human rights of people or organizations designated as dangerous. The warning comes from Facebook's Content Advisory Council, the body of independent experts that advises Facebook on complex moderation cases.


The warning came in the context of an Advisory Council ruling on the removal of an Instagram post from January of this year that encouraged discussion of the solitary confinement of Abdullah Öcalan, a founding member of the Kurdistan Workers' Party (PKK), in a Turkish prison.


Because of the PKK's use of violence, both the party and Öcalan are designated as dangerous entities under Facebook's policy on dangerous individuals and organizations, so the post was deleted and Facebook refused to reinstate it.


However, in a ruling published this Thursday, Facebook's Content Advisory Council warned that "by accident" a part of Facebook's policy on dangerous individuals, created in 2017, was not transferred to the review system its moderators have used since 2018.


This guideline, which "opens the debate on the prison conditions of people designated as dangerous", caused the content to be allowed again on Instagram on April 23.


Facebook also informed the Advisory Council that it is "working on an update of its policies in order to allow users to discuss the human rights of persons designated as dangerous." The deleted post did not support Öcalan or the PKK; it only encouraged debate about his human rights in connection with his imprisonment.


Facebook is not consistent with its own standards


In its conclusions, the Council said that Facebook's decision to remove the content "was not consistent" with its own rules, and stated that it is concerned "that Facebook overlooked a specific guideline on an important policy exception for three years."


The Council believes that, because of Facebook's error, many other posts may have been wrongly removed since 2018, and that Facebook's transparency reports are not sufficient to assess whether this type of error reflects a systemic problem.


For that reason, the independent experts have asked Facebook to reinstate the missing 2017 guideline immediately, to evaluate its review processes for posts about dangerous individuals and organizations and publish the results, and to ensure that its moderators receive adequate training.


The Council has also asked Facebook to explain more clearly in its policies when it considers that a post supports a dangerous leader, and how users can make their intentions clear.
