The Facebook Oversight Board has once again urged the social networking giant to be more open about its content moderation decisions.
Often described as Facebook’s (now Meta’s) Supreme Court, the board overturned two decisions made by Meta to remove user posts from the platforms.
In both cases, the board said the content did not violate Facebook’s policies, and it again recommended that the firm provide more information to users about actions it takes on their content.
The Oversight Board on Thursday published the two rulings overturning Meta’s decisions to remove user posts from its platforms, as part of its own transparency report for the third quarter (July to September) of 2021.
This report revealed that the number of cases submitted to the board had risen significantly during the third quarter, and there had been an increase in cases related to violence and incitement.
“In around half of the cases shortlisted by the Board in Q3 2021, Meta decided its original decision to remove the content was incorrect,” noted the board.
This happens when the board asks Meta if particular cases are eligible for review, and then Meta sometimes realises that its original decision to remove content was incorrect.
But then the board addressed the need for greater transparency in two cases.
“In addition to issuing our Q3 transparency report, today we have also published two new case decisions which call for more transparency around Meta’s policies and how they are enforced,” it said.
The first case concerned an Instagram post about ayahuasca, a psychoactive brew with spiritual uses. “We repeated our call for Meta to explain to users that it enforces the Facebook Community Standards on Instagram, with several specific exceptions,” the board said.
The board ordered Instagram to restore the post.
In the second case, a decision about a wampum belt, a North American Indigenous art form, the board said: “We asked Meta to assess how accurately its reviewers enforce exceptions on artistic expression and expression about human rights violations.”
The offending post was meant to raise awareness about the discovery of unmarked graves at a former residential school for Indigenous children.
Facebook’s automated systems identified the post as potentially violating its hate speech policy and a human reviewer removed it the day after it was posted; when the user appealed the decision, a second human reviewer affirmed the decision to remove.
When the Oversight Board selected the case for review, Meta identified the post’s removal as an “enforcement error” and restored it on August 27, according to the board’s Thursday statement.
In both decisions, the board recommended that Meta provide more information to users about actions it takes on their content.
This is not the first time that the Oversight Board has called for more transparency from Meta/Facebook.
In the autumn, for example, the board began an investigation into reports about the platform’s internal system used to deal with elite or high-profile public figures.
Facebook’s ‘Cross-check’ (Xcheck) system was alleged to have allowed public figures to flout the content rules of the online platform.
It was claimed that Facebook was giving special treatment to well-known users including celebrities, politicians, sportsmen and journalists by putting them in a separate ‘whitelist’ system, allowing them to break Facebook rules.
Under the programme, some users were reportedly “whitelisted” – i.e. not subject to enforcement action – while others were allowed to post material that violated Facebook rules, pending content reviews that often did not take place.
In October the Oversight Board slammed Facebook, saying it had not been fully forthcoming with the Board on its ‘cross-check’ system, which the company uses to review content decisions relating to high-profile users.
But perhaps the most high-profile case the Oversight Board has dealt with concerns former US President Donald Trump.
In early May the board upheld Facebook’s suspension of Donald Trump, after it implemented an ‘indefinite’ ban of the former US President in January this year.
This was because of Trump’s role in inciting a mob of his supporters to storm the US Capitol building on Wednesday 6 January, which resulted in the deaths of five people (including one police officer who was beaten to death).
However the board did criticise Facebook at the time for its language, among other things.