Facebook Manipulated Media Policy ‘Incoherent’, Board Finds


Facebook Oversight Board finds policy on manipulated media ‘incoherent’ amidst concern over election misinformation

Facebook parent Meta Platforms has said it is “reviewing” findings from its own independent Oversight Board that criticised its manipulated media policy as “incoherent” and recommended changes to prevent election misinformation.

The company said it would respond within 60 days to the board’s report into its decision not to remove a manipulated video of US president Joe Biden.

The video, which used October 2022 footage of Biden with his granddaughter, was edited to suggest he touched her inappropriately.

The board agreed with Meta’s decision not to remove the video because it did not violate its policy, which only covers video manipulated with AI to show people saying things they did not say.

Image credit: Tara Winstead/Pexels

‘Incoherent’ policy

Because the video was edited without AI and showed someone doing something they did not do, it did not violate the policy, the board said.

In addition, the edits were “obvious and therefore unlikely to mislead” most users, it said.

“Nevertheless, the Board is concerned about the Manipulated media policy in its current form, finding it to be incoherent, lacking in persuasive justification and inappropriately focused on how content has been created rather than on which specific harms it aims to prevent (for example, to electoral processes),” the board wrote.

“Meta should reconsider this policy quickly, given the number of elections in 2024.”

Media labelling

The board said the policy should be rewritten to apply regardless of the method of creation.

The board also called for more labelling of manipulated media, especially in cases where the media cannot be removed for breaking policies.

In such cases Meta should “apply a label indicating the content is significantly altered and may mislead”, the board said.

The Oversight Board said Meta representatives had told it the company “plans to update the Manipulated Media policy to respond to the evolution of new and increasingly realistic AI”.