YouTube ‘Major Conduit’ Of Misinformation, Fact-checkers Allege


Group of more than 80 fact-checking organisations write open letter to YouTube’s CEO, calling platform a major conduit of misinformation

Alphabet’s YouTube division is accused of being a major conduit of online misinformation, and of not doing enough to tackle and take down falsehoods.

This is the central allegation laid against it in an open letter to YouTube’s chief executive, Susan Wojcicki.


The letter comes from more than 80 fact-checking organisations around the world, including Full Fact, who call on YouTube “to take effective action against disinformation and misinformation”, because “up until now (it has) largely escaped criticism despite fact checkers seeing problematic content daily on the platform.”

Open letter

The letter is addressed to YouTube CEO Susan Wojcicki and starts by noting that it has been almost two years now since the Covid-19 pandemic started.

“The world has seen time and time again how destructive disinformation and misinformation can be for social harmony, democracy, and public health; too many lives and livelihoods have been ruined, and far too many people have lost loved ones to disinformation,” the letter states.

“As an international network of fact-checking organisations, we monitor how lies spread online – and every day, we see that YouTube is one of the major conduits of online disinformation and misinformation worldwide,” the organisations wrote. “This is a significant concern among our global fact-checking community.”

They allege that they do not see much effort by YouTube to implement policies that address the problem.

“On the contrary, YouTube is allowing its platform to be weaponised by unscrupulous actors to manipulate and exploit others, and to organise and fundraise themselves,” it said. “Current measures are proving insufficient.”

The letter states that in the past year, fact checkers have seen conspiracy groups thriving and collaborating across borders, including an international movement that started in Germany, jumped to Spain and spread through Latin America, all on YouTube.

It also pointed out that millions of other users have been watching videos in Greek and Arabic that encouraged them to boycott vaccinations, or treat their Covid-19 infections with bogus cures.

Beyond Covid-19, YouTube videos have been promoting false cures for cancer for years, the letter points out.

It said that in Brazil, for example, YouTube has been used to amplify hate speech against vulnerable groups, reaching tens of thousands of users.

The letter also alleged that YouTube is being used to spread lies and misinformation about elections around the world.

“The whole world witnessed the consequences of disinformation when a violent mob assaulted the US Capitol last year,” the letter stated. “From the eve of the US presidential election to the day after, YouTube videos supporting the ‘fraud’ narrative were watched more than 33 million times.”

The letter said misinformation examples are too many to count, and it therefore proposes “some solutions that would make a great deal of difference in reducing the dissemination of disinformation and misinformation on YouTube.”

These include:

  • A commitment to meaningful transparency about disinformation on the platform: YouTube should support independent research about the origins of the different misinformation campaigns, their reach and impact, and the most effective ways to debunk false information. It should also publish its full moderation policy regarding disinformation and misinformation, including the use of artificial intelligence and which data powers it.
  • Beyond removing content for legal compliance, YouTube’s focus should be on providing context and offering debunks, clearly superimposed on videos or as additional video content. That can only come from entering into a meaningful and structured collaboration, taking responsibility and systematically investing in the independent fact-checking efforts around the world that are working to solve these issues.
  • Acting against repeat offenders that produce content that is constantly flagged as disinformation and misinformation, particularly those monetizing that content on and outside the platform, notably by preventing its recommendation algorithms from promoting content from such sources of misinformation.
  • Extending current and future efforts against disinformation and misinformation to languages other than English, and providing country- and language-specific data, as well as transcription services that work in any language.

“We hope you will consider implementing these ideas for the public good and to make YouTube a platform that truly does its best to prevent disinformation and misinformation being weaponised against its users and society at large,” the letter states. “We are ready and able to help YouTube. We wish to meet with you to discuss these matters and find ways forward on a collaboration and look forward to your response to this offer.”

YouTube accused

Tech platforms have been facing pressure from Western governments for a while now to crack down on misinformation.

In November 2020, big-name tech platforms teamed up with the UK government to try to find new ways to combat Coronavirus misinformation.

Full Fact at the time co-ordinated a collaboration between Facebook, Twitter, Google and the UK government, among others, to tackle misinformation.

But questions remain as to how effective it has been, and whether firms such as YouTube are doing enough to tackle the issue.

In July 2021 the US administration reportedly targeted YouTube and Facebook, as it sought to stem the spread of misinformation about Covid-19 vaccines amidst renewed pandemic pressure.

Earlier that month President Joe Biden had commented that Facebook was “killing people” due to Covid misinformation, but later backtracked to put the focus on a dozen users held responsible for spreading two-thirds of the fabrications in question.

But the US administration is also taking YouTube to task over the matter, with both Facebook and the Google-owned video-sharing site having shown “inconsistent enforcement”, an unnamed senior administration official previously told Reuters.

Misinformation crackdown

To be fair, YouTube has taken enforcement action in the past.

In April 2020, for example, YouTube banned all conspiracy theory videos that falsely linked 5G networks to the spread of Coronavirus.

It also banned ‘medically unsubstantiated’ claims relating to Covid-19, and in October 2020 it went one step further to ban misleading content relating to vaccines.

In September 2021 YouTube deleted the German-language channels of state-backed broadcaster RT (formerly Russia Today) for repeatedly breaching its Covid-19 misinformation policy.

RT is a Russian state-controlled television network funded by the federal tax budget of the Russian government.

The move has incensed the Russian government, which in retaliation has threatened to block YouTube.

Meanwhile it has been reported that Meta, Twitter and YouTube have all partnered with fact-checkers in some capacity over the years.

Meta’s effort is thought to be the most robust, as it relies on some 80 fact-checking organisations, certified by the International Fact-Checking Network, working across 60 languages globally.

YouTube meanwhile collaborates with hundreds of publishers to direct users in several countries to authoritative information in fact-check panels.