YouTube To Block All Anti-Vaccine Content

Image credit: World Health Organisation

Streaming platform will now block all anti-vaccine content, in a major expansion of its ban on misinformation on Coronavirus vaccines

Google’s YouTube division has this week pledged to block all anti-vaccine content, in a major expansion of its misinformation clampdown.

The streaming platform said it will now block all anti-vaccine content about officially approved vaccines such as the flu or MMR vaccine, and not just anti-vaccine content related to Covid-19.

YouTube has gradually ramped up its campaign against anti-vaccine content. In the first half of 2020 it banned ‘medically unsubstantiated’ claims relating to Covid-19.

Vaccine misinformation

Then in October 2020, when most of the world was in lockdown due to the pandemic, YouTube announced it would also ban misleading content relating to Coronavirus vaccines.

This clampdown was not without some hiccups however.

In January this year YouTube was criticised for booting the British national radio station talkRADIO off its platform.

TalkRADIO is one of the UK’s main news and speech radio stations and regularly hosts government ministers, and has an estimated 400,000 listeners.

A week later YouTube reinstated the channel, but explained that it had removed the radio station after some of its Coronavirus content contradicted health advice and information from the World Health Organisation.

All vaccines

And now this week YouTube has signaled it is prepared to combat misinformation on all types of vaccines – not just Covid-19 vaccines.

The platform made the announcement in a blog post on the matter, in which it confirmed that since last year, it had removed over 130,000 videos for violating its Covid-19 vaccine policies.

“Crafting policy around medical misinformation comes charged with inherent challenges and tradeoffs,” said the platform. “Vaccines in particular have been a source of fierce debate over the years, despite consistent guidance from health authorities about their effectiveness.”

“Today, we’re expanding our medical misinformation policies on YouTube with new guidelines on currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO,” it said.

YouTube said that it has long removed content that promotes harmful remedies, such as saying drinking turpentine can cure diseases.

But the Covid-19 pandemic has forced it to build on these policies, and it said it has worked with experts to develop 10 new policies around Covid-19 and medical misinformation.

“Working closely with health authorities, we looked to balance our commitment to an open platform with the need to remove egregious harmful content,” it said. “We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we’re now at a point where it’s more important than ever to expand the work we started with Covid-19 to other vaccines.”

“Today’s policy update is an important step to address vaccine and health misinformation on our platform, and we’ll continue to invest across the board in the policies and products that bring high quality information to our viewers and the entire YouTube community,” it said.

This means that banned content will now include videos that claim that the flu vaccine causes infertility and that the MMR shot, which protects against measles, mumps, and rubella, can cause autism.

YouTube has already reportedly banned channels associated with several prominent anti-vaccine activists including Robert F. Kennedy Jr. and Joseph Mercola.

Current challenges

Most major tech platforms have been under pressure from governments and health experts around the world to halt the spread of dangerous conspiracy theories, anti-vaccine content, and anti-mask and anti-lockdown content.

In July the US administration targeted YouTube, and not only Facebook, as it sought to stem the spread of misinformation about Covid-19 vaccines.

Earlier that month President Joe Biden had commented that Facebook was “killing people” due to Covid misinformation, but backtracked to put the focus on a dozen users held responsible for spreading two-thirds of the fabrications in question.

He later told reporters he wanted Facebook to “do something about the misinformation, the outrageous misinformation about the vaccine”.

Yet YouTube continues to face a backlash against some of its decisions about which content is harmful.

Only this week YouTube deleted the German-language channels of state-backed broadcaster RT (formerly Russia Today).

RT is a Russian state-controlled television network funded by the federal tax budget of the Russian government.

The move incensed the Russian government, which called it “unprecedented information aggression.”

In retaliation, it threatened to block YouTube in Russia altogether.

Author: Tom Jowitt