YouTube updates its medical misinformation policy and starts removing false claims about cancer treatment
Alphabet’s YouTube is updating its medical misinformation policy to deal with false cancer treatment claims.
In a blog post on Tuesday, Dr Garth Graham (YouTube’s Global Head of Healthcare) and Matt Halprin (VP and Global Head of Trust and Safety) said the platform is “thinking about the future of medical misinformation policies, including removing cancer misinformation.”
This decision may surprise some readers, who might reasonably expect the platform to already tackle false cancer treatments – treating them along the same lines as false claims about vaccines or abortion, or content that promotes eating disorders.
This year there have been allegations that some people on the platform have been scamming viewers by claiming to have terminal cancer, then reaping donations after ‘making’ a miraculous recovery.
But now it seems that YouTube is playing catch-up and will clamp down on false claims about cancer treatments.
“In the years since we began our efforts to make YouTube a destination for high-quality health content, we’ve learned critical lessons about developing Community Guidelines in line with local and global health authority guidance on topics that pose serious real-world risks, such as misinformation on Covid-19, vaccines, reproductive health, harmful substances, and more,” wrote Dr Graham.
YouTube has been tightening its policies on health content posted to the platform over the past few years.
In the first half of 2020, the platform banned ‘medically unsubstantiated’ claims relating to Covid-19.
Then in October 2020, when much of the world was in lockdown due to the pandemic, YouTube announced it would also ban misleading content relating to coronavirus vaccines.
In September 2021, YouTube said it would block all anti-vaccine content, in a major expansion of its ban on misinformation about coronavirus vaccines.
“As medical information – and misinformation – continuously evolves, YouTube needs a policy framework that holds up in the long term, and preserves the important balance of removing egregiously harmful content while ensuring space for debate and discussion,” Dr Graham said this week.
“Moving forward, YouTube will streamline dozens of our existing medical misinformation guidelines to fall under three categories – Prevention, Treatment, and Denial,” Dr Graham said.
These policies will apply to specific health conditions, treatments, and substances where content contradicts local health authorities or the World Health Organization (WHO).
“Our goal is to make the rules of the road clearer and have a more transparent framework in place for evaluating whether certain diseases or conditions would be included in the future,” said Dr Graham.
“Starting today and ramping up in the coming weeks, we will begin removing content that promotes cancer treatments proven to be harmful or ineffective, or content that discourages viewers from seeking professional medical treatment,” said Dr Graham.
“This includes content that promotes unproven treatments in place of approved care or as a guaranteed cure, and treatments that have been specifically deemed harmful by health authorities. For instance, a video that claims ‘garlic cures cancer,’ or ‘take vitamin C instead of radiation therapy’ would be removed.”
YouTube also plans to promote cancer-related content from the Mayo Clinic and other authoritative sources.
Most major tech platforms in recent years have faced pressure from governments and health experts to halt the spread of dangerous conspiracy theories, anti-vaccine content, and anti-mask and anti-lockdown content.