About time? Google’s video-sharing platform joins Facebook and Twitter, after announcing a ban on content about the QAnon conspiracy theory
YouTube has become the latest tech platform to ban content representing the conspiracy theory QAnon.
QAnon is a far-right conspiracy theory that alleges there is a “deep state” campaign against US President Donald Trump and his supporters. Essentially, QAnon touts the idea that a worldwide cabal of Satan-worshipping paedophiles rules the world, controlling politicians, the media, and Hollywood.
Earlier this month Facebook announced it would ban any pages, groups, and Instagram accounts representing QAnon from its platforms.
Twitter in July also announced that content and accounts associated with QAnon were being permanently suspended, as part of its fight to halt the spread of misinformation on its platform.
And now YouTube is following suit: in a blog post update on Thursday about harmful conspiracy theories, it confirmed it is banning content such as QAnon.
“Today, we are taking another step in our efforts to curb hate and harassment by removing more conspiracy theory content used to justify real-world violence,” it wrote. “Managing misinformation and harmful conspiracy theories is challenging because the content is always shifting and evolving.”
“Nearly two years ago, we took a major step to limit the reach of harmful misinformation by updating our recommendations system,” it said. “This resulted in a 70 percent drop in views coming from our search and discovery systems. In fact, when we looked at QAnon content, we saw the number of views that come from non-subscribed recommendations to prominent Q-related channels dropped by over 80 percent since January 2019.”
YouTube also said that it has “removed tens of thousands of QAnon videos and terminated hundreds of channels under our existing policies, particularly those that explicitly threaten violence or deny the existence of major violent events.”
“All of this work has been pivotal in curbing the reach of harmful conspiracies, but there’s even more we can do to address certain conspiracy theories that are used to justify real-world violence, like QAnon,” it said.
“Today we’re further expanding both our hate and harassment policies to prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence,” it said. “One example would be content that threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate.”
YouTube said it would continue to adapt its policies going forward, and that these changes would “ramp up” in the weeks to come.
It should be noted that in 2019 the FBI designated QAnon as a potential domestic terrorism threat, and that Facebook this year removed a US network of fake accounts linked to QAnon.
Earlier this week YouTube also revealed it was banning misleading content relating to coronavirus vaccines.
YouTube had already banned ‘medically unsubstantiated’ claims relating to Covid-19 on its platform.