Ofcom, the UK regulator in charge of enforcing the UK’s controversial Online Safety Act, has noted the link between social media posts and the riots in late July and early August that followed a mass stabbing in Southport in which three girls were murdered.
In an open letter to the government, Ofcom chief executive Melanie Dawes noted that the regulator “spoke directly and in detail with many of the largest social media and messaging platforms from early August, to find out their response to the riots.”
The letter added that when Ofcom’s Online Safety Act powers come into force early next year “we will be able to consider whether to open an investigation into any future incidents of this kind, and take enforcement action if we have evidence that a breach of the rules may have occurred.”
Ofcom’s Dawes also wrote that “these events (the riots) have clearly highlighted questions tech firms will need to address as the duties come into force.”
“While some told us they took action to limit the spread of illegal content, we have seen evidence that it nonetheless proliferated, and appears to have contributed to the significant violent disorder which followed the tragic murders in Southport,” Dawes wrote.
“Of the numerous convictions which have followed, some have been found guilty of online posts threatening death or serious harm, stirring up racial hatred, or sending false information with intent to cause harm.”
The letter comes after Ofcom earlier this month issued a stark warning to big tech platforms about their content moderation practices, with Dawes saying the regulator would take “strong action” against tech companies that break new rules on content moderation.
But after the summer riots, Dawes said the regulator’s assessment, based on information provided by tech firms and other stakeholders, was as follows:
Illegal content and disinformation spread widely and quickly online following the attack. Some of the posts showed “malicious intent” and sought “to influence public opinion and reaction,” and “some platforms were used to spread hatred, provoke violence targeting racial and religious groups, and encourage others to attack and set fire to mosques and asylum accommodation.”
Dawes noted that some accounts (including some with over 100,000 followers) falsely stated that the attacker was a Muslim asylum seeker and shared unverified claims about his political and ideological views. “Posts about the Southport incident and subsequent events from high-profile accounts reached millions of users, demonstrating the role that virality and algorithmic recommendations can play in driving divisive narratives in a crisis period.”
Dawes also stated that “there was a clear connection between online activity and violent disorder seen on UK streets.”
She added that “most online services took rapid action in response to the situation, but responses were uneven.”
Dawes said she was “confident that, had the draft Codes been in force at the time, they would have provided a firm basis for urgent engagement with services on the steps they were taking to protect UK users from harm.”
It is not clear whether Dawes was referring to Elon Musk when she spoke of posts from “high-profile accounts” reaching millions of users and driving divisive narratives.
Elon Musk had clashed directly with the British government over the summer, framing as free speech the social media posts that made false claims and fuelled riots and violence.
Musk had accused Prime Minister Sir Keir Starmer’s Labour government of censorship and recently falsely claimed it was “releasing convicted paedophiles in order to imprison people for social media posts” – a reference to a policy of releasing some offenders early to ease prison overcrowding.
Elon Musk’s X (as well as Telegram) had been blamed for enabling the spread of misinformation and for being notably slow to act on harmful material.
Indeed, Musk repeatedly relayed posts containing misinformation to his 193 million followers on the service, and criticised the British government for cracking down on hate speech.
In August Ofcom had urged social media companies to take action against posts inciting violence, and the government warned it may revisit the Online Safety Act.
The media regulator previously urged platforms to address content that depicted “hatred and disorder” and promoted violence and misinformation.