The UK’s revised online safety bill proposes fines of 10 percent of revenue if social networks fail to block racist, sexist content
The UK government’s revised Online Safety Bill proposes fines of 10 percent of revenue if tech platforms breach pledges to block sexist and racist content.
The revised bill, announced on Tuesday, will include stronger protections for children and will make encouraging self-harm a criminal offence.
But the revised bill has dropped the harmful communications offence. Instead, platforms such as Facebook and Twitter must give users the option of avoiding content that is harmful but does not constitute a criminal offence — content that could include, for example, racism, misogyny or the glorification of eating disorders.
The government said the bill’s “legal but harmful” provisions are to be replaced with new duties to boost free speech and increase the accountability of tech firms.
The government said removing these provisions from the Online Safety Bill also addresses any incentive for social media firms to over-remove people’s legal online content.
“Firms will still need to protect children and remove content that is illegal or prohibited in their terms of service; however, the Bill will no longer define specific types of legal content that companies must address,” said the Department for Digital, Culture, Media & Sport.
The government said this removes any influence future governments could have on what private companies do about legal speech on their sites, or any risk that companies are motivated to take down legitimate posts to avoid sanctions.
New measures, to be introduced through government amendments, will also make social media platforms more transparent and accountable to their users.
The revised Online Safety Bill means that social media firms will still be legally required to remove illegal content, take down material in breach of their own terms of service, and provide adults with greater choice over the content they see and engage with.
Ofcom, the communications regulator, will have the power to fine companies up to 10 percent of global turnover for breaches of the act.
The government said parents and the wider public will benefit from new changes forcing tech firms to publish more information about the risks their platforms pose to children, so people can see what dangers sites really hold.
Firms will be required to show how they enforce their user age limits, to stop children circumventing age-verification methods, and will have to publish details of when the regulator Ofcom has taken action against them.
“Unregulated social media has damaged our children for too long and it must end,” said Digital Secretary Michelle Donelan. “I will bring a strengthened Online Safety Bill back to Parliament which will allow parents to see and act on the dangers sites pose to young people.”
“It is also freed from any threat that tech firms or future governments could use the laws as a licence to censor legitimate views,” said Donelan.
“Young people will be safeguarded, criminality stamped out and adults given control over what they see and engage with online,” said Donelan. “We now have a binary choice: to get these measures into law and improve things or squabble in the status quo and leave more young lives at risk.”
The draft legislation is now due to return to the House of Commons next week (5 December) when British lawmakers will resume scrutiny of the wide-ranging proposals.
The harmful communications offence was dropped from the legislation after criticism from Conservative MPs that it was legislating for “hurt feelings”.
Other changes to the bill include criminalising the encouragement of self-harm, a change introduced after the inquest into the death of 14-year-old Molly Russell, who died in 2017 after viewing extensive amounts of harmful material on Instagram and Pinterest.