Online Safety Bill Changes Urged By MPs

Changes required. Major changes to the draft of the UK’s Online Safety Bill are being urged by a joint committee of MPs and peers to tackle online harms

The government is facing pressure to make major changes to the draft of its Online Safety Bill that seeks to toughen up the UK’s laws for the online world.

A new report from a government-created committee of MPs and peers has said sweeping changes are needed to tackle an industry that has become the ‘land of the lawless’, the Guardian reported.

The UK’s Online Safety Bill primarily seeks to create a duty of care, obliging tech companies to take action to remove illegal content and to report it to the authorities.

Online safety bill

It also seeks to ramp up online protections for both adults and children, including stopping children from accessing pornography.

Another controversial part of the bill concerns end-to-end encryption, which governments and law enforcement have campaigned against for years.

The bill will see new rules being enforced by Ofcom, which would have the power to impose multi-million pound fines or bar companies from operating in the UK.

Ofcom, for example, would be able to levy unprecedented fines of up to £18m or 10 percent of global turnover.

But the committee’s report into the proposed bill has indicated the need for changes before this can happen.

It should be noted that the government actually created this cross-party committee in the first place to subject the draft bill to “pre-legislative scrutiny.”

“As we and our witnesses have reiterated time and again, online safety is one of the most complex and most fundamental policy issues of our age,” said the report. “We hope that even those who do not agree with our conclusions will accept that the Bill will be the better for these issues being aired in the collaborative environment of a Joint Select Committee, prior to consideration on the floors of both Houses.”

“The major online services have become central to many of our lives, but as their power has grown, there has been no meaningful increase in their public accountability,” said the report. “The harmful user experiences which have emerged on many platforms have been allowed to run unchallenged and unchecked for too long. Now is the time to see platforms held to account for harms which arise from the decisions they make about their systems and processes, the operation of their services, and their corporate governance structures.”

“Online services can and should be accountable to the Regulator, and thus ultimately to the public, about the impact of the decisions that they make,” said the report. “Our hope is that this Bill will make this kind of meaningful accountability possible, and in turn that more accountability will lead to better systems and processes, better operation of services, better corporate governance structures at the major platforms, and a better experience for users who can make informed decisions about the services that they use.”

The inquiry into the proposed bill heard from over 50 witnesses across 11 meetings, held four roundtables, and received over 200 pieces of written evidence.

What changes

The report recommends a wide range of proposals to amend the legislation.

This includes creating a new criminal offence for cyberflashing, punishing tech platforms for hosting fraudulent adverts, and exempting news organisations from content takedowns, the Guardian reported.

The recommendations by the joint committee will tackle an industry that has become the “land of the lawless”, the committee’s Conservative chair, Damian Collins MP, reportedly said.

“A lack of regulation online has left too many people vulnerable to abuse, fraud, violence and in some cases even loss of life,” he was quoted by the Guardian as saying.

Other changes urged by the report include:

  • Creating a digital ombudsman to deal with complaints by individuals against platforms
  • Introducing codes of conduct that tackle “rabbit hole” algorithms
  • Requiring tech companies to implement mechanisms to deal with harmful anonymous accounts
  • Social media companies having to reveal how many underage users are on their platforms

Game changing?

The proposed changes have been welcomed by Dr Rachel O’Connell, founder and CEO of the age-verification platform TrustElevate.

“The Joint Committee’s recommendations on the Online Safety Bill are potentially game-changing,” noted Dr O’Connell. “Specifically, that platforms’ systems should be regulated, not just content. This shift in focus is hugely significant as it is the design choices and data-driven operations that platforms make that regulators should scrutinise.”

“These include AI-driven recommendation engines that actively connect children to adults with a sexual interest in children, and content relating to suicide to children and young people who feel depressed,” said Dr O’Connell.

“These insidious bad practices go to the heart of the dangers that platforms pose – the choices that platforms make when deploying AI to enable frictionless interactions and active surfacing of harmful content to ensure eyeballs remain glued to the platforms so that more advertising can be viewed,” said Dr O’Connell.

“The Joint Committee’s recommendation suggests that these practices be regulated and calls for the regulator to have powers to audit companies’ decision-making and hold them to account,” said Dr O’Connell. “Policing platforms, enforcing the principles through audits, and ensuring accountability and transparency are key.”

“Suppose these recommendations are accepted and included in the Online Safety Bill,” Dr O’Connell concluded. “In that case, this could represent a significant step forward, combined with age verification, toward creating a safer internet for children and young people.”

Not straightforward

But others, such as Robin Wilton, director of Internet Trust at the Internet Society, have highlighted the difficulties of trying to tackle societal problems with legislation.

“The parliamentary joint committee’s findings show that attempting to tackle societal problems with tech legislation is nowhere near as straightforward as the Home Office makes it seem,” said Wilton. “The conclusions confirm what experts have been telling the government for years: there are no simple solutions to these issues, and the ones being brought forward will make us all less safe.”

“Case in point, despite having already legislated for age verification (in the Digital Economy Act 2017), the government has yet to come up with a way to make it work,” said Wilton.

“Regardless of whether the content is illegal or ‘legal but harmful’, another issue highlighted by the committee, there is no way to monitor content without breaking the security and privacy of end-to-end encrypted services,” said Wilton.

“There is a massive reality gap between the complexity of the issues the Online Safety Bill tries to address and the simplistic approach being taken,” said Wilton. “If the government is truly interested in online safety, it cannot continue this ‘legislate now and fix later’ approach.”