A committee of British lawmakers has called on the government to establish a new regulator to tackle social media companies that break the law.

In a report, the UK Parliament’s Science and Technology Committee warned social media firms that they should have a legal “duty of care” to children.

The report comes amid growing concern from parents about the amount of time children spend glued to their screens. Cases such as that of 14-year-old Molly Russell, who took her own life after viewing self-harm images on Instagram, have only deepened those concerns about online content.

Social media regulations

In its report, the committee warned firms such as Facebook and Snapchat that it is pressing the British government to impose specific regulation on their activities.

It lamented the current “patchwork of regulation and legislation” governing social media firms.

“A principles-based regulatory regime for social media companies should be introduced in the forthcoming parliamentary session,” said the report. “The regime should apply to any site with registered UK users.”

“One of the key principles of the regulatory regime must be to protect children from harm when accessing and using social media sites, while safeguarding freedom of speech (within the bounds of existing law),” said the report. “This principle should be enshrined in legislation as social media companies having a ‘duty of care’ towards its users who are under 18 to act with reasonable care to avoid identified harms.”

The report also added that while “great strides” had been made to address and remove terrorist content, social media firms must now apply “the same effort and determination” to “curb the proliferation online of the physical, emotional and sexual abuse and exploitation of children, as a matter of urgency.”

The report said the Home Secretary’s expectation of a more effective partnership between technology companies, law enforcement agencies, the charity sector and the Government to protect children is not enough.

It said that a “statutory code of practice for social media companies, to provide consistency on content reporting practices and moderation mechanisms, must be introduced.”

The report also called for a new regulator (or an extension of Ofcom’s powers) to govern this area, and said consideration must be given to holding company directors (watch out, Mark Zuckerberg) personally liable for violations.

“A regulator should be appointed by the end of October 2019 to uphold the new regime,” said the report. “It must be incumbent upon the regulator to provide explanatory guidance on the meaning and nature of the harms to be minimised; to monitor compliance with the code of practice; to publish compliance data regularly; and to take enforcement action, when warranted. Enforcement actions must be backed up by a strong and effective sanctions regime, including consideration being given to the case for the personal liability of directors.”

Screen time

The report also touched on concerns about screen time among young people and its effects on their health.

The report called on the government to carry out research to establish who is at risk of experiencing harm online, why, and what the long-term consequences of that exposure are for young people.

The MPs’ report comes after the Royal College of Paediatrics and Child Health (RCPCH) last month concluded that there was no good evidence that screen time is “toxic” to children’s health.

That RCPCH report was the first ever guidance on children’s screen time to be published in the UK, and it found that there “is not enough evidence to confirm that screen time is in itself harmful to child health at any age, making it impossible to recommend age appropriate time limits.”

Instead, the guidance suggests parents approach screen time based on the child’s developmental age, the individual child’s needs, and the value the family places on positive activities such as socialising, exercise and sleep.

Despite this guidance, the amount of time children are spending using tech will no doubt still cause parents worry.

Last year Apple CEO Tim Cook urged parents to stop children using social media. He has, for example, banned his nephew from using social networks.


Tom Jowitt

Tom Jowitt is a leading British tech freelancer and long-standing contributor to Silicon UK. He is also a bit of a Lord of the Rings nut...
