Google, Facebook Urged To Do More To Tackle Fake News

Image credit: European Commission

European Commission says tech giants must do more to tackle fake news, after submission of self-assessment reports

The European Commission has published the first annual self-assessment reports by Facebook, Google, Microsoft, Mozilla, Twitter and seven European trade associations.

The self-assessments were part of the commitment these firms made when they signed the EU Code of Practice against disinformation in October 2018, under which they were also asked to report monthly on their actions.

The European Commission has previously welcomed the progress made by the big three in the fight against disinformation, but it continues to warn that the tech giants are not doing enough on the matter.


Slow progress

In February the Commission was forced to rebuke Facebook, Google and Twitter over their efforts to crack down on fake news.

There has been widespread concern at social media platforms being used to spread disinformation. These tech firms have been asked to tighten up on ad placement, transparency of political advertising, closure of fake accounts, and identifying automated bots.

And now the European Commission has said that there has been “improved transparency”, and “closer dialogue with platforms as regards their policies against disinformation.”

“While progress has been reported on the commitments monitored by the Commission from January to May ahead of the 2019 European Parliament elections, less is reported on the implementation of the commitments to empower consumers and the research community,” the Commission said. “The provision of data and search tools is still episodic and arbitrary and does not respond to the needs of researchers for independent scrutiny.”

“The scope of actions undertaken by each platform to implement their commitments vary significantly,” it added.

Improvement needed

A number of Commissioners made a joint statement on the matter.

“We welcome the publication of these self-assessments by the signatories to the Code of Practice on the implementation of their commitments,” they said. “In particular, we commend the commitment of the online platforms to become more transparent about their policies and to establish closer cooperation with researchers, fact-checkers and Member States.

“However, progress varies a lot between signatories and the reports provide little insight on the actual impact of the self-regulatory measures taken over the past year as well as mechanisms for independent scrutiny,” they added.

They said that while the 2019 European Parliament elections in May were clearly not free from disinformation, the actions and the monthly reporting ahead of the elections contributed to limiting the space for interference and improving the integrity of services.

“Still, large-scale automated propaganda and disinformation persist and there is more work to be done under all areas of the Code. We cannot accept this as a new normal,” they said. “While the efforts of online platforms and fact-checkers can reduce harmful virality through platforms’ services, there is still an urgent need for online platforms to establish a meaningful cooperation with a wider range of trusted and independent organisations. Access to data provided so far still does not correspond to the needs of independent researchers.”

The European Commission is currently drawing up regulations known as the Digital Services Act.

This will set out liability and safety rules for digital platforms, but the tech industry is concerned it could prove a heavy-handed intervention.
