Tech companies must do more to clamp down on fake news amid a surge of disinformation during the global coronavirus pandemic, says EU
The European Commission has once again urged online platforms and the advertising sector to do more to tackle the scourge of fake news.
It comes after the Commission in October 2019 said that tech giants needed to do more to tackle fake news, after they had submitted self-assessment reports.
The Commission remains particularly concerned that fake news and disinformation are still a pressing issue, two years after tech firms signed a voluntary agreement to clamp down on them.
Code of Practice
Those self-assessments last October were part of the agreement that tech firms such as Google, Facebook and Twitter had committed to when they signed the EU Code of Practice against disinformation in October 2018.
All of these firms were asked to report monthly on their actions.
In February 2019 the Commission was forced to rebuke Facebook, Google and Twitter over their efforts to crack down on fake news.
It comes after widespread concern that online platforms are being used to spread disinformation.
These tech firms have been asked to tighten up on ad placement, transparency of political advertising, closure of fake accounts, and identifying automated bots.
And two years after this agreement, the European Commission said it still remains concerned at the level of disinformation being spread, when it presented an assessment of the implementation and effectiveness of the ‘Code of Practice on Disinformation.’
“The assessment shows that the Code has proven a very valuable instrument, the first one of its kind worldwide, and has provided a framework for a structured dialogue between relevant stakeholders to ensure greater transparency of platforms’ policies against disinformation within the EU,” it said. “At the same time, the assessment highlights certain shortcomings mainly due to the Code’s self-regulatory nature.”
“The Code of Practice has shown that online platforms and the advertising sector can do a lot to counter disinformation when they are put under public scrutiny,” said Věra Jourová, VP for Values and Transparency.
“But platforms need to be more accountable and responsible; they need to become more transparent,” said Jourová.
Another European official also weighed in on the matter.
“Organising and securing our digital information space has become a priority,” said Thierry Breton, Commissioner for the Internal Market. “The Code is a clear example of how public institutions can work more efficiently with tech companies to bring real benefits to our society. It is a unique tool for Europe to be assertive in the defence of its interests and values. Fighting disinformation is a shared responsibility, which the tech and advertising sector must fully assume.”
The European Commission said the Code had brought positive outcomes over the past 12 months, namely increased accountability from platforms and greater public scrutiny of the measures taken by the signatories to counter disinformation within the EU.
But the European Commission said that the quality of the information disclosed by the Code’s signatories is still insufficient and shortcomings limit the effectiveness of the Code.
These shortcomings include the absence of relevant key performance indicators (KPIs) to assess the effectiveness of platforms’ policies to counter the phenomenon; the lack of clearer procedures, commonly shared definitions and more precise commitments; the lack of access to data allowing for an independent evaluation of emerging trends and threats posed by online disinformation; the absence of structured cooperation between platforms and the research community; and the need to involve other relevant stakeholders, in particular from the advertising sector.