Facebook says it is developing ‘topic exclusion’ tools to help advertisers ensure their ads don’t appear alongside hate speech or other harmful content
Facebook said it is developing “topic exclusion” tools to help advertisers control what content is displayed near their advertisements.
The company said it would allow advertisers to select topics such as “Crime and Tragedy” that it didn’t want displayed near their ads.
“That selection would help define how we’ll show the ad on Facebook, including News Feed,” Facebook said in a blog post.
It said development of the tools would take “much of the year”.
The move comes after large advertisers such as Coca-Cola and Starbucks, along with more than 1,100 others, temporarily pulled their ads from Facebook last July over concerns the platform wasn’t doing enough to curb hate speech, misinformation and other harmful content.
The boycott followed the death of George Floyd, a Black American man, in police custody, which prompted civil rights advocates to call on companies to rethink their support for Facebook, arguing the platform was allowing hate speech and misinformation to spread relatively unchecked.
In September Facebook, along with YouTube and Twitter, signed a deal with big advertisers through the Global Alliance for Responsible Media (GARM) to establish standards defining hate speech and other harmful content.
Advertisers do not want to be seen as supporting the spread of such content, and are particularly sensitive about their ads being displayed alongside harmful posts.
Facebook’s News Feed is a particularly sensitive feature of the service, as it can contain user posts that advertisers find objectionable.
The News Feed is also Facebook’s most important ad placement area, helping it to expand its ad revenues to about $84.17 billion (£61bn) in 2020.
Facebook and other large internet platforms are also under pressure from European governments over hate speech and other harmful content.
In December the European Commission introduced proposed legislation that would, among other things, force large platforms to take more responsibility for such content.