Facebook has confirmed it has temporarily stopped recommendations for all political groups and all new groups ahead of the US presidential election taking place on Tuesday.
The move is part of a broader effort by the company to curb misinformation around the election.
Other measures include barring new political advertisements in the week before the election and all political ads for at least a week after it.
Facebook chief executive Mark Zuckerberg referred to the policy around groups during a Senate hearing last week.
“We have taken the steps to stop recommendations in groups for all political content or social issue groups as a precaution,” Zuckerberg said at the time.
Advocacy groups had pressured Facebook to place limits on algorithmic group recommendations ahead of the elections, saying some groups have been used to spread misinformation and organise extremist activity.
Facebook declined to specify when the changes had been made or how long they would last.
“We will assess when to lift them afterwards, but they are temporary,” the company told Reuters.
Facebook also said that as of Friday it was placing limits on the “save our children” hashtag because of its associations with the QAnon conspiracy theory.
The company said people were still able to use the hashtag but clicking on it would not bring up other instances of its use, while a search for the hashtag would display “credible child safety resources”.
Facebook’s policy on limiting political ads before and after the election caused a furore last week when it emerged that the company had mistakenly barred ads from the Trump and Biden campaigns that had been running on the site before the Tuesday deadline.
The company’s filters were apparently overwhelmed by a last-minute surge in political ads, leading to glitches.
In one case, ads for the Trump campaign apparently urged people to vote on the wrong day: messages reading “Vote Today!” and “Election Day is Today!” went out on Tuesday, a week before the election.
They were seen by about 200,000 voters, mostly in critical swing states, before Facebook took them down for violating its misinformation policies, the Financial Times reported.
It was unclear whether the ads were intended to spread incorrect information or if they had been registered ahead of time with the intent of displaying them a week later.
The Trump campaign did not respond to a request for comment on the matter.