By Katie Paul
SAN FRANCISCO (Reuters) – Facebook Inc said on Thursday it would start promoting content from public Facebook Groups in users’ news feeds and in search engine results, defying calls by watchdog groups to limit algorithmic recommendations ahead of U.S. elections next month.
In a blog post, Facebook app chief Fidji Simo said the company would surface recommendations in news feeds by showing “related discussions” from public groups next to a link or post. Facebook Groups are communities that form around shared interests, and public groups’ posts can be viewed by any user.
Simo said Facebook would also start showing conversations from public groups outside the app, such as in web searches.
Facebook declined to specify when the changes would be rolled out, with a spokesman saying only that U.S. tests would begin “in a very limited capacity” in the coming weeks.
The company, which made groups a strategic priority last year, also said it was adding new moderation tools for the more than 70 million people running those communities to help them reject posts, lead discussions and arrange paid sponsorships.
According to an internal memo seen by The Verge (https://www.theverge.com/21444203/facebook-leaked-audio-zuckerberg-trump-pandemic-blm), the groups push this year has been a success for engagement on the Facebook app, with posts up 31.9% year over year as of August.
At the same time, the company has faced a series of scandals over extremist activity within groups, including the spread of thousands of Boogaloo, QAnon and militia groups and communities promoting health misinformation.
Facebook has gradually removed some such groups and limited recommendations to others.
Its new groups push comes days after Accountable Tech and a dozen other advocacy organizations called on Facebook to pause its use of group recommendations altogether until the results of the Nov. 3 elections are formally certified.
The coalition accused Facebook of prioritizing groups despite red flags from researchers, warning the tools have become “hidden breeding grounds for disinformation campaigns and organizing platforms for extremists.”
Adam Conner, a former Facebook executive who now leads technology policy at the left-leaning Center for American Progress, said the company’s decision risked accelerating those dynamics.
“Giving content from those groups wider distribution and easier discovery is a dangerous choice right before a contentious election during a pandemic,” said Conner.
(Reporting by Katie Paul; Editing by Cynthia Osterman)