Facebook commissioned a study in 2018 that found its own algorithms sow division among its users, but the company brushed the findings aside, The Wall Street Journal reported.
"Our algorithms exploit the human brain's attraction to divisiveness," a slide from a 2018 presentation read. "If left unchecked," it said, Facebook's feed would give users "more and more divisive content in an effort to gain user attention & increase time on the platform."
Although the study was conducted after founder and CEO Mark Zuckerberg had publicly expressed concerns about "sensationalism and polarization" on the social media platform, he and other senior executives later "weakened or blocked efforts to apply its conclusions to Facebook products," the Journal reported.
Additionally, they worried that the proposed changes would have affected conservative users and publishers at a time when Facebook was facing accusations of political bias from the right, the Journal noted.
While Pew research shows Americans had been pulling apart on political and social issues long before the advent of social media, the Journal noted that there was concern at Facebook about the role the company was playing in deepening the divide.
"There was this soul-searching period after 2016 that seemed to me this period of really sincere, 'Oh man, what if we really did mess up the world?'" Eli Pariser, co-director of Civic Signals, who has talked to Facebook officials about the issue, told the Journal.
But after March 2018, when it was revealed that Cambridge Analytica had obtained Facebook data about tens of millions of users, "the internal pendulum swung really hard to 'the media hates us no matter what we do, so let's just batten down the hatches,'" Pariser said.
At this point, Facebook has taken the exact opposite tack from the one it took when the study was commissioned. In January, Zuckerberg said he would stand up "against those who say that new types of communities forming on social media are dividing us."