A new study claims Instagram recommended false information about COVID-19 vaccines to users who appeared interested in related topics.
The study was compiled by the Center for Countering Digital Hate (CCDH) and detailed in a Tuesday report by NPR.
The research also found Instagram had highlighted election misinformation and conspiracy theories to some users.
"The Instagram algorithm is driving people further and further into their own realities, but also splitting those realities apart so that some people are getting no misinformation whatsoever and some people are being driven more and more misinformation," said Imran Ahmed, CEO of the Center for Countering Digital Hate, which conducted the study.
From September through November 2020, Instagram recommended more than 100 posts containing misinformation to 15 profiles set up by CCDH, according to NPR. The recommendations appeared in several spots on Instagram, including as "suggested posts" on the individual profiles.
Of the 104 posts containing misinformation, 57.7% dealt with the coronavirus, 21.2% targeted vaccines, 12.5% centered on the election, and 8.7% involved QAnon, the report said.
Once Instagram users have viewed everything recently posted by accounts they follow, they now also see posts from accounts they don't follow at the bottom of their feeds, NPR noted.
"Putting it into the timeline is really powerful," Ahmed said. "Most people wouldn't realize they're being fed information from accounts they're not following."
But Facebook spokesperson Raki Wane said: "This research is five months out of date and uses an extremely small sample size of just 104 posts. This is in stark contrast to the 12 million pieces of harmful misinformation related to vaccines and COVID-19 we've removed from Facebook and Instagram since the start of the pandemic."
Jeffrey Rodack
Jeffrey Rodack, who has nearly a half century in news as a senior editor and city editor for national and local publications, has covered politics for Newsmax for nearly seven years.