OPINION
When the Russians invaded Crimea in 2014, Igor Dobrovolsky, a Ukrainian digital marketing specialist and designer, noticed that social media was becoming a major source of information in his country.
I recently had the chance to interview him. "The problem was," he explained, "that there was a tremendous amount of disinformation, and we could always trace it back to Russia."
Looking into the matter further, he and his team saw that Russia had been creating large "troll factories" where individuals, known as "trolls," create and disseminate misleading content.
The troll factories were fire-hosing their pro-Russia, anti-Ukraine messaging across social media. Dobrovolsky could see that these efforts were sowing discord and manipulating political outcomes.
He learned that a large Russian troll factory can employ hundreds or, in some cases, even a thousand individuals. Troll factory employees may include content creators, social media operators, graphic designers, data analysts, and administrative staff. They work in shifts to maintain a continuous online presence across various platforms and time zones.
Knowing about the troll factories, he and some close friends began to monitor the situation more closely. He discovered part of their method of operating in Ukraine: they would study what was going on in the country and look for issues where people disagreed.
A whole team of Russians would then use fake internet accounts to do everything they could to get people on both sides riled up, with the goal of undermining what was once the country’s social cohesion.
To make it easy for Westerners to understand how this works, Dobrovolsky gave me a hypothetical case involving New York City:
"Let’s say there’s a person in Saint Petersburg," Dobrovolsky begins.
"We’ll call him Ivan, and he’s a team leader for an operation designed to sow discord in New York City. He’s been spending months on the internet, monitoring everything he can about what’s going on in New York. He’s a social scientist, and he knows what to look for."
Dobrovolsky continues with his hypothetical example. "Let’s say there’s been a recent case of violence in the New York subways. Ivan now knows he has a good subject to work with because large groups of individuals will care about the issue. He also knows that many of these groups hold opposing views on how to deal with the problem."
Dobrovolsky explains that Ivan is now off and running with a new influence operation. Ivan knows which affinity groups will care about the subways, including their members' ages, gender, social status, and educational level, and whether their preferred solutions are pro- or anti-police.
With this information, he knows exactly what messages to use when targeting the members of the groups.
Fortunately for Ivan, although not for New Yorkers, he and his team have had years to create thousands of false Facebook accounts.
They’ve also penetrated the other relevant social media outlets.
Ivan and his colleagues begin creating false social media messages. They'll post something along the lines of, "This is an outrage! The mainstream media says a child was shot, but actually, three are dead and two more aren't expected to survive. The mayor is soft on crime! We need to vote him out!"
One of Ivan's teammates will instantly chime in on Facebook with, "Yes, don't trust the media, it's worse than they say!"
Ivan's army of trolls will multiply the message across many platforms. The trolls will also penetrate anti-law-enforcement social media groups and focus on inflaming their reactions. In this case, their efforts may include fabricating examples of outrageous police brutality on the subways.
The made-up examples on either side are carefully designed by experts in persuasion to get a "Hey, did you see this?" reaction from readers, who then pass on the misinformation. It goes viral. The trolls have taken an existing conflict, inflamed it, and made it explosive.
They’ve deepened societal divisions, eroded trust in public institutions, and created a climate of confusion and mistrust among the public. They’ve undermined social cohesion and the democratic processes.
Dobrovolsky knows how Russian troll farms operate. He’d like the West to return the favor and target the Russian information space. However, as he points out, the West won’t need to lie when trying to influence Russian thinking.
Targeting the Russian information space won't be easy.
Obstacles include the fact that Facebook and many other social media outlets common in the West are currently banned in Russia. However, YouTube is still active, and at least some Russians know how to access other social media by using a Virtual Private Network (VPN) that hides their identity and where they’re from.
Dobrovolsky wants to work with people in the West to craft messages that the Russians who receive them would find persuasive.
He hopes that Westerners who want to do this will use Ukrainians for the job, since for many Ukrainians, Russian is their first language, and they are deeply familiar with Russian ways of thought.
He feels that the West needs to do more to counteract the falsehoods the Russians are spewing out. For him, this is about more than counteracting lies with truth.
It’s about preserving the fabric of democratic societies. Tyrannical governments flourish by spreading lies. Democratic governments can prevail by telling the truth.
Mitzi Perdue is the wife of the late Frank Perdue (former CEO of Perdue Farms) and a humanitarian. A Harvard graduate, she is a writer, speaker, and author of the award-winning "Relentless, the Story of Mark Victor Hansen," a biography of the Chicken Soup for the Soul co-author.
© 2025 Newsmax. All rights reserved.