Word war: In Russia-Ukraine war, information became a weapon
WASHINGTON (AP) — Russia’s invasion of Ukraine is the deadliest conflict in Europe since World War II, and the first to see algorithms and TikTok videos deployed alongside fighter planes and tanks.
The online fight has played out on computer screens and smartphones around the globe as Russia used disinformation, propaganda and conspiracy theories to justify its invasion, silence domestic opposition and sow discord among its adversaries.
Now in its second year, the war is likely to spawn even more disinformation as Russia looks to break the will of Ukraine and its allies.
“The natural question is: What’s next to come? We know that Russia is preparing for a protracted conflict,” said Samantha Lewis, a manager of strategic geopolitics with the cybersecurity firm Recorded Future. “Ukrainian morale is almost certainly a key target of Russian psychological operations. And there’s the risk of international complacency.”
A look at Russia’s disinformation war since the conflict began:
DIVIDE AND CONQUER
The Kremlin’s propaganda efforts against Ukraine began many years ago and increased sharply in the months immediately before the invasion, according to Ksenia Iliuk, a Ukrainian disinformation expert who has tracked Russia’s information operations.
Russia tailored the messages for specific audiences around the world.
In Eastern Europe, Russia spread baseless rumors of Ukrainian refugees committing crimes or taking local jobs. In Western Europe, the message was that corrupt Ukrainian leaders couldn’t be trusted, and that a long war could escalate or lead to higher food and oil prices.
In Latin America, Russia’s local embassies spread Spanish-language claims suggesting its invasion of Ukraine was a struggle against Western imperialism. Similar messages accusing the U.S. of hypocrisy and belligerence were spread in Asia, Africa and other parts of the world with a history of colonialism.
Russia’s information agencies flooded Ukraine with propaganda, calling its military weak and its leaders ineffective and corrupt. But if the message was intended to reduce resistance to the invaders, it backfired in the face of Ukrainian defiance, Iliuk said.
“Russian propaganda has been failing in Ukraine,” she said. “Russian propaganda and disinformation are indeed a threat and can be very sophisticated. But it’s not always working. It’s not always finding an audience.”
BLAME THE VICTIM
Many of Russia’s fabrications try to justify the invasion or blame others for atrocities carried out by its forces.
After Russian soldiers tortured and executed civilians in Bucha last spring, images of charred corpses and people shot at close range horrified the world. Russian state TV, however, claimed the corpses were actors, and that the devastation was faked. Associated Press journalists saw the bodies themselves.
Russia initially celebrated a missile strike on a rail station in the Ukrainian town of Kramatorsk, until reports of civilian casualties surfaced. Suddenly Russian news outlets were insisting the missile wasn’t theirs.
“When they realized that civilians were killed and injured, they changed the messaging, trying to promote the idea that it was a Ukrainian missile,” said Roman Osadchuk, a research associate at the Atlantic Council’s Digital Forensic Research Lab, which has tracked Russian disinformation since before the war began.
One of the most popular conspiracy theories about the war also spread with Russian help. According to the claim, the U.S. runs a series of secret germ warfare labs in Ukraine — labs conducting work dangerous enough to justify Russia’s invasion.
Like many conspiracy theories, the hoax is rooted in some truth. The U.S. has funded biological research in Ukraine, but the labs are not owned by the U.S., and their existence is far from secret.
The work is part of an initiative called the Biological Threat Reduction Program, which aims to reduce the likelihood of deadly outbreaks, whether natural or manmade. The U.S. efforts date back to work in the 1990s to dismantle the former Soviet Union’s program for weapons of mass destruction.
EVADING THE BANS
As European governments and U.S.-based tech companies looked for ways to turn off the Kremlin’s propaganda megaphone, Russia found new ways to get its message out.
Early in the war, Russia relied heavily on state media outlets like RT and Sputnik to spread pro-Russian talking points as well as false claims about the conflict.
Platforms like Facebook and Twitter responded by adding labels to the accounts of Russian state media and government officials. When the European Union called for a ban on Russian state media, YouTube blocked the channels of RT and Sputnik. TikTok, owned by the Chinese company ByteDance, did the same.
Russia then pivoted to its diplomats, who have used their Twitter and Facebook accounts to spread false narratives about the war and Russian atrocities. Many platforms are reluctant to censor or suspend diplomatic accounts, giving ambassadors an added layer of protection.
After its state media was muzzled, Russia expanded its use of networks of fake social media accounts. It also evaded bans on its accounts by stripping identifying features — such as RT’s logo — from videos before reposting them.
Some efforts were sophisticated, like a sprawling network of fake accounts that linked to websites created to look like real German and British newspapers. Meta identified and removed that network from its platforms last fall.
Others were far cruder, employing fake accounts that were easily spotted before they could even attract a following.
“These campaigns resembled smash-and-grab operations that used thousands of fake accounts,” Nick Clegg, Meta’s president of global affairs, told reporters on a conference call Wednesday. “This covert activity is aggressive and it is persistent.”
GETTING AHEAD OF THE CLAIMS
Ukraine and its allies scored early victories in the information war by predicting Russia’s next moves and by revealing them publicly.
Weeks before the war, U.S. intelligence officials learned that Russia planned to carry out an attack that it would blame on Ukraine as a pretext for invasion. Instead of withholding the information, the government publicized it as a way to disrupt Russia’s plans.
By “prebutting” Russia’s claims, the U.S. and its allies were attempting to blunt the impact of disinformation. The next month, the White House did it again when it disclosed suspicions that Russia might seek to blame a chemical or biological attack on Ukraine.
The invasion prompted tech companies to try new strategies, too. Google, the owner of YouTube, launched a pilot program in Eastern Europe designed to help internet users detect and avoid misinformation about refugees fleeing the war. The initiative used short online videos that teach people how misinformation can trick the brain.
The project was so successful that Google now plans to roll out a similar campaign in Germany.
Iliuk, the Ukrainian disinformation researcher, said she believes there’s a greater awareness now, a year after the invasion, of the dangers posed by Russian disinformation, and a growing optimism that it can be checked.
“It is very hard, especially when you hear the bombs outside of your window,” she said. “There was this huge realization that this (Russian disinformation) is a threat. That this is something that could literally kill us.”