Combatting War-Fueling Disinformation on Ukraine’s Digital Battlefield
May 26, 2022
Amid Russia's physical invasion, Ukraine has effectively won the information war, preempting and debunking the same Russian disinformation that disrupted democracies in Europe and the U.S. We can learn from Ukraine's successes, write CollaborateUp's Richard Crespin and Caroline Logan.
As war engulfed Ukraine, Mykhailo Fedorov, Ukraine’s Digital Transformation Minister, led an army of information technology volunteers to counter Russian misinformation on the cyber-battlefield. The 31-year-old minister recruited hundreds of thousands of civilian volunteers to use simple tools on their phones and the internet to anticipate and undermine Russian disinformation.
Fresh off its successes disrupting democracy in the EU, UK, and U.S., Russia came into the war confident in its digital disinformation playbook. But Fedorov’s digital recruits have proven more than up to the task of countering it. As told by Time magazine correspondent Vera Bergengruen on a recent episode of NPR’s Fresh Air, Ukraine’s social media campaign has successfully pre-bunked and debunked Russian misinformation. It has also united Ukrainians and garnered support around the globe in the face of the atrocities Russia has committed on Ukrainian territory.
Some of the tactics are similar to the recommendations we put forth in our report, which was published last year before the invasion. In particular, five of the report’s recommendations offer a potential checklist for combatting disinformation in wartime and beyond:
- Know the difference between misinformation and disinformation so that you can know your enemy. Misinformation is inaccurate information. Disinformation is inaccurate information deliberately spread. This distinction matters because it helps identify the four major players on the digital battlefield: the “bad guys” deliberately spreading disinformation; the majority of us, “passive consumers,” just reading or watching information; the “unwitting accomplices” who unknowingly pass on disinformation, often for benign reasons; and the “good guys” actively trying to stop the bad guys. The #1 job in Ukraine has been to fight the bad guys and, to do so, the Ministry of Digital Transformation has armed and empowered the good guys.
- Arming and empowering the good guys with widely accessible resources. It’s critical to put resources and support tools at the disposal of a great number of users to counter the spread of disinformation at scale. Fedorov’s team has developed guidelines that enable anyone with a cell phone or internet access to counter the massive amount of computational propaganda Russia has unleashed. The Ukrainians have armed the “good guys” with smart tools and tactics accessible to a critical mass. For example, the interview with Bergengruen highlights how the Ukrainian government repurposed Telegram bots originally used for basic customer-service functions, such as registering for a driver’s license or a vaccine appointment, to allow ordinary citizens to report Russian Army movements.
A few key additional tips to arm the good guys:
- Know your users. Ukrainians don’t really use Twitter, preferring Telegram instead. The Ministry chose to deploy domestic services (including the bots to report Russian troop movements) on Telegram but posts on Twitter in English to reach American and European audiences.
- Recruit and recognize champions. When users first started reporting Russian troop movements on Telegram, the Defense Ministry thanked them on the platform, crediting them for a successful counterattack.
- Updating the limits of the free distribution of speech. You’re entitled to the free expression of your opinion but not to its free distribution. Editors and publishers have long exercised editorial control over what goes out on their platforms. Big Tech platforms have shown justified caution in taking on this responsibility, but they do need to exercise judgment. We have seen this recommendation play out as Ukraine’s Digital Ministry successfully pressured YouTube, Twitter, and others to remove RT, Russia’s state-controlled English-language news channel, from their platforms.
- Focus on fact production (not correction). Our research revealed how the bad guys—especially those sponsored by the Russian state—have switched tactics. In the past they used propaganda to try to pressure people into taking a particular action. A few years ago, they switched to “flooding the zone” with confusing, disorienting, and often contradictory information with the explicit goal of eroding readers’ trust in traditional sources of news and information. They wanted us all to shut down and exit the civic space. This worked exceedingly well in Russia itself, where many citizens have simply chosen not to pay attention to politics anymore. But the Ukrainians have shown how this blunt-force approach can be countered with agile, flexible tactics including:
- Leveling the playing field by reinforcing social norms to encourage sharing verified information and discourage the sharing of unverified information on both “open” and “closed” platforms.
- Pre-bunking is better than debunking. Correcting a message after it enters the digital space often backfires by unintentionally reinforcing the very message it sought to discredit. Instead, Ukrainians have focused on the wider fact production ecosystem. Bergengruen explains one of their successful tactics as “pre-bunking,” getting ahead of anticipated propaganda before readers have the chance to come across it. Pre-bunking anticipates potential lies, tactics, or sources before they strike. While it can’t always prevent disinformation, it does give readers the mental blueprint to identify disinformation when they see it. Similar to how vaccines train your immune response against a virus, knowing more about the nature of disinformation has helped Ukrainians, as well as an international audience, to more easily dismiss it.
- Investing in longitudinal studies on policy effectiveness. We need a rigorous evaluation of different policy approaches to contribute to effective implementation and policy harmony over time. In the NPR Fresh Air podcast, Bergengruen notes the importance of investing in follow-on studies (using Ukraine as a case study), with the goal of revealing what methods worked in the fight against disinformation and codifying these best practices for future use once the conflict ends.
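The repurposed report-intake bots described above suggest a simple design: accept a short, structured message from any citizen, validate it, and pass anything parseable along for human verification. Below is a minimal, hypothetical sketch in Python; the message format, function names, and reply text are all illustrative assumptions, not the actual implementation used by Ukraine’s ministries.

```python
# Hypothetical sketch of a Telegram report-intake handler. The "equipment;
# count; location" format and every name below are illustrative assumptions.
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class SightingReport:
    equipment: str        # e.g. "tanks"
    count: Optional[int]  # None when the sender can't give a number
    location: str         # free-text place name from the sender

def parse_report(text: str) -> Optional[SightingReport]:
    """Parse a message like 'tanks; 4; near Bucha' into a structured report.

    Returns None for messages that don't match, so the bot can reply with
    usage instructions instead of forwarding noise to human reviewers.
    """
    parts = [p.strip() for p in text.split(";")]
    if len(parts) != 3 or not parts[0] or not parts[2]:
        return None
    count = int(parts[1]) if re.fullmatch(r"\d+", parts[1]) else None
    return SightingReport(equipment=parts[0].lower(), count=count, location=parts[2])

def handle_update(update: dict) -> str:
    """Turn one incoming Telegram update (a plain dict here) into a reply.

    In a real bot this would be registered as a message handler (e.g. via the
    python-telegram-bot library) and parsed reports queued for verification.
    """
    text = update.get("message", {}).get("text", "")
    report = parse_report(text)
    if report is None:
        return "Format: equipment; count; location (e.g. 'tanks; 4; near Bucha')"
    shown = "unknown" if report.count is None else report.count
    return f"Received: {shown} {report.equipment} at {report.location}"
```

The design point is the one the article highlights: lowering the barrier for ordinary users to a single short message while filtering obvious noise at intake; in practice, such reports were reportedly routed through human moderators before reaching the military.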
While Russia and its allies have weaponized disinformation in Ukraine, the spread of mis- and disinformation is not new. What is new: the channels and the speed at which misinformation can flow. The technologies and platforms that connect billions of people around the world enable the exponential creation and dissemination of more sophisticated and dangerous forms of distortion than ever before. The scope and scale of the threat posed by disinformation have also grown exponentially in politics, health, the environment, technology, and other critical areas of society.