

Putting responsible humans in the political tech loop in 2025

Photo by Soumil Kumar via Pexels.

December 11, 2024

As political tech evolves, 2025 demands responsible human oversight to safeguard democracies from AI-driven disinformation and geopolitical threats, writes Andrea Bonime-Blanc.

One of the biggest stories of 2024 was that half of the world was going to vote—whether in democracies or autocracies.

A key concern for most democracies—especially more polarized ones like the U.S., Poland, and the UK—was whether their elections would be transparent and free from internal or external interference (including cyberattacks), and thus whether the outcomes could be trusted.

Permeating these concerns was an overarching dread about the deployment of new technologies—especially those infused with Generative AI (GenAI) capabilities—to turbocharge the spread of mis- and disinformation, weaponized social engineering, surveillance, cyber insecurity, and more.

What was a functioning democracy to do in the face of what Putin’s Russia has perfected over many years and coined “political technology”—the art of weaponized disinformation, social engineering, and propaganda?

Thus in 2024, leaders in democracies were hyper-focused on this concern, and for good reason. While the consensus on the U.S. election was that neither traditional fraud nor cyber incidents were widespread or material, the deployment of political technology was rife, unprecedented, and extraordinary—accelerated through the use of dark money, the manipulation of social media platforms such as X, the micro-targeting of specific communities with different, even contradictory, disinformation, and other deceptive communications campaigns.

Moreover, the deployment of political tech and frontier technologies is not confined to domestic politics; it is a vast and growing part of geopolitics. Witness the use of high-altitude spy balloons (China), drones and killer drones (Russia, Ukraine, Israel), remote-controlled exploding pagers (Israel and Lebanon), device surveillance (NSO/Pegasus), and next-generation satellites (Starlink) for all manner of geopolitical engagement and rivalry.

Everything we have seen in the deployment of political technology in 2024 will carry over into 2025 and beyond, and it will grow exponentially.

The thread that must run through all of this? We need to place responsible and technologically expert humans into the political technology loop. GenAI has turbocharged political technology, and it will certainly be used by those seeking to disrupt or destroy democracies by weakening their pillars (e.g., the rule of law and independent media) and by creating or worsening geopolitical tensions and wars. Those of us still living in democracies (as imperfect as they may be) must step into the breach and, in a concerted manner, construct and reconstruct our information environments to increase their trustworthiness. This includes being fast, expert, and nimble in intercepting, pre-bunking, and debunking information toxicity enabled by political technology. Now would be a good time to start.

About Andrea Bonime-Blanc: Dr. Andrea Bonime-Blanc is the Founder and CEO of GEC Risk Advisory, a board advisor and director, and author of multiple books.
The views presented in this article are the author’s own and do not necessarily represent the views of any other organization.