Putin's New Russian AI Disinformation Tool

Could this impact the election?

When discussing AI, we often get stuck in conversations about how it will impact us in our daily lives—conversations around copyright law, losing our jobs, and what is actually considered art. Sometimes we lose sight of the fact that these AI tools are also being weaponized by governments, with major implications for what the future might hold.

Security researchers have uncovered a major Russian disinformation campaign, dubbed "CopyCop," that uses AI to manipulate content from major news organizations.

While disinformation campaigns aren’t unique to Russia, ones using AI are relatively new and surprisingly effective. CopyCop employs AI to clone and subtly alter articles from trusted sources like Al-Jazeera, Fox News, and the BBC, injecting political bias to tailor these stories for specific audiences. 

This AI-driven network doesn’t just steal and tweak existing articles—it also creates spoofed versions of legitimate sites, altering a site’s name just enough that you won’t instantly recognize you’re looking at a fake.

So, while you think you're on the actual BBC website, you're actually on a Russian clone. Beyond that, the network has created entirely fictitious news outlets like the London Crier.

But Why?

Well, to gain support for—or deepen division around—topics of great importance to the Kremlin. The network has recently been shown spreading conflicting reports about the Israel-Hamas conflict. As you would expect, some articles aim to undermine support for Ukraine. And it wouldn’t be a Russian propaganda campaign if it didn’t try to make people living in democracies overly critical of their own governments.

The scary part is that these stories aren't always obviously fake. Yes, they have produced articles claiming that the UK government criminalizes Islam or plans a NATO buffer zone around Ukraine. You can dispel most of those with a quick Google search. But some reports are altered just subtly enough to sow conflict even among well-read individuals. While the campaign is trying to influence people, it also serves the larger goal of getting people to distrust all media. Because if you don’t know who is telling the truth, does the truth really matter?

Clément Briens, a threat intelligence analyst, warns that CopyCop's use of AI achieves unprecedented reach and effectiveness in swaying public opinion. This isn’t old-school KGB misinformation; it’s happening in real time, a thousand times a month.

If successful, CopyCop could inspire similar operations in the future, posing significant threats to Western democracy and to trust in mainstream media. Combating this requires closer collaboration between governments, tech companies, and civil society to counter the spread of disinformation online. That is a tall order, considering most of us will inevitably be fooled by at least one of these articles.

In a world of fake news, these threats are very real.

Rap beef with AI voices is fun.

Propaganda campaigns aimed at eroding public trust in institutions? Less fun.
