It's Happening: Fake Biden Robocalls

Will this influence the election?

Sponsored by

Your info is on the dark web

Every day, data brokers profit from your sensitive info—phone number, DOB, SSN—selling it to the highest bidder. And who’s buying it? Best case: companies target you with ads. Worst case: scammers and identity thieves.

It's time you check out Incogni. It scrubs your personal data from the web, confronting the world’s data brokers on your behalf. And unlike other services, Incogni helps remove your sensitive information from all broker types, including those tricky People Search Sites.

Help protect yourself from identity theft, spam calls, and health insurers raising your rates. Plus, just for our readers: Get 55% off Incogni using code PRIVACY.

Joe Biden tweeting on a mobile phone

We like to write about how AI impacts real people in the real world. So, let’s talk about elections and AI's role in this next big one in the US.

With AI companies and social networks gearing up and releasing statements on how they plan to tackle the rise of AI-generated misinformation, this week's story arrives via a decidedly less techy channel: phone calls.

On January 22nd, the New Hampshire Department of Justice released a statement informing the public that people had received calls from a recorded audio deepfake of Joe Biden.

This means when they answered the robocall, they heard what sounded like a prerecorded message to voters from President Joe Biden himself. In fact, the audio was entirely deepfaked and generated with AI.

Voters were encouraged by the fake Joe Biden not to vote in the state primary election. They heard him encourage them to “save” their vote, telling them, “Your vote makes a difference in November, not this Tuesday.”

Voters can and should vote in the primary and November elections.

What's remarkable about this story is how quickly the industry was able to determine who was behind the robocalls and how the audio had been made. A voice-fraud detection company called Pindrop Security determined that the fake Biden robocall was created using ElevenLabs.

ElevenLabs is an AI voice generator that can add human-like inflexion to AI voices based on context. You can use thousands of their pre-made AI voices to read whatever you want aloud and in whatever tone you wish to hear. You can also use their platform to create custom voices.

Once ElevenLabs was made aware of the Biden deepfake, it investigated and suspended the account that created it. But it’s unclear what the legal repercussions for something like this might be.

What stands out to us about this story is how easy it appears to have been for one user to create such startlingly realistic election misinformation with the voice of the current president.

If anyone can use simple, affordable software to produce things like this quickly, and if everyone else’s parents believe the things they see on Facebook as much as ours do… what’s going to happen when serious players decide to wade in and push out AI-generated misinformation?

We’ve seen how much countries like to meddle in elections, creating hundreds of thousands of fake profiles and user accounts online—posting and sharing misinformation and stoking the fires of division. How will we cope when AI misinformation gets so realistic and frequent that we cannot tell what’s real?

Half the videos of Trump or Biden we see every week could just as believably be AI-generated. We’ll have to keep an eye on how things go this year.

