
US election officials on alert for AI robocalls

WHILE fake videos of Democratic candidate Kamala Harris spread on social media without capturing much interest, state officials are girding for what they consider a far more dangerous deception days before the United States presidential election: deepfake robocalls.

Officials in states from Arizona to Vermont are preparing for fake audio messages piped directly to home and mobile phones and out of public view, a concern exacerbated by rapidly advancing generative AI technology.

And unlike AI-generated photos and videos, which often have small, telltale signs of manipulation, such as an extra finger on a person's hand, it is more difficult for the average voter to spot a fake phone call, experts said.

Ahead of the Nov 5 election that pits Harris against Republican Donald Trump, election officials are on alert given early examples of such calls.

In January, a robocall impersonating US President Joe Biden circulated in New Hampshire, urging Democrats to stay home during the primary and "save your vote for the November election".

The political consultant behind the robocall was fined US$6 million in September.

Colorado Secretary of State Jena Griswold said: "We've seen examples of audio deepfakes. It's not something that is this imaginary technology. It's here."

Audio is most concerning because it is difficult to track and verify, said Amy Cohen, executive director of the National Association of State Election Directors, a nonpartisan professional organisation for election directors.

"Even without AI, every election official spends hours chasing their tails because of robocalls."

That's because investigating robocalls — automated calls delivering a recorded message — depends on people hearing the call correctly, recognising the call is fake and then reporting it to authorities.

Rarely do election officials receive a recording of the robocall, Cohen added.

To prepare, election directors have considered potential scenarios in training sessions and discussions throughout the year.

To arm themselves, officials are using old-school strategies.

In Colorado, election officials have considered how to react if they themselves are targeted with deepfake calls.

For example, what should officials do if they receive a call with a voice that sounds like Griswold's, instructing them to alter voting hours at polling locations?

Griswold said she has instructed officials to hang up and call her office if they suspect anything out of the ordinary.

"The issue with AI technology is that we need to train ourselves to not believe our eyes and ears."

Another tactic is more commonly seen in spy novels: election officials can agree on a secret code word with their colleagues as an added measure to verify identities over the phone, Cohen said.

State officials say they are particularly worried about false information spreading just days before the vote, leaving them with little time to respond.

When thousands of New Hampshire residents received the purported call in January from "Biden" urging them not to vote, Secretary of State David Scanlan said his office sprang into action.

The state attorney-general and law enforcement officials issued a statement about the fake call, prompting coverage on local radio and television.

And while there was no indication that the fake Biden call swayed any voters, the incident showed that officials need to be prepared for new risks emerging from the advent of AI.

*The writer is from Reuters


The views expressed in this article are the author's own and do not necessarily reflect those of the New Straits Times
