Robocaller spoofing Joe Biden is telling people not to vote in New Hampshire

Robocalls featuring a faked voice of US President Joe Biden are advising voters not to participate in New Hampshire's presidential primary election this Tuesday, the state's Attorney General's Office warned on Monday.

“These messages appear to be an unlawful attempt to disrupt the New Hampshire presidential Primary Election and to suppress New Hampshire voters,” the AG’s office declared in a statement. “New Hampshire voters should disregard the content of this message entirely.” That’s a reference to the phone calls, not the AG’s statement!

The calls started circulating over the weekend and feature a faked Biden voice saying: “What a bunch of malarkey. Your vote makes a difference in November, not this Tuesday.”

To trick voters into believing it's genuine advice, the voice delivering the message sounds like Biden, and the call claims it was sent by the treasurer of a political committee supporting the president's campaign in the New Hampshire Democratic presidential primary.

But the voice is spoofed, and appears to be artificially generated, officials warned. The state’s Department of Justice Election Law Unit is investigating the robocalls after receiving multiple complaints.

Several states – including California, Texas, Michigan, Washington, and Minnesota – have passed laws forbidding politicians from using deepfakes in election campaigns. The rules, however, are fuzzier when it comes to individuals using AI to create and distribute disinformation.


Tech providers – particularly those in the AI biz – are trying to prepare for the upcoming presidential election and prevent their tools from being misused. OpenAI confirmed to The Register that it had removed an account that was “knowingly violating our API usage policies which disallow political campaigning, or impersonating an individual without consent.”

That account was used by an AI startup named Delphi, which builds ChatGPT bots based on real personalities – in this case mimicking congressman Dean Phillips (D-MN) with a tool called Dean.bot. Phillips is seeking the Democratic Party's presidential nomination in a long-shot campaign.

OpenAI bans developers from using its models to create chatbots impersonating people, or applications that interfere with democratic processes, such as voting.

OpenAI is also currently advertising for an elections program manager, who will help guide the company's efforts to bolster election security across Europe, the Middle East, and Africa.

The role includes identifying election-related risks, and designing, coordinating, and rolling out mitigation strategies, with a salary starting at $190,000. ®
