Thursday, February 22, 2024

FCC declares AI-generated voices in robocalls are illegal

The Federal Communications Commission said Thursday that the use of voice cloning technology in robocalls is illegal, giving states another tool to pursue the scammers behind the calls.

The decision takes effect immediately and comes amid a rise in such calls, driven by technology capable of deceiving people with recordings that imitate the voices of celebrities, political candidates and even close family members.

“Malicious actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, impersonate celebrities, and misinform voters,” said FCC Chairwoman Jessica Rosenworcel. “State attorneys general will now have new tools to crack down on these scams and ensure the public is protected against fraud and misinformation.”

The FCC’s action follows an incident preceding New Hampshire’s presidential primary last month, in which a fake robocall posing as President Biden urged voters not to vote in the primary. An estimated 5,000 to 25,000 calls were made.

New Hampshire Attorney General John Formella said Tuesday that the AI-generated recording, created to sound like the president, has been linked to two Texas companies, and that a criminal investigation is underway.

AI-produced disinformation targeting voters recently prompted two U.S. senators, Minnesota Democrat Amy Klobuchar and Maine Republican Susan Collins, to press the U.S. Election Assistance Commission to act against such disinformation campaigns.

The New Hampshire robocall is just the latest flashpoint for AI-generated images, videos and audio propagated online during an already contentious 2024 campaign cycle.
