“Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities, and misinform voters. We’re putting the fraudsters behind these robocalls on notice,” said FCC Chairwoman Jessica Rosenworcel.
While State Attorneys General can currently target the outcome of an unwanted AI-generated voice robocall, such as the scam or fraud being perpetrated, this action makes the act of using AI to generate the voice in these robocalls itself illegal, expanding the legal avenues through which state law enforcement agencies can hold perpetrators accountable under the law.
In November 2023, the FCC launched a Notice of Inquiry to build a record on how the agency can combat illegal robocalls and how AI might be involved.
The agency asked how AI might be used in junk-call scams, for example by mimicking the voices of people we know, and whether this technology should be subject to oversight under the TCPA. The FCC also asked how AI-driven pattern recognition could be turned into a force for good, identifying illegal robocalls before they ever reach consumers.
The Commission can also take steps to block calls from telephone carriers facilitating illegal robocalls.
The FCC currently has a Memorandum of Understanding with 48 State Attorneys General to work together to combat robocalls.