Communications of the ACM

ACM News

The FTC Wants Your Help Fighting AI Vocal Cloning Scams

Sound-level visualization of an audio clip.

The U.S. Federal Trade Commission is soliciting the best ideas for keeping up with tech-savvy con artists.

Credit: Deposit Photos

The Federal Trade Commission is on the hunt for creative ideas to tackle one of scam artists' most cutting-edge tools, and will dole out as much as $25,000 for the most promising pitch. First announced last fall, the FTC's Voice Cloning Challenge is now officially open for submissions. The contest seeks ideas for "preventing, monitoring, and evaluating malicious" abuses of AI voice cloning.

Artificial intelligence's ability to analyze and imitate human voices is advancing at a breakneck pace—deepfaked audio already appears capable of fooling as many as one in four unsuspecting listeners into believing a cloned voice is real. While the technology shows immense promise in scenarios such as restoring natural-sounding speech to patients with vocal impairments, scammers can turn the very same programs to their own gain. In April 2023, for example, con artists targeted a mother in Arizona for ransom, using AI audio deepfakes to fabricate her daughter's kidnapping. AI imitations also present a host of potential problems for creative professionals such as musicians and actors, whose livelihoods could be threatened by comparatively cheap imitations.

From Popular Science

