The development of AI voice manipulation software has recently improved to a remarkable extent, in both accessibility and quality. While some use AI voice software to make funny videos of U.S. presidents playing video games, and more ill-intentioned users produce convincing misinformation campaigns about notable figures, voice deepfake technology has also been turned on everyday people to manipulate them through intricate, efficient phone call scams.
To understand how these phony phone calls scam people out of thousands of dollars, one must first understand the basics of how the technology works. An article by Voicemod.net breaks down the process of voice cloning: users first feed the software audio recordings of the voice they want to replicate, which are stored in a database for the AI to dissect and analyze. Once enough data has been sent to the cloning program, the software can recreate the voice's manner of speaking, albeit with occasional technical errors. When a sufficiently accurate iteration of the voice is achieved, the user feeds the AI text prompts to read aloud, resulting in a near-perfect simulation of anyone's voice. Websites like Murf.ai, Resemble.ai, and Speechify all operate under similar systems, each with userbases in the thousands. With such a powerful technology offered to the masses so freely, one would think the restrictions pertaining to it would be extremely rigid and specific; however, because the technology is so new, the sheer scope of its influence has not yet registered in the minds of many. At the time of writing, little to no legislation protects people from having their voices unwillingly inserted into these programs' algorithms and databases, putting the voices of potentially millions at risk.
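The enroll-analyze-synthesize workflow described above can be sketched in a few lines of toy code. This is purely a conceptual illustration: the function names and the crude "voice profile" (a single averaged pitch value) are illustrative assumptions, not how Voicemod, Murf.ai, or any real service works internally, since real cloning systems train neural acoustic models on many features.

```python
# Conceptual sketch of the voice-cloning pipeline: enroll sample
# recordings, reduce them to a "voice profile", then synthesize
# arbitrary text in that voice. All names and the single-feature
# profile are illustrative assumptions, not a real TTS system.
from statistics import mean

def enroll(recordings):
    """Store sample recordings (here, lists of toy pitch values in Hz)
    and reduce them to a crude voice profile."""
    samples = [s for rec in recordings for s in rec]
    return {"avg_pitch": mean(samples), "n_samples": len(samples)}

def synthesize(profile, text):
    """Stand-in for reading `text` aloud in the enrolled voice."""
    return f"[voice @ {profile['avg_pitch']:.1f} Hz] {text}"

recordings = [[110.0, 115.0, 108.0], [112.0, 109.0]]
profile = enroll(recordings)
print(synthesize(profile, "Hello, it's me."))
```

The point of the sketch is the data flow, not the audio: the more recordings that are enrolled, the more faithful the profile, which is exactly why publicly posted voice clips make such easy scam material.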
When one considers who might be victimized by the abuse of this technology, the most immediate thoughts tend to be politicians, celebrities, and other figures of note. However, the technology threatens many more than simply the rich and powerful. For many voice actors, it threatens their very livelihood, since people looking to cut costs may simply use voice cloning AI to replicate their voices instead of hiring the actors outright. Even more concerning, AI voice software can threaten everyday people through terrifying scams designed to trick the unsuspecting into surrendering thousands of dollars.
One such instance occurred in Scottsdale, Arizona, where a mother received an ominous phone call of her daughter crying, sobbing, and begging for help. Before the woman had a moment to let the situation sink in, she heard another voice on the line: a man telling her daughter to get off the call, then threatening to drug, rape, and abandon the girl in Mexico if the woman didn't wire him one million dollars. In a panic, the woman had people nearby call her husband and 911, believing her daughter to be in real danger. Her husband confirmed that their daughter was safe upstairs at home, at which point the malicious caller hung up.
Luckily, she had the foresight to have her husband contacted at home to confirm whether the threat was real; otherwise she might have fallen victim to a scam that has amassed hundreds of victims. When interviewed after the incident, she tearfully recalled that she never questioned whether it was her daughter's voice on the other end of the call, saying, "I never doubted for one second it was her. That's the freaky part that really got me to my core." Hers is a fortunate, but uncommon, case of someone who didn't fall victim to this voice cloning ransom scam.
Faced with the difficult reality of how hard it is to distinguish real voices from those generated by cloning software, it can be terrifying to consider that the next person you talk with over the phone may not be a real person at all. However, there are some remedies for avoiding scams like the one described above! For starters, avoid answering calls from unknown numbers. For higher-stakes situations, agree on a key word or phrase in advance to verify a person's identity, so there is always a way to confirm you're talking to the real deal.
As long as people practice basic Internet safety, bringing a healthy amount of skepticism and care to these sorts of claims, these scams will likely grow ineffective, and may disappear entirely once stricter legislation prohibits them from existing in the first place.