In a chilling trend, scammers are increasingly turning to artificial intelligence (AI) tools to clone the voices of their targets on social media. Their nefarious goal? To place panic-inducing calls to victims’ family and friends, all in a bid to convince them to part with their money or divulge sensitive information.
Mike Scheumack, the Chief Innovation Officer at IdentityIQ, a firm specializing in identity theft protection and credit score monitoring, reveals that AI’s infiltration into the world of cybercrime has been swift and sinister. According to Scheumack, the use of AI in cybercrimes has surged dramatically over the past year, with scammers employing advanced phishing and voice cloning techniques.
Voice cloning scams, a particularly disturbing development, involve recording a person’s voice or sourcing an audio clip from social media or the internet. Astonishingly, all it takes is as little as three seconds to create a highly convincing voice clone. These audio samples are then processed through AI programs that replicate the voice, allowing scammers to make it say virtually anything, complete with emotions like laughter and fear, as scripted for the scam.
To illustrate the alarming sophistication of these AI voice cloning programs, IdentityIQ conducted an experiment using an audio sample from a podcast interview. They crafted an AI voice clone for a simulated panicked phone call to a family member, requesting a cash app transfer following a fictional car accident. The result was eerily convincing, with the AI-generated voice pleading for financial help.
Typically, these voice clone calls are brief, designed to generate a sense of urgency in the victim. Scammers may deflect questions with statements like “I can’t talk right now” while pressing for money, account access, or sensitive information. Their ultimate aim is to trigger the victim’s “fight or flight” response, creating an immediate sense of crisis.
Scheumack emphasizes the importance of verification in such situations. He recommends that recipients of suspicious calls hang up and immediately contact their loved ones to confirm their safety. This simple step can thwart the scammers’ attempts to exploit fear and urgency.
One recent case highlights the chilling effectiveness of AI voice cloning. An individual received what she believed to be a panicked call from her daughter at a camp. However, it turned out to be an AI-generated voice clone of her daughter. The scammers had gleaned information from social media to make the call alarmingly convincing.
Notably, fraudsters executing AI voice scams also employ AI programs to scour the internet for information about their targets. They hunt for audio or video content on social media or other platforms, gathering details that can be used to craft more compelling calls to unsuspecting victims.
What makes these scams particularly unsettling is their level of sophistication. Scheumack points out that these aren’t amateur endeavors but the work of well-organized groups. Multiple individuals play distinct roles, from gathering data on social media to cloning voices and executing the scam. It’s a highly coordinated operation that preys on the unwary.
To safeguard against falling victim to AI voice cloning scams, Scheumack offers practical advice. First, individuals should exercise caution when posting personal information online. Second, when receiving a call from an unknown number, especially one claiming to be a loved one in distress, it’s crucial to pause and evaluate the situation. Don’t rush to provide assistance; instead, verify the caller’s identity through a separate channel.
Scheumack also suggests that families consider implementing a verification process involving a unique passphrase. This extra layer of security can help ensure that a caller claiming an emergency is indeed the family member they purport to be.
As the use of AI in cybercrime continues to evolve, individuals must remain vigilant, protecting themselves and their loved ones from the increasingly sophisticated tactics employed by scammers. “Trust, but verify” should be the mantra in this age of AI-driven deception.