You may be familiar with scam texts and cold emails, but what happens when it's your child's voice on the other end of the line? The rise of artificial intelligence has brought more than just advanced image generation; it has redefined how scammers operate.
We recently saw a client whose brother posts frequently on social media. A scammer took recordings of his voice and manipulated them to make it sound as if he were in trouble and needed money to get to safety. The scammer then hacked the brother's Facebook account, contacted our client's family through it, and asked them to send large sums of money to get him out of danger. Fortunately, the family was wary and sent nothing. Since this event, they have adopted a code word to use when calling one another, so they can confirm they aren't talking to an artificial voice.

How does this happen?

AI models need only a few seconds of your voice, plus some prompts, to produce a decent clone, and only a few seconds more for a very convincing one. These clips can be taken from social media posts or even your phone's voicemail greeting. People with a large online presence are especially susceptible because of the volume of training audio available.

How can I tell if I'm talking to a voice clone?

While these tools can be difficult to distinguish from real voices, there are a few things to check:

Does it make sense? Is the request out of character, or does the story not add up?
How does it sound? Listen for flat delivery, odd pacing, or unnatural pauses.
What are they saying? Urgent demands for money or secrecy are classic pressure tactics.
If any of these raise a red flag, double-check before acting: hang up and contact the person through another channel, such as a phone number you know is theirs, to verify the voice hasn't been generated or modified by AI.

How do I prevent my voice from being stolen?

Unfortunately, there isn't much you can do if you have a social media presence. You can make your pages private or avoid using your voice in videos you post. You can also use an automated voicemail greeting instead of recording one in your own voice, so there is nothing for a scammer to capture. If you think you have been scammed, contact your bank immediately. Consumer-protection agencies also publish resources on reporting and prevention.