Be cautious of ‘relatives’ requesting money: AI voice cloning scam surges
Generative artificial intelligence and improved machine learning algorithms have opened the floodgates for a new wave of fraud. You’ve probably heard of deepfakes, where scammers use artificial intelligence to create convincing fake videos of celebrities, and in some cases even of a person’s own relatives, to trick victims into handing over money. Now users face another threat: AI voice cloning.
In this scenario, scammers impersonate your family members using advanced voice cloning tools. Just a few weeks ago, NDTV reported a case in which an elderly man in Delhi fell victim to such a scam and lost Rs 50,000. He was deceived into believing that his cousin’s son had been kidnapped. The scammer ramped up the pressure by playing a cloned voice recording of a child, convincing the panicked man to transfer Rs 50,000 through Paytm.
It was later revealed that the kidnapping was a hoax staged to extort money from him. Scams of this kind are steadily emerging as fraudsters adopt new technologies and techniques to become more convincing.
AI Voice Cloning Scam: How to Stay Safe
First, always verify the claim before acting, whether the caller describes an emergency or tries to blackmail you. Scammers rely on fear and urgency to push victims into a panic so they can extort money. So it’s important to stay calm when you receive a call like this, and to call the relative back on a number you know is theirs.
Second, AI voice cloning is still a relatively new technology, especially the consumer-grade apps that scammers rely on. For now, cloned voices tend to retain a robotic, digital quality. Pay attention to how sentences end and listen for a robotic undertone in the voice.
You might have come across popular Instagram reels where Prime Minister Narendra Modi’s voice has been cloned into meme clips; you can tell it has a robotic feel to it, and the same goes for scam calls.