Voice cloning isn't a new technology, but it came to the forefront with the arrival of deepfakes. A deepfake is a video or audio clip that looks and sounds like another person: the video mimics the target's movements, and a cloned voice can be layered on top for a full impression of the target. These are typically prerecorded clips rather than real-time streams.
What has changed recently is that artificial intelligence (AI) has made these deepfakes much more convincing and quicker to assemble. AI voice cloning has become good enough that scammers now use it to trick people into sending them money.
In a typical voice clone scam, a scammer creates a replica of the target's voice and then uses it to call relatives and ask for, or demand, money. There are a number of ways money can be sent digitally, including through the many available instant payment networks, such as Zelle.
Once someone sends money through an instant payment network, it can be very difficult, if not impossible, to get the funds back. Because the payer has already authorized the transfer, banks are often reluctant to reverse payments made through Zelle or similar services.
Scammers need only a few minutes of the target's voice to create a clone, and AI is pushing that requirement down to mere seconds. Samples can be pulled from social media platforms such as TikTok and YouTube. Once the scammer has a sample, they can leave a voicemail with a distressing message and instructions on how to send money. If someone picks up the call, the scammer can play the cloned voice and then cut in with another voice demanding a ransom.
These calls can be very convincing because, in some cases, scammers can also spoof the phone number of the person whose voice they are cloning.
Anyone who receives a call like this should consider hanging up and calling the family member back directly, or checking with other relatives, to verify that everything is all right.