Our office has been conducting fraud presentations for years, and we always warn about a common impersonator scam, often referred to as the grandparent scam. Many people fall victim to it because the scammer creates such a level of fear that victims react before they can think straight.
Some victims realize that the voice on the other end of the phone doesn’t sound like their favorite grandchild and quickly hang up, saving themselves from losing money to a scammer.
But what if the voice on the other end of the line sounds just like your grandchild? That can happen now through the magic of AI (artificial intelligence). Using this technology, scammers can re-create the pitch, timbre, and accent of an individual. Scammers are using these replicated voices to convince victims that the person on the other end of the line is truly their loved one.
Where do they get the voice sample? If you have ever posted a TikTok or a video on Facebook, your voice is out there, ready to be replicated. The Federal Trade Commission is warning consumers that if you receive a call from an unknown number, you should let the other person speak first, just in case the scammer is mining for your voice.
Scammers often try to bully victims into transferring money through a mobile payment app, by wiring money or by purchasing gift cards or money orders. Some may even request to meet to receive money in person. If you get a call like this, hang up and report it immediately to local law enforcement.
The bottom line is, if you receive a call saying your loved one is in trouble, do not trust the voice. Practice patience and control before you react. Call your loved one directly to verify they are where they should be, at home and not in a Mexican jail.