
It’s the phone call every parent dreads: “A strange man forced me into his car!” It sounds like your child; the voice even uses your child’s name. The reality: the call is a recording created with artificial intelligence. The crime is one more type of imposter scam, called “virtual kidnapping,” and it is increasing in frequency.

According to the Federal Trade Commission, imposter scams cost Americans about $3 billion last year. The basic scenario works this way: a criminal calls the intended victim with the message that a family member has been kidnapped. The caller demands a ransom and threatens that failure to pay will result in harm to the “kidnap victim.” Instructions call for an untraceable payment method, such as Bitcoin or a third-party processor, and carry an additional threat that notifying law enforcement will also result in harm.

But there was no kidnapping; it was all made up. The criminal uses AI to research the supposed kidnap victim, gathering descriptions, photos, behaviors, positive and negative personal information, and possibly voice samples. That information is then used to build a personal profile that may include an AI-synthesized voice recording. Now the trap is set, and the call is made.

All this takes time on the part of the criminal, but time is on their side, and the payoff can be quite rewarding. Scammers have weaponized technology. The “cloned” voice message needs to be only a few seconds long, just enough to convey fear when paired with unidentifiable screams and noise.

While fake kidnappings are relatively infrequent, the number of reported attempted scams of this nature has increased in the post-pandemic environment. The crime is rooted in an attempt to provoke an emotional response from the targeted victim: fear, anger, and compliance. Emotion distracts us from using logic and reason to address situations.

The FBI suggests the following to help avoid being scammed in a virtual kidnapping:

Don’t trust the voice you hear on the phone. Voice-cloning technology creates a very convincing sound, and the typical “victim” message is very short, which does not give you time to process what you are hearing.

Be cautious about posting information on social media. Personal data, travel plans, and medical conditions provide criminals with the facts that can make scams believable.

Develop a family plan to thwart fake kidnapping. Include a family password and share it verbally, not via email or text message. Consider the use of tracking devices or software. GPS features on cell phones and small trackers such as the Apple AirTag can provide some degree of security but raise privacy concerns. Have open conversations before using these features.

Buy time. If someone is present with you, mute your phone and have that person call 911 to have the dispatcher notify the FBI.

If you can’t mute the phone, write a note.

Withhold financial information from anyone you do not know, including account numbers, balances, and the names of your banks, credit unions, or brokerage firms.

Virtual kidnappings provide criminals with one more way to separate you from your money while creating emotional distress that can be difficult to overcome. A little planning and discussion can provide a wall of protection.

Elliott Greenblott is a retired educator and coordinator of the AARP Vermont Fraud Watch Network. He hosts a CATV program, Mr. Scammer, distributed by GNAT-TV in Sunderland, VT – www.gnat-tv.org.

