Virtual Kidnapping: AI Tools Are Enabling IRL Extortion Scams

Published: 23/11/2024   Category: security




With AI and publicly available data, cybercriminals have the resources they need to fake a real-life kidnapping and make you believe it.



If your spouse or your child called you on the phone, screaming and crying, telling you they've been kidnapped, how likely would you be to respond with calm, measured skepticism?
At this year's Black Hat Europe, two researchers from Trend Micro will discuss the real and emerging trend of virtual kidnapping, perhaps artificial intelligence's most terrifying malicious application yet.
In a typical virtual kidnapping, attackers combine cyber compromise, intel gathered from social media or the Dark Web, and AI voice-cloning software to convince targets that a loved one has been kidnapped. The result can feel nearly indistinguishable from the real thing, which is why more attackers are leveraging advanced AI technology to try it out.
"Looking at underground chats with potential virtual kidnappers, it used to be that I would see maybe a dozen [posts about it] at any given time," says Craig Gibson, principal threat defense architect with Trend Micro Research. "Now it's more like 150 at any given time."
Criminals begin a virtual kidnapping by identifying the victims: the one being "kidnapped" and, even more so, the relative who'll be contacted for negotiations.
If a perpetrator doesn't already have targets in mind, Gibson posits, some social media or Dark Web data harvesting might help identify prime candidates, just as it would for an advertising campaign. "If you already have vast bodies of data that have previously been hacked," Gibson says, "you can then populate software like those which do advertising analytics to define the best target for a particular kind of attack."
Social media intel would also help with determining when to conduct an attack and filling in details to make it more realistic.
This, presumably, is how Jennifer DeStefano's hackers did it.
On an afternoon in April, Arizona-based DeStefano received a call from an unknown number. "Mom, I messed up!" her older daughter cried from the other end of the line.
As reported by CNN, the 15-year-old was up north, training for a ski race. DeStefano worried she'd been injured, but what followed was far scarier:
"Listen here," her captor interjected. "I have your daughter. You call the police, you call anybody, I'm gonna pop her so full of drugs. I'm gonna have my way with her, then drop her off in Mexico, and you're never going to see her again." He demanded a million-dollar ransom, paid in cash.
Unknown to DeStefano, her daughter's voice, crying "Help me, help me!" in the background of the call, was pre-recorded using an AI voice emulator. It was so realistic that even after her son was able to get her actual daughter on the phone, DeStefano said, "at first I didn't believe she was safe, because her voice was so real to me."
DeStefano's false captors managed to identify her as wealthy, determine when she would be at her most vulnerable, and gather voice data to feed back through a phone call, probably from halfway across the world, using only the data otherwise available on the Internet.
And as convincing as they were in this case, there are plenty more areas in which virtual kidnappers can improve, using even just the technologies available to them today.
An attacker might precede a virtual kidnapping, for example, with a classic SIM swap. Had DeStefano's attackers blacked out her daughter's phone, they might have gotten their ransom before she became wise to their scheme.
Emerging AI, in particular, can help with the planning and execution of such a story. "By using ChatGPT, an attacker can fuse large datasets of potential victims with not only voice and video information but also other signal data such as geolocation via application programming interface (API) connectivity," Trend Micro's researchers wrote in a June blog post.
Theoretically, ChatGPT can also be used in collaboration with text-to-speech and automation software to generate near-real-time responses from an AI-generated victim. Such a system has already been demonstrated, for a better cause, by Jolly Roger Telephone, which uses a Rube Goldberg-like collection of AI and non-AI software to facilitate calls with telemarketers using only automated bots.
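What that pipeline looks like in practice is simpler than it sounds. Below is a minimal sketch of one conversational turn, chaining speech-to-text, a chat model, and text-to-speech through OpenAI's public API, in the benign, Jolly Roger-style configuration of keeping a telemarketer talking. The model names, the persona prompt, and the telephony plumbing that would capture and play call audio are all assumptions for illustration, not a reconstruction of Jolly Roger's actual stack.

```python
# One turn of a transcribe -> respond -> speak loop, of the kind
# Jolly Roger-style bots chain together. Assumes some telephony layer
# (not shown) has saved the caller's last utterance to an audio file.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def bot_turn(caller_audio_path: str, history: list[dict]) -> bytes:
    # 1. Speech-to-text: transcribe what the caller just said.
    with open(caller_audio_path, "rb") as f:
        heard = client.audio.transcriptions.create(model="whisper-1", file=f).text

    # 2. Chat: generate a plausible in-character reply.
    history.append({"role": "user", "content": heard})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "system",
                   "content": "You are a chatty, easily distracted person "
                              "determined to keep a telemarketer on the line."}]
                 + history,
    ).choices[0].message.content
    history.append({"role": "assistant", "content": reply})

    # 3. Text-to-speech: synthesize the reply for the telephony layer to play.
    speech = client.audio.speech.create(model="tts-1", voice="alloy", input=reply)
    return speech.read()  # MP3 bytes
```

Each turn costs a few seconds of round-trip latency, which is why "near-real-time" is the operative hedge; more convincing systems stream audio through each stage rather than waiting for complete responses.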
In the world we now live in, even understanding the language spoken by one's victims isn't a requirement. With ChatGPT and translation functions, Gibson notes, "you suddenly have this leap forward in which people who don't speak English, or don't speak it well enough to be able to negotiate, suddenly can. They can manage all of these cultural and language gaps using pure technology, regardless of where they're actually from."
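The language bridge Gibson describes amounts to a single prompt. A hedged sketch, reusing the client from the previous example; the model name and prompt wording are assumptions:

```python
# Sketch: LLM-based translation as the "pure technology" language bridge
# Gibson describes. Works for any source language; model choice is illustrative.
def to_english(message: str) -> str:
    return client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Translate the user's message into natural, fluent English."},
            {"role": "user", "content": message},
        ],
    ).choices[0].message.content
```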
There's a good reason why criminals would choose virtual kidnapping over more traditional cyber extortion.
"Traditional security architecture you could imagine being like a circle, and everything inside that circle is governed by traditional security products. Everything outside it is the human world," Gibson laments. "Virtual kidnapping attacks the human world, where there are no security products, and therefore the chance of getting caught is far less."
In the end, there are few real technical solutions to a virtual kidnapping, besides perhaps blocking unknown phone numbers or trying to confuse an AI voice generator by speaking in a second language.
"So as vendors get better and better at securing that circle," Gibson concludes, "it starts to push [cybercriminals] out of the circle, into this other attack region."
