The FBI Says This Terrifying Photo Scam Could Target Anyone With Social Media
You are working at your computer one day and get an email claiming your loved one has been kidnapped. It comes with a photo of them being held prisoner and a voice recording of them pleading for help. What do you do? The first thing you may have to ask yourself in this age of artificial intelligence (AI) is whether those photos and recordings are even real. This is what the Federal Bureau of Investigation (FBI) refers to as a virtual kidnapping scam, and almost anyone can fall victim to it.
Criminals will go onto a victim's social media account to gather materials that they need to successfully create these altered photos and other media, often known as deepfakes. Using AI and other media manipulation tools, they can create convincing fake pictures and recordings to make someone believe they have their loved one. Fake AI-generated evidence is even finding its way into courtrooms.
This can happen when someone actually has been kidnapped and bad actors try to profit from the situation by pretending to be the kidnapper. It can also be completely fabricated: no kidnapping has taken place, but the criminal uses deepfake photos and a sense of urgency to pressure you into sending money or risk losing your loved one.
Real-world cases of deepfake scams
The kidnapping of TV host Savannah Guthrie's mother, Nancy Guthrie, from her home in Arizona caught the country's attention. One element of this case has been law enforcement's concern about scammers, deepfakes, and other bad actors pretending to have Nancy when they do not. In a video addressed to the supposed kidnappers, Savannah and her siblings even spoke about how easy it is to fake proof-of-life images these days, and said they needed undisputed proof that their mother was alive and with the alleged kidnapper.
Sometimes, audio cloned with AI is used to make the virtual kidnapping more convincing. This was the case for a mother in St. Louis, Missouri, who received a call that seemed to come from her daughter in college, claiming she'd been kidnapped. The mother was certain it was her daughter's voice and quickly wired thousands of dollars to an account, only to have her real daughter call her later, completely safe and unharmed. Cloned audio from social media is just one way scammers are using AI to trick you.
Virtual kidnapping also isn't the only type of fraud arising from this technology and the prevalence of social media content. An employee in Hong Kong sent millions of dollars to a criminal after believing they were in a video call with company management. The entire call was actually AI-generated.
What you can do to protect yourself
The rising use of AI for deepfakes has led to complex crimes that can easily fool a victim. It has even led to Google's Nano Banana AI image generator being called out for being too realistic and potentially usable for such criminal activity. If you are contacted with supposed proof of life for a kidnapping victim, there are some things you should do to avoid falling for a scam.
Attempt to contact your loved one yourself by phone, email, or social media to see if they actually have been kidnapped. Ask the kidnapper for specific details about your loved one that cannot easily be gleaned from social media. Reach out to your local police department as soon as possible so they can put their expertise and technology to work checking for scams. A kidnapper asking for money via wire transfer is also a red flag.
If you get a proof-of-life photo or other media, try to save a copy so it can be analyzed. These deepfakes are typically not perfect when compared to actual photos of the supposed kidnapping victim. If you share personal photos or videos of yourself on social media, try to keep them as private as possible to reduce the chances of them being used against your family and friends.