Scammers Are Now Using AI to Generate Fake Kidnapping Images, Then Using Them to Extort Family Members
It’s time to regulate this.
Published 2 hours ago in Wtf
Have you recently received an image of a loved one being kidnapped? Yes? Then what the heck are you doing reading this?! You have other things to take care of!
But if you haven’t (or if that situation has resolved itself), you should be aware of a new scam the FBI is warning people about. Specifically, there’s been an increase in cases of people using AI to fake kidnapping photos, then using those photos to get money from friends and relatives.
How it works is frustratingly simple. First, a scammer will find your image on Instagram or Facebook. Then, they’ll use an AI program to make a picture of you trapped in some sort of compromising position. After that, they’ll send that image to your family — either claiming to be you or the kidnapper demanding funds.
Of course, if this happens to you, you should immediately run the image through an AI detector and report it to the police, no matter what the person on the other end says. Then, you should write to your Congressman and tell them that we need to rein in this AI business. It’s getting ridiculous!