Cops Were Caught Using A.I. to Generate Fake Evidence Photos
Well, this feels like it will be bad for society.
One of the many fears surrounding A.I. is that people will use it to generate illegal content. Whether that’s somebody creating a nonconsensual image of someone they know, or a criminal fabricating an alibi, the proliferation of A.I. technology has left many people clutching their pearls, wondering what terrible things bad people will do with this technology.
Well, now, someone *has* used A.I. for evil — the cops.
Police in Westbrook, Maine, were making one of those posts that police love to make — you know, the “look at how much stuff we seized” posts — when viewers noticed something was amiss.
Looking closer at the image, viewers could see that everything was oddly smooth, with the characteristic tan of A.I. content. Plus, if you looked closely at any text in the image, you could see that it made literally zero sense. The conclusion was clear: This police department had used A.I. to generate its criminal haul photo! My God, is nothing sacred?
They were quickly called out for this and deleted the post, but I’m going to need more information here. Did you simply not have enough time to take a real photo? Is the story of the bust fake from top to bottom?
I need answers, darn it!
BREAKING: Cops post AI slop of a ‘drug bust’ then try to delete evidence once busted https://t.co/HUQ5w0e1hz pic.twitter.com/Z5jM5n9Vgg
— The Meme-Industrial Complex (@MemeIndustrial) July 1, 2025