
If you’re going to contact the police and report someone for sharing their interest in child sexual abuse material (CSAM) with you, it’s not the best idea to have the same material on your own devices, or to consent to a search that lets law enforcement gather more information. But an Alaskan man allegedly did just that, and it landed him in police custody.
404 Media reported earlier this week on Anthony O’Connor, who was himself arrested after a police search of his devices revealed AI-generated child sexual abuse material (CSAM).
From 404 Media:
According to recently filed charging documents, Anthony O’Connor reached out to law enforcement in August to alert them to an unidentified airman who had shared child sexual abuse material (CSAM) with O’Connor. While investigating the crime, and with O’Connor’s consent, federal authorities searched his phone for additional information. A review of the electronics revealed that O’Connor had proposed creating virtual reality CSAM for the airman, according to the criminal complaint.
According to police, the unidentified airman shared with O’Connor a photo of a child at a grocery store, and the two discussed how they could project the minor into an immersive virtual reality world.
Law enforcement claims to have found at least six explicit, AI-generated CSAM images on O’Connor’s devices, which he said were intentionally downloaded, along with some “real” images that were inadvertently mixed in. Through a search of O’Connor’s home, law enforcement uncovered a computer with multiple hard drives hidden in a vent; a review of the computer allegedly revealed a 41-second video of child rape.
In an interview with authorities, O’Connor said he regularly reported CSAM to internet service providers “but received sexual gratification from the images and videos.” It’s unclear why he decided to report the airman to law enforcement. Maybe he had a guilty conscience, or maybe he genuinely believed that AI-generated CSAM didn’t break the law.
AI image generators are typically trained on real photographs; that is, images of children “generated” by AI are fundamentally based on real images, and there is no way to separate the two. In that sense, AI-generated CSAM is not a victimless crime.
The first known arrest for possessing AI-generated CSAM came back in May, when the FBI arrested a man for using Stable Diffusion to create “thousands of realistic images of prepubescent minors.”
Proponents of AI will say that it has always been possible to create explicit images of minors with tools like Photoshop, but AI makes it far easier for anyone to do so. A recent report found that one in six congresswomen have been targeted by AI-generated deepfake porn. Many products include guardrails against the worst uses, much as printers refuse to photocopy currency, and implementing such constraints prevents at least some of this behavior.