A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting how readily generative AI can be misused for nefarious purposes.
Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as officers led McCorkle, still in his work uniform, out of the theater in handcuffs.
Since that study, every legitimate AI model has removed those images from its training dataset, and models trained afterward no longer include knowledge of those source images.
I know at least one AI model has specifically excluded photos of underage people altogether, to minimize the possibility of this happening even by accident. Making CSAM from an AI model is something anyone determined and patient enough can do with a good model trainer and a dataset of source images with the features they want, even if the images of underage people are themselves completely clean.
Making CSAM with an AI model is a deliberate act in almost every case… and in this case, he was arrested for distributing these images, which is super illegal for obvious reasons.