Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.
Sexual attraction doesn’t necessarily involve dehumanization. Unlike most other kinds of interest in a person, it doesn’t require interest in their personality, but a lack of interest in someone’s personality and dehumanization are logically not the same thing.
In general, you are using emotional arguments about things that operate not on emotion but on literal interpretation. That’s like feeding metric calculations into a system that expects imperial. Utterly useless.
No, it’s not. It’s literally a photorealistic drawing based on a photo (plus the dataset used to train the generative model). No child was abused to produce it. Laws operate on literal definitions.
No, because the woman is not literally being sexually exploited. Using her photo without consent is, I think, already covered by some laws. There are no fundamentally new legal entities involved.
I think I agree. But it’s neither child pornography nor sexual exploitation and can’t be equated to them.
There are already laws covering such actions, the same ones that would apply to combining a photo of the victim with a pornographic photo using paper, scissors, pencils, and glue. Or, if you think the situation is radically different, new punishable offenses should be introduced.
Otherwise it’s like charging everyone caught driving drunk with unpremeditated murder. One is not the other.
Hey, so, at least in the US, drawings can absolutely be considered CSAM.
Well, US laws are all bullshit anyway, so that makes sense.
Normally yeah, but why would you want to draw sexual pictures of children?
Suppose I’m a teenager attracted to people my own age. Or suppose I’m clinically a pedophile, which in itself is not a crime, in which case I would need exactly that.
In any case, for legal and moral purposes “why would you want” should be answered only with “not your concern, go eat shit and die”.