Icky horrible thought of the day: what do we think about AI-generated child porn/CSAM?

On one hand, it’s horrible, and icky.

On the other hand, as an alternative to actual real CSAM, the harm reduction seems huge.

On the other other hand, are there other dangers to allowing this kind of material to proliferate more widely? Eg for people who want it, does easier access sate them? Or fuel their urge for more? Does it make them less likely to go do something bad in real life? Or more?

Icky horrible thought of the day 🤮, now I need to go take a shower.

  1. @snarfed.org I get your point that it’s a different kind of harm, but I think there is a huge risk that the source material used to produce the AI-generated CSAM actually comes from non-consenting usage of a real victim, even just by having her or his image, or part of it, identified as being hers or his. I remember a Reddit post where one girl was 200% rightfully traumatized to see her actual face being used to generate fake nudes; imagine worse abuses.

  2. Another angle on this: as a society, for better or worse, we generally allow fictional depictions of violent crime against adults, including sexual abuse. Should AI-generated fictional CSAM be different?

  3. @nelson Ah, great point! Absolutely right, AI isn’t special here, other human-created art forms raise the same questions. Thanks for the nudge, I’ll go read up.

  4. ^ I wonder if the potential for AI-generated CSAM to be (more) photorealistic changes this at all? Probably not that much, overall.

  5. Sadly, this is behind a paywall, but seems relevant: https://link.springer.com/article/10.1007/s12119-021-09820-1

    I think your original question bears additional research. I don’t understand the use of CSAM (i.e. why people are drawn to it), but if you can provide a substitute for which there is no human subject as the source of the material, I do wonder if, as you suggest, it creates a kind of victimless scenario. I’m sure there are many more psychological harms and considerations to weigh, but compared with prohibiting it altogether, if it can reduce real-world risks to human children (i.e. not virtual children that never existed), it seems worth exploring.

    “There are a variety of actual and potential societal and technical outcomes resulting from the widespread ability to produce CG-CSAM. For completeness, it is worth noting some have argued, under the right controls, such material could be used to reduce offender risk in some instances. Some theorize that the use of CG-CSAM in place of CSAM produced from the non-virtual abuse of living children could serve a preventative purpose—potentially for treatment/impulse management of those identifying with a sexual attraction to minors. However, neither the viability nor efficacy of such a practice has been sufficiently studied and many warn that, for some, this material could have an adverse effect— lowering barriers of inhibition or contributing to existing fantasies of real-world abuse.”
