Insofar as we create laws to protect people from undue harm, it is well established that victims of CSAM are victimized twice: first when the CSAM is produced, and again each time it is viewed.
“Yeah everyone at school is looking at it, but don’t worry it’s only pixels that represent your naked body!”
The fact that people can now produce CSAM without inflicting the first victimization does not eliminate the second, and the fact that the current law wasn’t designed with this technology in mind doesn’t mean no harm is inflicted or that we shouldn’t respond to it.
Of course you’ll argue that someone could paint a nude child with a real person’s likeness, and I’ll point out that an actual photograph of a nude child is also not “really them.” We gauge the badness of any point on this spectrum by the harm felt. A thing that ~never happens, ~never causes harm, and is very difficult to produce (high-quality paintings of nude children) is strictly less bad than something that costs no money, no time, and no talent to produce at significantly higher quality (synthetic CSAM).
I don’t buy the “it’s easier to do now so the law must change” argument. You actually just admitted that there isn’t anything fundamentally different between a hand drawn depiction and an AI generated one. Both may be tasteless in most situations, but in neither case is there a victim.