> It can. In several public cases it seems fairly clear that there is a "community" aspect to these productions and many of these sites highlight the number of downloads or views of an image. It creates an environment where creators are incentivized to go out of their way to produce "popular" material.
So long as it's all drawn or generated, I don't see why we should care.
> I entirely disagree. Offenders tend to increase their level of offense.
This claim reminds me of similar ones about how video games are an "on-ramp" to actual violent crime. It needs very strong evidence to back it up, especially when it's used to justify harsh laws. Evidence which we don't really have, because most studies of pedophiles are, by necessity, focused on the ones known to the system, which disproportionately means ones that have been caught doing some really nasty stuff to real kids.
> I entirely disagree. Offenders tend to increase their level of offense. This is about preventing the problem from becoming worse and new victims being created. It's effectively the same reason we harshly prosecute people who torture animals.
Strict liability for possession means that you can imprison people who don't even know that they have offending material. This is patent nonsense in general, regardless of the nature of what exactly is banned.
> That's a bold claim. Is it based on any facts or study?
It is based on the lack of studies showing a clear causal link. That is not definitive, for the reasons I outlined earlier, but I feel like the onus is on those who want to make it a crime with such harsh penalties to prove said causal link, not the other way around.
Note also that, even if such a clear causal link can be established, surely there is still a difference with respect to imputed harm - and thus culpability - between those who seek out recordings of genuine sexual abuse and those who seek out simulated ones? As things stand, in many jurisdictions, this is not reflected in the penalties at all. Justice aside, it creates a perverse incentive for pedophiles to prefer non-simulated CSAM.
> It's about the potential class of victims and the outrageous life long damage that can be done to them. The appropriate response to recognizing these feelings isn't to hand them AI generated material to sate their desires. It's to get them into therapy immediately.
Are you basically saying that simulated CSAM should be illegal because not banning it would be offensive to real victims of actual abuse? Should we extend this principle to fictional representations of other crimes?
As far as getting them into therapy goes, that's a great idea, but it's kinda orthogonal to the whole "and also you get 20+ years in the locker" thing. Even if you fully buy into the whole "gateway drug" theory, where consumption of simulated CSAM inevitably leads to actual abuse in the long run, that also means that at any given moment there are pedophiles who are still at the "simulated" stage, and such laws are a very potent deterrent for them to self-report and seek therapy.
With respect to "handing them AI-generated material", this is already a fait accompli given local models like SD. In fact, at this point, it doesn't even require any technical expertise, since image generator apps will happily run on consumer hardware like iPhones, with a UI that is basically "type what you want and tap Generate". And unless generated CSAM is then distributed, it's pretty much impossible to restrict this without severe limitations on local image generation in general (basically prohibiting any model that knows what naked humans look like).
> Are you basically saying that simulated CSAM should be illegal because not banning it would be offensive to real victims of actual abuse?
No, it's because it will likely lead to those consuming it turning to real-life sexual abuse. The behavior says "I'm attracted to children." We have a lot of good data on where precisely this leads when it is entirely unsupervised or unchecked.
> but kinda orthogonal to the whole "and also you get 20+ years in the locker" thing.
You've constantly created this strawman, but it appears nowhere in my actual argument. To be clear, it should be like DUIs, with small penalties for first-time offenses increasing to much larger ones upon repetition of the crime.
> it's pretty much impossible to restrict this
Right. It's impossible to stop people from committing murder as well. It's also impossible to catch every perpetrator. Yet we don't hesitate to keep those laws on the books, and it's quite possible that the laws, and the penalties themselves, have a "chilling effect" when it comes to criminality.
Or, if your sensibilities are so offended by the diminished rights, then it can be a trade. If you want to consume AI child pornography, you have to voluntarily add your name to a public list. Those on this list will obviously be restricted from certain careers and certain public settings, and will be monitored when entering certain areas.
> No, it's because it will likely lead to those consuming it turning to real-life sexual abuse. The behavior says "I'm attracted to children." We have a lot of good data on where precisely this leads when it is entirely unsupervised or unchecked.
For one thing, again, we don't have quality studies clearly showing that.
But let's suppose that we do, and they agree. If so, then shouldn't the attraction itself be penalized, since it's inherently problematic? You're essentially saying that it's okay to nab people for doing something that is in and of itself harmless, because it is sufficient evidence that they will inevitably cause harm in the future.
I do have to note that it is, in fact, fairly straightforward to medically diagnose pedophilia in a controlled setting - should we just routinely run everyone through this procedure and compile the "sick pedo list" preemptively this way? If not, why not?
> You've constantly created this strawman but it appears nowhere in my actual argument.
My "strawman" is the actual situation today that you were, at least initially, trying to defend.
> Right. It's impossible to stop people from committing murder as well. It's also impossible to catch every perpetrator. Yet we don't hesitate to keep those laws on the books, and it's quite possible that the laws, and the penalties themselves, have a "chilling effect" when it comes to criminality.
That can be measured, and it has been - and yes, they do, but what matters is specifically the likelihood of getting caught, not so much the severity of the punishment (which is one of the reasons why we don't torture people as a form of punishment anymore, at least not officially).
The point, however, was that nobody is "handing" them anything. It's all done with tools that are, at least at present, readily available and legal in our society, and this doesn't change whether or not you make some ways of using those tools illegal; nor is it possible to detect such private use unless you're willing to go full panopticon or ban the tools.
Laws don't need to be absolutely enforceable to still work. You probably will not go to jail for running that stop sign at the end of your street (but please don't run it).