I know you're not saying that generating CSAM is ok, but can you rephrase that?

If someone is doing no harm to anyone else, I see no problem with a bit of fantasy photo generation.
Not really — all the big LLMs have guardrails to prevent this, and most image generation models have specific protections around children and sexual material. Hundreds of other tools can create the same content.
There will always be services that allow this, but they really are outliers.
