I know you're not saying that generating CSAM is OK, but can you rephrase that?
Not really; all the big LLMs have guardrails to prevent this, and most image generation models have specific protections around children and sexual material.
There will be services that can do this, but they really are outliers.