They Want Any Excuse For Censorship

If someone is doing no harm to anyone else, I see no problem with a bit of fantasy photo generation.
I know you're not saying that generating CSAM is OK, but can you rephrase that?


100s of other tools can create the same
Not really; all the big LLMs have guardrails to prevent this. Most image generation models have specific protections around children and sexual material.

There will be services that can do this, but it really is an outlier.
 
Some seem to be saying there's no issue with such AI tooling.

AI needs some limits.

Women and children need to be safe from predators, online as well as in real life.
I don't think it is AI that needs limits as much as some people need them themselves.
 
I'd go with the middle one, as from what I've seen of AI, it struggles with writing.
I have a feeling that it's a trick question and they are all fake. But the fluorescent one clearly has the hair, complete with distant grey hills, taken from the middle one, and the red jacket one has something dodgy going on with the arm on the right (her left).
The odd thing with them all is that the boats in the background are in different positions, as they would be if there was a gap in time between 3 separate pictures.
 
I'd say you and she can choose what you wear and do, without anybody interfering.

Would you say the same about people creating child sexual abuse pictures?

Or even fake pics of Lena?
I am not going to get into a discussion on child sexual content. It is wrong on all levels and not something your average person would consider normal, legal, or OK. That is why we have prisons; I would sooner make a cell vacant for anyone indulging in such activities, even if it meant releasing a violent offender to make room.
 
What harm does a person do to others if they create such an image for their own private use?
AI-generated child sexual abuse material (CSAM) also plays a significant role in the normalisation of offending behaviour. There is increasing evidence of a strong correlation between viewing CSAM online and seeking direct contact with children. The ease and availability of AI-generated CSAM will only escalate this further and create a more permissive environment for perpetrators, putting more children at risk.
 