You don't think it's really creepy to sit on a beach with a long lens and photograph strangers in their bikinis, or even topless, without their consent or knowledge, and make these pictures public?
> You don't think it's really creepy to sit on a beach with a long lens and photograph strangers in their bikinis, or even topless, without their consent or knowledge, and make these pictures public?

I can't imagine anybody doing that. Is it common?


> I can't imagine anybody doing that. Is it common?

Yes, I have often found myself on the front page of Vogue, Men's Health and Horse & Rider without my consent.

> You don't think it's really creepy to sit on a beach with a long lens and photograph strangers in their bikinis, or even topless, without their consent or knowledge, and make these pictures public?

Journos do it.
> I would sooner make a cell vacant for anyone indulging in such activities, even if it means releasing a violent offender to make the room.

Your preference is interesting...

> I know you're not saying that generating CSAM is OK, can you rephrase that?

OK. Again, nobody is talking about making illegal indecent images of children. This new law is about the creation of intimate pictures or film of adults without consent.
> Not really, all the big LLMs have guardrails to prevent this. Most image generation models have specific protection for children and sexual material.

Again, nobody is talking about children. You will also find plenty of image generators perfectly happy to generate "intimate" images: VisualGPT, for example.
> There will be services that can do this, but it really is an outlier.

There are many.

> Your preference is interesting...
> I'd prefer that those in prison serving sentences for non-violent crimes and political offences be given early release, as opposed to violent criminals, in order to make room for the perverts...
> But each to their own, I guess!

I wouldn't mind a violent person living near me. I can handle violent criminals, but not a paedophile.
> again, nobody is talking about children

Again, the reason this is news is because Musk has ripped out all the safety checks normally in place:

> MPs are "deeply alarmed" by recent reports of the Grok AI chatbot being used to create undressed and sexualised images of women and children, the Chair of a cross-party committee has said, in a letter published today.

Unfortunately the law goes much further than preventing the things identified in the above.
> Again, the reason this is news is because Musk has ripped out all the safety checks normally in place.

It's a bit late for that now; the law was passed ages ago. Was there a thread at the time?

> Unfortunately the law goes much further than preventing the things identified in the above.

Have you read it?

> I wouldn't mind a violent person living near me, I can handle violent criminals, not a paedophile though.

That doesn't make sense...

> That doesn't make sense...

Which would you choose to live near to you?

> That doesn't make sense...

When I said:

> Which would you choose to live near to you?

I meant that if you can handle a violent offender, what's the difference with a paedophile?

> When I said "Which would you choose to live near to you?" I meant that if you can handle a violent offender, what's the difference with a paedophile?

You seriously asking that question?

OK. Let's dial this back. What do you mean by:

> You seriously asking that question?