superkuh•23m ago
They're both underdefining what "intimate images" means and using the term "images" instead of "photos". This means they want it to apply to anything that can be represented visually, even if it has nothing to do with anything that happened in reality. Which means they don't care about actual harms. The way they're using the word "harm" seems to be more in line with the word "offend". So now in the UK, if an offensive image (like a painting) is posted on a website (or served over some other internet protocol), it is going to be "treated with the same severity as child sexual abuse and terrorism content." That's wild. And dangerous. This policy will do far more damage than any painting or other non-photo image would.
iMark•8m ago
I agree that laws such as this need to be defined very carefully, but I think "images" is the appropriate term to use, rather than "photos". LLMs make it near-trivially easy to render a photo in countless styles, after all, such as a painting or a sketch.
actionfromafar•4m ago
I hate to appear to defend this, but generative AI has sort of collapsed the distinction between a photo and an image. I could generate an image from a photo that tells the same story, then delete the photo, and now everything is peachy fine? So that could have been a motivation for "images".
Though I wonder whether existing frameworks around slander and libel could be made to address the brave new world of AI-augmented abuse.
vr46•3m ago
What bit of "intimate images shared without a victim's consent" is lacking context in the article?