https://www.digitaltrends.com/computing/googles-gemini-deeme...
I think the solution is not to disallow the titles, but to comment on them and draw attention to the sexism in the article.
Submission titles should be the original article titles, as long as those aren't problematic.
If you believe in democracy, and the rule of law, and citizenship, then the responsibility obviously lies with people who create and publish pictures, not the makers of tools.
Think about it: you can use a phone camera to produce illegal pictures. What kind of world would we live in if Apple were required to run an AI filter on your photos to determine whether they comply with the law?
A different question is whether X actually hosts generated pictures that are illegal in the UK. In that case, X is acting as a publisher, and you can sue it along with the creator to have them removed.
The power of the AI tools is so great in comparison to a non-AI image editor that there's probably debate on who -- the user, or the operator of the AI -- is creating the image.
Photoshop in the 90s was the former, Grok is the latter.
If you produce a product that causes harm, and there are steps that could be taken to prevent that harm, you should be held responsible for it. Before the Trump administration dropped the Boeing case, Boeing was going to be held liable for design defects in its Max planes that caused crashes. The government wasn't going after Boeing because a plane crashed, but because Boeing did not take adequate steps to prevent that from happening.
Festro•2h ago
Ironic that the minister who spearheaded that awful bill as Tech Minister (Peter Kyle) is now the government spokesperson for this debacle as Business Minister. The UK needs someone who knows how tech and business work to tackle this, and that's not Peter Kyle.
A platform suspension in the UK should have been swift, with clear terms for how X could be reinstated. As much as it appears Musk is doubling down on letting Grok produce CSAM as some form of free speech, the UK government should treat it as a limited breach or bug that the vendor needs to resolve, whilst taking action to block the site causing harm until it's fixed.
Letting X and Grok continue to do harm, and get free PR, is just the worst case scenario for all involved.
roryirvine•1h ago
Peter Kyle was in opposition until July 2024, so how could he have spearheaded it?
Festro•1h ago
In my original comment I'm referring to the Act's powers to compel companies to actually do things. I don't know exactly when the various parts that would constitute that came into effect, but for the point of my post I'm going on the dated reference on Peter Kyle's own website to holding companies accountable:
"As of the 24th July 2025, platforms now have a legal duty to protect children"
https://www.peterkyle.co.uk/blog/2025/07/25/online-safety-ac...
I don't understand why people are taking issue with that. Peter Kyle is the minister who delivered the measures from the bill that a lot of people are angry about, and this latest issue on X is just another red flag that the bill is poorly worded and poorly thought out, putting more emphasis on ID age checks for citizens than on actually stopping any abuse. Peter Kyle is the one who called out objections to the bill as being on the "side of predators". And Peter Kyle, despite having moved department, is now somehow the one commenting on this issue.
I'm totally happy to call out the Tories, and the prior ministers who worked on the Bill/Act, but Kyle implemented it, made reckless comments about it, and is now trying to look proactive about an issue the Act covers yet applies to so ineffectively.
chrisjj•28m ago
It does not empower platform suspension for bikinification.
And there's as yet no substantiation of your claim that Grok produces CSAM.