Sounds like the kind of system small companies can't implement and large companies won't care to implement.
Or the sort of thing bigger companies lobby for to make the entry barriers higher for small competitors. Regulatory capture like this is why companies above a certain size/profit level tend to swing in favour of regulation, when they wanted nothing to do with it while they were still "disrupting".
mschuster91•1h ago
You already need point a) in place to comply with EU laws and directives (DSA, anti-terrorism [1]) anyway, and I think the UK has anti-terrorism laws with similar wording, as does the US with its CSAM laws.
Point b) is already required if you operate in Germany; there have been a number of court rulings holding that platforms must take down repeat uploads of banned content [2].
Point c) makes sense; it's time to crack down hard on "nudifiers" and similar apps.
Point d) is the one I have the most issues with, although even that is nothing new: unmasking users via a barely fleshed-out subpoena or a dragnet order has been a thing for many years now.
This thing impacts gatekeepers, so not your small mom-and-pop startup but billion-dollar companies. They can afford to hire proper moderation staff to handle such complaints; they just don't want to because it impacts their bottom line - at the cost of everyone affected by AI slop.
[1] https://eucrim.eu/news/rules-on-removing-terrorist-content-o...
[2] https://www.lto.de/recht/nachrichten/n/vizr6424-bgh-renate-k...
mschuster91•1h ago
What did you expect governments to do in the face of rising public pressure and the platforms' inaction?
The EU in particular had more than enough patience. I can't count how often individual countries, and then the EU itself, told Meta, Twitter, Google/YouTube et al. to clean up shop or else; they decided to ignore it or do less than the bare minimum in response... and now they cry as the EU has finally shown some fangs and the US is following suit (although, I'll admit, for entirely the wrong reasons).
When industries want to self-regulate, they can, but they actually have to make an effort, because when stuff goes south, the regulation that results will inevitably be much harsher.
ricardobeat•36m ago
These rules have serious consequences for privacy, create real potential for abuse, and raise the barrier to entry immensely for new companies.
The problem is quite obvious when you consider that Trump supporters label anything they dislike as fake news, even when the facts are known and available to everyone. These rules would allow any opposition to be easily silenced. Restricting the measures to terrorism, illegal pornography, and other serious crimes would be more acceptable.
Your question is like asking “why don’t we have metal detectors and body scanners in every school and public building?” Just because you can, and it would absolutely increase safety, doesn't mean it's a good idea.
IMO legislation should focus on holding individuals responsible and prosecuting them when they break the law, not on mandating that tech companies become arms of a nanny state.
benchly•57m ago
Try dialogue.
pjc50•52m ago
Especially at a time when the US is becoming increasingly authoritarian.
anon0502•44m ago
> not your small mom-and-pop startup
Not sure why you said this; it's the artists / content makers who suffer.
marcus_holmes•24m ago
So while you can compare the two (the DMCA and these rules), it's not an apples-to-apples comparison. You need to squint a bit.
The DMCA has proven to be way too broad, but there's no appetite to change that because it's very useful for copyright holders, and only hurts small content producers/owners. This looks like it's heading the same way.
> This thing impacts gatekeepers, so not your small mom-and-pop startup but billion dollar companies.
I don't see any exemptions for small businesses, so how do you conclude this?
[0] https://www.grcworldforums.com/risk/bridging-global-business... mentions this, but I couldn't find a better article specifically addressing the differences in approach.