The hypocrisy really grinds my gears.
And a monetary fine is just the cost of doing business if you are big enough.
This is bullshit and you know it. No one would ever want to go to the trouble, expense, and misery of enforcing laws where the so-called victims do not feel wronged and refuse to press complaints to the authorities. And so, copyright enforcement will only ever occur where the rightsholders wish to enforce it. The few lawsuits you see here and there aren't about a genuine sentiment of being wronged, but about the rightsholders wanting their cut. Backroom deals are already being drawn up, and everyone's cool with it.
This is the status quo. If you don't like it, suggest something better, but don't be naive that the law as it stands now could be applied sanely or pragmatically.
bpodgursky•3h ago
Sorry but this is just a competitive reality and the content matters A LOT. Sucks that Elsevier gambled badly on the scientific community putting up with overpriced subscriptions forever, but their concerns can't dictate national policy on this.
kmeisthax•2h ago
Also, if we're going to bin the entire concept of copyright, can we at least be equal about it? I'd rather not live in a world where humans labor for the remnants of their culture in the content mines while clankers[0] feast on an endless stream of training data.
[0] Fake racial slur for robots or other AI systems.
zzo38computer•2h ago
Nevertheless, I think there is another argument against LLM training: the scraping seems to be excessive (although it could be made less so; there are many ways to help with that), and I think it requires too much power (although I don't really know a lot about it).
These are two separate issues, though.
jruohonen•1h ago
You know, it is really the CC-BY style that most science people care about. The same goes for the MIT/BSD open source licenses, while the GPL, I suppose, sits on the side of CC-BY-SA.
arjie•24m ago
And if I'm being honest, I'm tired of the International Brotherhood of Stevedores[0] style of shredding human productivity to protect some special interest group. If Elsevier died tomorrow, we'd lose a curation function for scientific papers, true, but we wouldn't lose the science itself. And while curation of scientific output is clearly valuable - China is suffering from the lack of it while producing prodigious science - I think it's far less important than the scientific output itself. This is especially true of US science.
0: IBS, the AMA, pharmacists, teacher unions, firefighter unions, tax preparers: the distributed cost to society is huge because we decided on protecting these special interest groups. Blocking AI would be a bridge too far.