It is amazing how shamelessly these LLM thieves argue.
Paraphrasing that: it is amazing how much money these shameless LLM thieves have.
The problem with the AI industry is that they took the scientific exception intended for the betterment of mankind and then tried to turn it into a profit model. Many of their base models are borderline legal, but using those to make money requires some kind of regulation or deal.
I expect this to get resolved by having mandatory AI royalty fees that are added to every AI subscription, to be handed out to the creative industry. It's how my country and a few others responded when tape made it possible to record songs from the radio. It'll satisfy the huge companies (because they'll be paid more the more works they have) and spit in the face of smaller creatives or mere hobbyists.
It'll cost a couple of billion to grease the wheels, but if these companies work the same way the cryptocurrency industry paid off the Trump campaign, it won't be a difficult problem to solve.
I'm not an AI optimist or booster, but AI is an advancement in the arts and sciences, despite all the risks and downsides. AI is a derivative work of all digitized media.
There's an argument to be made that the AI companies should borrow a copy of every book, rent every movie, etc. But the money accruing to the owners of those copyrights would be marginal, and even summing over all those individual copyright claims, I'd say that the societal benefit to AI is greater.
Maybe it's time that we have compulsory licensing the way that radio can license music to play. As training data for AI, a compulsory license, in principle, should be quite cheap, on par with renting that media.
The bigger question is to what extent AI will tend to make other media obsolete. We are already seeing this with AI summaries in web search undermining the search results themselves. I don't have an answer to that, except to say that severely restricting the training data available to AI is not very helpful.
I fear this will be forced on us all.
I fear it because right now, it's already true that if you object to your works being used to train AI, then you can't publish your works (especially not on the web). A growing number of people are going that route, reducing the works available to us. But there is still a sliver of hope that a solution could be found at some point and it would be safe to publish again. If compulsory licensing happens, then even that small hope is lost.
But that's unlikely to happen, because any kind of compulsory licensing scheme that could allow creators to actually survive would still be cripplingly expensive for AI companies, given the number of works they have devoured under the assumption that all the world is theirs to take...and they clearly have the ear of the current administration.
Corn subsidies are a deal society made to advance the consumption of sugar water.
Who really makes these deals and who benefits from them?
Maybe having so much new content all the time is making us too content while the backroom deal makers laugh their way to the bank and fund more wars.
> and sciences
People en masse want cheaper energy, better machines and better medicine regardless of profitability of the producers. I don't buy into the idea that the only way to incentivize new inventions is with fame and wealth. People like to do good work they are proud of - not enough people, arguably, but they exist.
Some inventions may be too powerful to go without strict regulations like nuclear energy. I'd argue AI is in the same basket. I believe the internet, and by extension internet connected AI, should be considered a public utility and governed by the public.
If I record every song played on the radio for years to a digital file, will I be charged? You know I would be.
How is that different from what AI is doing? In the US, companies are considered persons, so to me, off to jail these companies go. Why is this turning into a big deal? We know AI is stealing copyrighted data, so I hope they get what they deserve.
Which in a way is basically the UBI they claim to want anyway.
Towaway69•3h ago
Nice product you have there, shame if it got sued for copyright infringement.
JohnFen•2h ago
Equally, every author should also have the right to not have their work ingested by AI.
jart•2h ago
However you'd have to delist yourself from search engines to fully prevent AIs from reading the content on your website.
JohnFen•2h ago
It most certainly does not. robots.txt is almost totally worthless against genAI crawlers. Even being unindexed from search engines doesn't keep you safe.
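(For context on what "almost totally worthless" means here: opting out is expressed through robots.txt directives like the sketch below. GPTBot and CCBot are publicly documented crawler user-agents for OpenAI and Common Crawl respectively; honoring these rules is entirely voluntary on the crawler's part, and undocumented or spoofed crawlers are not covered at all.)

```
# robots.txt — a sketch of a typical AI-crawler opt-out.
# Compliance is voluntary; nothing here technically blocks access.

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```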
jlarocco•2h ago
Do people really own land? Or is it just a legal fiction to make things easier?
Is a contract really a thing? Or is it a legal fiction?
Times change. IP is a thing now. It's kinda funny to see somebody on a software developer forum arguing IP doesn't exist.
logsr•2h ago
IP law recognizes this ground truth and creates a legal framework that allows IP to be traded in the economy which creates an incentive for people to share their IP.
brentm•2h ago
That being said, I don't know if I agree that the AI companies are violating IP rights.
dragonwriter•1h ago
The question for any category of property is whether it is a socially useful construct or not, not whether it has a basis in nature.