In his follow-up post he talks about open-sourcing old games as a gift, and he doesn't much care how people receive that gift, just that they do.
He doesn't acknowledge that Anthropic, OpenAI, etc, are profiting while the original authors are not.
The original authors most of the time didn't write the software to profit. But that doesn't mean they don't care if other people profit from their work.
It's odd to me that he doesn't acknowledge this.
AI provides an offramp for people to disengage from social coding. People don't see the point because they still don't understand the difference between barely getting something to work and meaningfully improving that thing with new ideas.
The whole point of contributing to open source is to make decisions and the code is the medium.
What makes this more objectionable than profiting off open source projects by using them directly? E.g. tech giants using Linux as a server OS rather than having to pay Microsoft thousands per server for a Windows Server license? With the original GPL, they don't even have to contribute back any patches.
With AI, the link is not clear at all. It's just pure consumption. There is no recognition.
I've never written or contributed to open source code with this being the goal. I never even considered this is why people do it.
(edit: the comment i replied to was edited to be more a statement about themselves rather than a question about other developers, so my comment probably makes less sense now)
I can brag if Netflix is using my X or Facebook runs all their stuff with my Y. That can help me land consulting gigs, solicit donations, etc.
I'm on both sides. I've contributed to open source. I use AI both in my personal projects now and to make money for my employer.
I'm still not sure how I feel about any of it, but to me the bigger problem is the division between capital and labor and the growing wealth inequality divide.
That quote is about inspiration, not just using others' work or style.
T. S. Eliot's version from 1920 put it best imho:
> Immature poets imitate; mature poets steal; bad poets deface what they take, and good poets make it into something better, or at least something different. The good poet welds his theft into a whole of feeling which is unique, utterly different from that from which it was torn; the bad poet throws it into something which has no cohesion.
What's the point of a gift if the receiver isn't allowed to benefit/profit from it?
For instance, do you think Linus is upset that ~90% of all internet servers are running his OS, for profit, without paying him?
Of course he isn't, that was the point of the whole thing!
Are you upset Netflix, Google, and heck, even Microsoft are raking in millions from services running on Linux? No? Of course you aren't. The original author never expected to be paid. He gave the gift of open source, and what a gift it is!
Not exactly. You can modify Linux and run it yourself all you want without obligation to share your changes. The sharing requirements are more limited and involve distribution.
Prominent examples include Sony's PlayStation and Apple's OS X.
It's not an unconditional gift, it's got strings attached.
AI training on GPL works is basically IP laundering: you're taking the product without paying the asking price.
If you take my gift and profit, it doesn't hurt me; there were no strings. Your users presumably benefit from the software I wrote, unless you're using it for evil, but I don't have enough clout to use an "only IBM may use it for evil" license. You benefit from the software I wrote. I've made the world a better place, and I didn't have to market or support my software; win-win.
I've done plenty of software for hire too. I've used plenty of open source software for work. Occasionally, I've been able to contribute to open source while working for hire, which is always awesome. It's great to be paid to find and fix problems my employer is having and be able to contribute upstream to fix them for lots more people.
That'd be far more believable if it weren't for the fact that the vast majority of the research is publicly funded for those drug companies. They have no issue selling their drugs for less money in other markets while still turning a profit. And there's absolutely no indication they'd cease to exist on just outrageous profits rather than "crippling entire economies" level profits.
Not all code is licensed that way. Some open-source code has strings attached, but AI launders the code and makes them moot.
There are binary files that some companies are allowing you to download, for now. It was called shareware in the old days.
One day the tap will close, and then we'll see what "open models" really means.
For my own purposes, open weights are 95% as good, to be honest. I understand that not everyone will agree with that. As long as training takes hundreds of millions of dollars' worth of somebody else's compute, we're always going to be at the big companies' mercy to some extent.
At some point they will start to restrict access, as you suggest, and that's the point where the righteous indignation displayed by the neo-Luddites will be necessary and helpful. What I advocate is simply to save up enough outrage for that battle. Don't waste your passion defending legacy copyright interests.
At that point it will be far, far, faaaaar too late.
> Don't waste your passion defending legacy copyright interests
The companies training big models are actively respecting copyright from anyone big enough to actually fight back, and soaking everyone else.
They are actively furthering the entrenchment of Big IP Law.
I use AI every day in my dev workflows, yet I am still easily able to empathize with those who did not intend for their code to be laundered through AI to remove their attribution (or whatever other caveats applied in their licensing.)
Disney saw which way the wind was blowing and invested over a billion into OpenAI.
Tech is becoming universally hated, whereas before it was adored and treated with optimism.
If someone published something as MIT and doesn't like it being used for LLM training, yeah that person can only blame themselves.
For GPL, it all depends on whether you consider an LLM "derivative software" of the GPL code it was trained on. It's fair to have an opinion on that either way, but I don't think it's fair to treat that opinion as the obvious truth. The same applies to art: a lot of it is visible on the Internet, but that doesn't make it "a gift".
> The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
My personal thought on that: it's going to be almost guaranteed that, if an LLM is producing stuff it clearly derived from a certain piece of code XYZ, it will also be capable of producing the correct answer to the question "what's the license for XYZ?" And lawyers will successfully argue that this counts as "included".
The giving back part is strongly related to the "freedom", not related to whether you profit from it or not.
My opinion is that it actually hurts everyone when the open source commons are looted for private profits
My motivations are very different: the projects I authored and maintained were deliberately all GPL-licensed, my contributions to other OSS was motivated by helping other people - not to an amorphous "world."
A: you made this as a free gift to anyone, including OpenAI.
B: you made this to profit yourself in some way.
The argument he makes is: if you did the second one, don't do open source?
It does kill a ton of open source companies though, and the truth is that model of operating is not going to work in this new age.
It's also sad because it means the whole system will collapse. The process that made him famous can no longer be followed. Your open source code will be used by countless people, and they will never know your name.
It's not called a disruptive tech for nothing. Can't un-open-source all that code without lobotomizing every AI model.
Obviously LLMs are new and nobody knew that they would happen. But the fact that most popular OSS willfully committed to broad for-profit use is not new.
He says it's a gift, and if people do whatever, he doesn't care; he already gave it away.
I think it's interesting that nobody would cry that Fabien should shovel cash from his book sales towards Carmack, nor should those who learned how to code by reading source owe something to the authors beyond gratitude and maybe a note here and there.
Even things like Apple's new implementation of SMB, which is a clean-room rewrite with respect to GPLv3 Samba, but likely still leans on the years and years of experience and documentation about the SMB protocol.
That's his choice and I assume he licensed his code accordingly. That doesn't mean that the choices of others who used different licenses are invalid.
There were source-available licenses that prohibited commercial use. The Free Software Definition and the Open Source Definition said a license must allow any use.
Either it works and the AI makers stop slurping up OSS, or it doesn't hold up in court and shrinkwrap licenses are deemed bullshit. A win/win scenario if you ask me.
How is this different than any company that uses the open source software?
I find this argument hard to swallow. If open source contributors want to profit from their code being used and prevent big companies from using it or learning from it, open sourcing it would be an irrational choice.
Recognition for the authors, which can lead to all sorts of opportunities. "Netflix uses my X for their Y, worldwide" opens doors.
Not a community-developed project with a lot of contributors, but a software that would realistically qualify as being mostly attributable to one person?
Redis is an easy example, but the author of that doesn't need to say "Netflix uses my X" because the software is popular by itself. AI being trained on Redis code hasn't done anything to diminish that, as far as I can tell.
FAANG specifically? No, I am not familiar with their entire tech stacks.
But I have leaned on my single-developer projects (being used in other software not owned by me) to help land consulting gigs.
I've noticed this thing where people who have decided they are strongly "anti-AI" will just parrot talking points without really thinking them through, and this is a common one.
Someone made this argument to me recently, but when probed, they were also against open weights models training on OSS as well, because they simply don't want LLMs to exist as a going concern. It seems like the profit "reason" is just a convenient bullet point that resonates with people that dislike corporations or the current capitalist structure.
Similarly, plenty of folks driving big gas guzzling vehicles and generally not terribly climate-focused will spread misinformation about AI water usage. It's frankly kind of maddening. I wish people would just give their actual reasons, which are largely (actually) motivated by perceived economic vulnerability.
This doesn't make sense. You make something and put out there, for free, of your own will. Why do you care if someone takes it and makes a profit? Shouldn't you have taken that profit route yourself before if that's what you wanted?
You basically are looking at a contract and saying you aren't going to agree to the terms but you're taking the product anyway.
If you publish a cookbook, you should get a portion of the sales of the cookbook itself, and no one should be allowed to distribute copies of it for free to undermine your sales.
What you don't get is a portion of the revenues of restaurants that use your recipes!
I’m against AI art because it is built on stealing the work of artists who did not consent to their work being trained on.
I couldn’t care less about models trained on the open source software I released, because I released it to be used.
edit: I’m assuming licenses were respected
Licenses were not respected. Most open source licenses require credit at least.
I don't ask anyone to share my ideals but conflating these two is dishonest.
- OSS is valuable for decentralizing power and influence
- AI as it is being developed is likely to centralize it
Depends on how you see it.
I know many people building OSS, local alternatives to enterprise software for specific industries that costs thousands of dollars, all thanks to AI.
If everyone can produce software now, at a much bigger and more complex scale, it's much easier to create decentralized and free alternatives to long-standing closed projects.
He can easily afford to be altruistic in this regard.
But Carmack isn't wired for empathy; he has never been.
What an utterly pretentious and rude thing to say.
And that is a big reason why he's making this post, is what I'm saying. It doesn't excuse him, but it's not surprising in the least.
Can you give some examples, outside of this post? I only know about Carmack by the things he'd worked on, but not anything personal like this. This would help me get a more complete picture of him.
That's not true. There are business models around open source, and many companies making money from open source work.
(I'm only reacting to this specific part of your comment)
The point is that most individuals who open source their code do so without expecting financial returns from it. In that context, whether Carmack has a $1 or $1e9 doesn’t make a difference.
Not only are there businesses built around open-source work, but it used to be widely-accepted that publishing open-source software was a good way to land a paying gig as a junior.
I think that whether you need to continue working to afford to live is very relevant to discussions about AI.
Profits don't need to be direct - and licenses are chosen based on a user's particular open-source goals. AI does not respect code's original licensing.
What's your point here? Because whether or not someone needs income to pay their bills is MASSIVELY relevant to whether or not they have to care about the profit on their work.
The bulk of Open Source maintainers aren't "set for life", and need to get a real job in order to not be homeless.
Ah, how naive. You're not squinting hard enough.
But the man's argument is that since he sees something a given way, it must be the truth. What people are doing in return is showing that he can only see it that way because of who he is.
GPL is transactional. The author's profit is in the upstreaming of enhancements.
Those who release under GPL absolutely do care about profit, it's just that the profit is measured in contributions.
saying he has no empathy, and has never had empathy, on the other hand...
If you want to make money, use a proper license.
To expand on this, the GPL is not against capitalism either. Sometimes, end users' freedom with their hardware is good to make money on (they buy your support to have confidence they can migrate from one piece of hardware to another, or use their hardware far longer than the original manufacturer can stay in business). But it is also not an automatic license to say "give me your money."
Anti-AI sentiment comes primarily from slop PRs (and slop projects) along with the water use hoax; copyright concerns originate almost entirely from the art sphere, crossing over into the open source sphere by osmosis and only representing a small minority of opinion-havers therein.
Fine for him, but it's totally reasonable for people to want to use the GPL and not have it sneakily bypassed using AI.
The license was supposed to make derivative work feed back into improving the software itself, not to allow it to be used to create competing software.
Many of those are disappointed with leading free software / open source advocates such as Stallman for not taking a stance against the AI companies' practice.
Should we protect developers and their rights? Surely, and users' rights too definitely. But protecting source-code as such seems a bit abstract to me.
Carmack's argument makes no sense, but I guess it has "Carmack" in it so obviously it must be on the front page of HN.
https://youtu.be/ucXYWG0vqqk?t=1889
I find him speaking really soothing.
This is demonstrably incorrect given how LLMs are built, and he should retire instead of trolling people who still care about workmanship. =3
"A Day in the Life of an Ensh*ttificator"
- Sharing/working on something for free with the hope that others like it and maybe contribute back.
- Sharing something for free so that a giant corporation can make several trillion dollars and use my passion to train a machine for (including, but not limited to) drone striking a school.
Open sourcing code is a form of power, power to influence, inspire, and propagate one's worldview on whomever reads that code. Thank you OpenAI, Anthropic, Meta, thank you for amplifying the voices of all us open source contributors!
I respect Carmack so much more now. I always scratched my head over why he made Quake GPL. It was such a waste. Now it doesn't matter anymore. I'm so thankful copyleft is finally losing its teeth. It served its purpose 30 years ago; we don't need it anymore.
It seems like Carmack, like a lot of tech people, has forgotten to ask the question: who stands to benefit if we devalue the US services economy broadly? Who stands to lose? It seems like a lot of these people are assuming AI will be a universal good. It is easy to feel that way when you are independently wealthy and won't feel the fallout.
Even a small % of layoffs of the US white collar work force will crash the economy, as our economy is extremely levered. This is what happened in 2008: like 7% of mortgages failed, and this caused a cascade of failures we are still feeling today.
what examples are you thinking of?
Look up:
- The Haymarket Affair
- The Homestead Strike
- The Triangle Shirtwaist Factory Fire
- The Ludlow Massacre
- The Battle of Blair Mountain
You could also simply have taken the quote you were responding to and run it through a few LLMs to acquire those examples.
A major economic crash as the only consequence would be the good ending.
The real societal risk here is that software development is not just a field of primarily white men; it was one of the last few jobs that could reliably get one homeownership and an (upper) middle class life.
And the current US government is not, shall we say, the most liberal. There is a substantial risk that when faced with the financial destitution of being unemployed while your field is dying, people will radicalize.
It takes a good amount of moral integrity to be homeless under a bridge and still oppose the gestapo deporting the foreigners who have jobs you'd be qualified for. And once the deportations begin, I doubt they'll stop with only the H1Bs. The Trump admin's not exactly been subtle about their desire to undo naturalizations and even birthright citizenship.
https://news.ycombinator.com/item?id=47115597
The US is built on-top of a high value service economy. And what we're doing is allowing a couple companies to come in, devalue US service labor, and capture a small fraction of the prior value for themselves on top of models trained on copyrighted material without permission. Of course, to your point: things can get a lot worse than that. I honestly don't think a lot of executives even know how much they're shooting themselves in the foot because they seem unable to think beyond the first order.
I also see a lot of top-1% famous or semi-famous engineers totally ignoring the economic realities of this tech, people like Carmack, Simon Willison, Mitchell Hashimoto, Steve Yegge, Salvatore Sanfilippo, and others. They are blind to the suffering these technologies could cause, even if it proves temporary. Sure, it's fun, but weekend projects are irrelevant when people cannot put food on the table. It's been really something to watch them and a lot of my friends from FAANG totally ignore this side. It is why identity matters when people make arguments.
I also think I'm insulated partially from the likely initial waves of fallout here by nature of a lucky and successful career. I would love it if the influential engineers I mentioned above stopped acting like high modernists and started taking the social consequences of this technology seriously. They could change a lot more minds than I could. And they could ensure through that advocacy for labor that we see the happiest ending with respect to rolling out LLMs.
Unfortunately I don't really believe labor has much teeth anymore, and tech will wake up too late to do anything about it.
Training an AI on GPL code and then having it generate equivalent code that is released under a closed source license seems like a good way to destroy the copy-left FOSS ecosystem.
I think this debate is mainly about the value of human labor. I guess when you're a millionaire, it's much easier to be excited about human labor losing value.
I can understand his stance on AI given this perspective. I have a harder time empathizing with his frustrations. Did he also have a hard time coming to terms with the need for the AGPL?
MIT asks for credit. GPL asks for credit and GPL'ing of things built atop it. The Unlicense is a free gift, but it is a minority.
AI reproduces code while removing credit and copyleft from it, and this is the problem.
It is far healthier to see it as a collaboration. The author publishes the software with freedoms that allow anyone to not only use the software, but crucially to modify it and, hopefully, to publish their changes as well so that the entire community can benefit, not just the original author or those who modify it. It encourages people to not keep software to themselves, which is in great part the problem with proprietary software. Additionally, copyleft licenses ensure that those freedoms are propagated, so that malicious people don't abuse the system, i.e. avoiding the paradox of tolerance.
Far be it from me to question the wisdom of someone like Carmack, but he's not exactly an authority on open source. While id has released many of their games over the years, this is often a few years after the games have stopped being commercially relevant. I guess it makes sense that someone sees open source as a "gift" they give to the world after they've extracted the value they needed from it. I have little interest in what he has to say about "AI", as well.
Hey John, where can I find the open source projects released by your "AI" company?
Ah, there's physical_atari[1]. Somehow I doubt this is the next industry breakthrough, but I won't look a gift horse in the mouth.
Where and when? In cases where LLM coding assistants reproduce copyleft code in someone's work assignment? The responsibility in those would be on the user, not on AI.
That is, in fact, OSS. Open source does not mean, and has never meant, ongoing development nor development with the community.
[0] https://en.wikipedia.org/wiki/Open-source_software_developme...
[1] https://en.wikipedia.org/wiki/The_Cathedral_and_the_Bazaar
Keyword being "can"
The Wikipedia page you linked to refers to "Open-source software development (OSSD)" which implies that it's a different concept than "open source" by itself
> The conferees believed the pragmatic, business-case grounds that had motivated Netscape to release their code illustrated a valuable way to engage with potential software users and developers, and convince them to create and improve source code by participating in an engaged community. The conferees also believed that it would be useful to have a single label that identified this approach and distinguished it from the philosophically- and politically-focused label “free software.”
From the beginning it was about promoting the model of developing software in an open community. The licensing is a means to that, but the motivating idea is to have open-source development.
And Netscape’s release of the source code, which led to Mozilla, was prompted by the “bazaar” ideas presented by ESR.
You're right and it's worth pointing out that a lot of open source has the opposite lifecycle: the authors make a thing, aren't sure how to sell it, so they open source it and hope to eventually sell something peripheral, i.e. "open core" with paid plugins or enterprise support.
In these cases, open source isn't a gift so much as a marketing strategy. So it makes sense the maintainers wouldn't see LLM training on their code as a good thing; it was never a "gift", it was a loss leader.
I break down what you said as: "Sure, he's released code with an open-source license, but that's not real open source in the sense that matters."
I happen to disagree. OSS is OSS. AGPL is OSS. MIT is Open Source. Unlicense is OSS.
(I do agree that it's still OSS even if you never maintain it or anything.)
I agree there's a difference between publishing code under an OSS license and actively maintaining a project while fielding the flood of low-quality AI issues and PRs. Someone in the latter category is obviously closer to that pain.
I still wouldn't go so far as to dismiss Carmack's view on that basis alone, though. It just means his experience is less representative of maintainers dealing with that specific problem every day.
He also started an AI company, right?
Meanwhile, in the trenches, rent and bills are approaching 2/3 of the paycheck and food the other 2/3, while at the same time the value of our knowledge and experience is going down to zero (in the eyes of the managerial class).
"AI training magnifies the gift"... sure thing, AI training magnifies a lot of things.
I really can't see a valid reason to be against it, beyond something related to profiting in some way by restricting access, which - I would think - is the antithesis of copyleft/permissively licensed open source.
In the other thread you argued that AI output is not copyrighted.
Do you think I can take proprietary code and launder it through AI to get a non-copyrighted copy of it, then modify it to my needs? How could I obtain the proprietary code legally in the first place?
I think if people want a revshare on things then perhaps they should release under a revshare license. Providing things under open licenses and then pulling a bait-and-switch saying "oh the license isn't actually that you're not supposed to be doing that" doesn't sit right with me. Just be upfront and open with things.
The point of the Free Software licenses is that you can go profit off the software, you just have certain obligations back. I think those are pretty good standards. And, in fact, given the tendency towards The Revshare License that everyone seems to lean towards, I think that coming up with the GPL or MIT must have taken some exceptional people. Good for them.
But if my tool becomes popular and a megacorp uses it to promote their own commercial closed source features alongside it, then that's excessive. That's one reason I like the AGPL; it reduces that. But in my opinion the ideal license is one that limits the freedom to smaller companies: maybe less than 100 or 500 employees, or less than some reasonable amount of revenue (10 million per year? Is that too high or too low?).
And even for those above, I don't want revshare, just pay me something adequate.
Now it feels like the public good is being diminished (enshittification) as they keep turning the "profit" knob, trying to squeeze more and more marginal dollars from the good.
The system still requires the same inputs from us, but gives less back.
The idea is that you have people paid to create something of potential value, but the value of the outputs has only a limited and indirect impact on their compensation. If someone finds the outputs valuable, they should mention it in public, to let the creators use it to demonstrate the value of their work to funders and other interested parties.
Did you respond by asking them how Reddit makes money?
The anti-corporate mentality isn't new, but it does surface in different ways and communities over time. The Reddit hivemind leans very anti-corporate, albeit with a huge blind spot for corporations they actually like (Reddit itself, their chosen phone brand, the corporations that produce the shows they watch).
The Reddit style rebellion is largely symbolic, with a lot of shaming and snark, but it usually stops when it would require people to alter their own behavior. That's why you got dog-piled for doing something productive on a site where user-generated content is the money maker.
The free beer movement came out of UNIX culture, probably influenced by how originally AT&T wasn't able to profit from it.
The problem is that the big tech companies aren't holding up their end of the traditional social contract.
I like to think of the wider open source community as one giant group project. Everyone contributes what they can, and in turn they can benefit from the work everyone else has done. The work you do goes towards making the world a better place. I have absolutely zero problem filing pull requests for bugs I encounter or submitting issues on OpenStreetMap, because I know that in return I get the Linux DE and reliable maps in other towns. If you want to make it political, it's a "from each according to their means, to each according to their needs".
The big tech companies operate completely differently. They see open source contributors primarily as a resource to exploit. Submit a single fix on Google Maps? You'll get zero credit, they'll never stop bothering you with popups about "making improvements", design their map around what is most profitable to show, and they will of course log your location history and sell it to the highest bidder. And they are getting filthy rich off of it as well.
I couldn't care less about getting monetary compensation for some odd work I do in my spare time, but there's no way in hell I'm going to do free labor for some millionaire who's going to reward me by spitting in my face.
Other FOSS developers, not so much. They are the ones who are exploited.
Copyleft licenses are generally intended, AFAICT, to protect the commons and ensure people have access to the source. AI systems seem to hide that. And they contribute nothing back.
Maybe they need updating, IANAL. But I’d be hesitant to believe that everyone should be as excited as Carmack is.
Edit: I'm also thinking of what he did rewriting all of Symbolics code for LISP machines
(Similar to the person who accidentally gained control of every vacuum from a certain manufacturer while trying to gain access to his own robot vacuum? https://www.theguardian.com/lifeandstyle/2026/feb/24/acciden...)
In a world without copyright, code obfuscation, or compilers, where everything ran interpreted as it was written and nobody could do anything to you if you modified it, Stallman would be perfectly content.