In his follow-up post he talks about open-sourcing his old games as a gift, and says he doesn't much care how people receive that gift, just that they do.
He doesn't acknowledge that Anthropic, OpenAI, etc, are profiting while the original authors are not.
The original authors most of the time didn't write the software to profit. But that doesn't mean they don't care if other people profit from their work.
It's odd to me that he doesn't acknowledge this.
AI provides an offramp for people to disengage from social coding. People don't see the point because they still don't understand the difference between barely getting something to work and meaningfully improving that thing with new ideas.
What makes this more objectionable than profiting off open source projects by using them directly? E.g. tech giants using Linux as a server OS rather than having to pay Microsoft thousands per server for a Windows Server license. With the original GPL, they don't even have to contribute back any patches.
With AI, the link is not clear at all. It's just pure consumption. There is no recognition.
I've never written or contributed to open source code with this being the goal. I never even considered this is why people do it.
(edit: the comment I replied to was edited to be more a statement about themselves rather than a question about other developers, so my comment probably makes less sense now)
I can brag if Netflix is using my X or Facebook runs all their stuff with my Y. That can help me land consulting gigs, solicit donations, etc.
I'm on both sides. I've contributed to open source. I use AI both in my personal projects now and to make money for my employer.
I'm still not sure how I feel about any of it, but to me the bigger problem is the division between capital and labor and the growing wealth inequality divide.
What's the point of a gift if the receiver isn't allowed to benefit/profit from it?
For instance, do you think Linus is upset that ~90% of all internet servers are running his OS, for profit, without paying him?
Of course he isn't, that was the point of the whole thing!
Are you upset Netflix, Google, and heck, even Microsoft are raking in millions from services running on Linux? No? Of course you aren't. The original author never expected to be paid. He gave the gift of open source, and what a gift it is!
If you take my gift and profit, it doesn't hurt me; there were no strings. Your users presumably benefit from the software I wrote, unless you're using it for evil, but I don't have enough clout to use an "only IBM may use it for evil" license. You benefit from the software I wrote. I've made the world a better place and I didn't have to market or support my software; win-win.
I've done plenty of software for hire too. I've used plenty of open source software for work. Occasionally, I've been able to contribute to open source while working for hire, which is always awesome. It's great to be paid to find and fix problems my employer is having and be able to contribute upstream to fix them for lots more people.
Not all code is licensed that way. Some open-source code had strings attached, but AI launders the code and makes them moot.
A: you made this as a free gift to anyone, including OpenAI. B: you made this to profit yourself in some way.
The argument he makes is: if it was the second one, don't do open source?
It does kill a ton of open-source companies though, and the truth is that that model of operating is not going to work in this new age.
It's also sad because it means the whole system will collapse. The process that made him famous can no longer be followed. Your open source code will be used by countless people and they will never know your name.
It's not called a disruptive tech for nothing. Can't un-open-source all that code without lobotomizing every AI model.
Obviously LLMs are new and nobody knew they would happen. But the part where most popular OSS willfully committed to broad for-profit use is not new.
He says it's a gift, and if people do whatever, he doesn't care; he already gave it away.
I think it's interesting that nobody would cry that Fabien should shovel cash from his book sales towards Carmack, nor should those who learned how to code by reading source owe something to the authors beyond gratitude and maybe a note here and there.
Even things like Apple's new implementation of SMB, which is a clean-room rewrite of GPLv3 Samba, but likely still leans on the years and years of experience and documentation about the SMB protocol.
There were source-available licenses that forbade commercial use. The Free Software Definition and the Open Source Definition both say a license must allow any use.
Either it works and the AI makers stop slurping up OSS, or it doesn't hold up in court and shrinkwrap licenses are deemed bullshit. A win/win scenario if you ask me.
I’m against AI art because it is built on stealing the work of artists who did not consent to their work being trained on.
I couldn’t care less about models trained on the open source software I released, because I released it to be used.
edit: I’m assuming licenses were respected
I don't ask anyone to share my ideals but conflating these two is dishonest.
- OSS is valuable for decentralizing power and influence
- AI as it is being developed is likely to centralize it
Depends on how you see it.
I know many people building OSS, local alternatives to enterprise software for specific industries that costs thousands of dollars, all thanks to AI.
If everyone can produce software now, and at a much bigger and more complex scale, it's much easier to create decentralized and free alternatives to long-standing closed projects.
He can easily afford to be altruistic in this regard.
But Carmack isn't wired for empathy; he has never been.
What an utterly pretentious and rude thing to say.
That's not true. There are business models around open source, and many companies making money from open source work.
(I'm only reacting to this specific part of your comment)
Not only are there businesses built around open-source work, but it used to be widely-accepted that publishing open-source software was a good way to land a paying gig as a junior.
I think that whether you need to continue working to afford to live is very relevant to discussions about AI.
Profits don't need to be direct - and licenses are chosen based on a user's particular open-source goals. AI does not respect code's original licensing.
What's your point here? Because whether or not someone needs income to pay their bills is MASSIVELY relevant to whether or not they have to care about the profit on their work.
The bulk of Open Source maintainers aren't "set for life", and need to get a real job in order to not be homeless.
But the man's argument is that since he sees something a given way then it's the truth. What people are doing in return is showing that he can only do so because of who he is.
If you want to make money, use a proper license.
To expand on this, GPL is not against capitalism either. Sometimes, end-users' freedom with their hardware is good to make money on (they buy your support, to have confidence they can migrate from one piece of hardware to another, or use their hardware way longer than the original manufacturer can stay in business). But it is not an automatic license to say "give me your money" either.
Anti-AI sentiment comes primarily from slop PRs (and slop projects) along with the water use hoax; copyright concerns originate almost entirely from the art sphere, crossing over into the open source sphere by osmosis and only representing a small minority of opinion-havers therein.
Fine for him, but it's totally reasonable for people to want to use the GPL and not have it sneakily bypassed using AI.
The license was supposed to make derivative work feed back into improving the software itself, not to allow it to be used to create competing software.
Many of those developers are disappointed with leading free software / open source advocates such as Stallman for not taking a stance against the AI companies' practices.
https://youtu.be/ucXYWG0vqqk?t=1889
I find him speaking really soothing.
This is demonstrably incorrect given how LLMs are built, and he should retire instead of trolling people who still care about workmanship. =3
"A Day in the Life of an Ensh*ttificator"
- Sharing/working on something for free with the hope that others like it and maybe contribute back.
- Sharing something for free so that a giant corporation can make several trillion dollars and use my passion to train a machine for (including, but not limited to) drone striking a school.
Open sourcing code is a form of power, power to influence, inspire, and propagate one's worldview on whomever reads that code. Thank you OpenAI, Anthropic, Meta, thank you for amplifying the voices of all us open source contributors!
I respect Carmack so much more now. I always scratched my head over why he made Quake GPL. It was such a waste. Now it doesn't matter anymore. I'm so thankful copyleft is finally losing its teeth. It served its purpose 30 years ago; we don't need it anymore.
It seems like Carmack, like a lot of tech people, has forgotten to ask the question: who stands to benefit if we devalue the US services economy broadly? Who stands to lose? It seems like a lot of these people are assuming AI will be a universal good. It is easy to feel that way when you are independently wealthy and won't feel the fallout.
Even a small % of layoffs of the US white collar work force will crash the economy, as our economy is extremely levered. This is what happened in 2008: like 7% of mortgages failed, and this caused a cascade of failures we are still feeling today.
what examples are you thinking of?
Training an AI on GPL code and then having it generate equivalent code that is released under a closed source license seems like a good way to destroy the copy-left FOSS ecosystem.
Also, the take is buckwild. The primary overlap of AI and open source seems to be: 1. Open source maintainers being overwhelmed by a deluge of AI slop, making their jobs harder and making open source software worse. 2. People trying to use AI to circumvent the GPL's protections, by making an AI copy of the open source project that they argue is acceptable to put under a less altruistic license.
So when he says "AI training on the code magnifies the value of the gift", what planet is he living on?
I think this debate is mainly about the value of human labor. I guess when you're a millionaire, it's much easier to be excited about human labor losing value.
I can understand his stance on AI given this perspective. I have a harder time empathizing with his frustrations. Did he also have a hard time coming to terms with the need for the AGPL?
MIT asks for credit. GPL asks for credit and GPL'ing of things built atop it. The Unlicense is a free gift, but it is a minority.
AI reproduces code while removing the credit and copyleft from it, and this is the problem.
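Concretely, here's a minimal sketch of what gets laundered away (the author name and snippet are hypothetical): MIT's one real ask is that the notice travel with the code, yet a model can emit the function below near-verbatim with the header gone.

```python
# Hypothetical MIT-style header; "Example Author" and this snippet
# are made up for illustration only.
#
# Copyright (c) 2020 Example Author
#
# Permission is hereby granted, free of charge, to deal in this
# software without restriction, PROVIDED THAT the above copyright
# notice and this permission notice are included in all copies or
# substantial portions of the software.

def clamp(x, lo, hi):
    """Trivial utility: the kind of snippet a model reproduces
    while dropping the attribution notice above."""
    return max(lo, min(x, hi))
```

The logic survives training; the only part the license actually demands be preserved does not.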
Edit: I'm also thinking of what he did rewriting all of Symbolics' code for Lisp machines
(similar to the person who accidentally hacked every vacuum from a certain manufacturer while trying to gain access to his own robot vacuum? https://www.theguardian.com/lifeandstyle/2026/feb/24/acciden...)
In a world without copyright, code obfuscation, or compilers, where everything ran interpreted as it was written and nobody could do anything to you if you modified it, Stallman would be perfectly content.