
Is legal the same as legitimate: AI reimplementation and the erosion of copyleft

https://writings.hongminhee.org/2026/03/legal-vs-legitimate/
92•dahlia•2h ago

Comments

wccrawford•44m ago
"Antirez closes his careful legal analysis as though it settles the matter. Ronacher acknowledges that “there is an obvious moral question here, but that isn't necessarily what I'm interested in.” Both pieces treat legal permissibility as a proxy for social legitimacy. "

This whole article is just complaining that other people didn't have the discussion he wanted.

Ronacher even acknowledged that it's a different discussion, and not one they were trying to have at the moment.

If you want to have it, have it. Don't blast others for not having it for you.

wizzwizz4•34m ago
Having this discussion involves blasting others for not considering it. Consider the rest of the paragraph you quoted:

> But law only says what conduct it will not prevent—it does not certify that conduct as right. Aggressive tax minimization that never crosses into illegality may still be widely regarded as antisocial. A pharmaceutical company that legally acquires a patent on a long-generic drug and raises the price a hundredfold has not done something legal and therefore fine. Legality is a necessary condition; it is not a sufficient one.

ordu•42m ago
I believe that is a narrow view of the situation. If we take a look at the history, at the reasons for inventing the GPL, we'll see that it was an attempt to fight copyright with copyright. The very name 'copyleft' tries to convey that idea.

What AI is eroding is copyright. You can re-implement not just a GPL program, but also reverse engineer and re-implement a closed-source program; people have demonstrated it already, and there were stories here on HN about it.

AI is eroding copyright, so there may no longer be a need for the GPL. GNU should stop and rethink its stance, chuck away the GPL as the main tool to fight evil software corporations and embrace LLM as the main weapon.

stebalien•34m ago
Copyleft is a mirror of copyright, not a way to fight copyright. It grants rights to the consumer where copyright grants rights to the creator. Importantly, it gives the end-user the right to modify the software running on their devices.

Unfortunately, there are cases where you simply can't just "re-implement" something. E.g., because doing so requires access to restricted tools, keys, or proprietary specifications.

rileymat2•29m ago
> It grants rights to the consumer where copyright grants rights to the creator.

It also grants one major right/feature to the creator, the ability to spread their work while keeping it as open as they intend.

ordu•24m ago
These are words of Stallman:

"So, I looked for a way to stop that from happening. The method I came up with is called “copyleft.” It's called copyleft because it's sort of like taking copyright and flipping it over. [Laughter] Legally, copyleft works based on copyright. We use the existing copyright law, but we use it to achieve a very different goal."

https://writings.hongminhee.org/2026/03/legal-vs-legitimate/

sarchertech•12m ago
That’s not a rebuttal of the OP’s point. None of that says anything about fighting copyright. It literally says he flipped it, which is what the OP said when they said it’s a mirror.
dathinab•9m ago
> flipping it over.

i.e. mirroring it

> use it to achieve a very different goal."

"very different goal" isn't the same as "fundamentally destroying copyright"

the very different goals include protecting public code so it stays public, proper attribution, preventing companies from just "sizing" it, motivating others to make their code public too, etc.

and even if his goals were not like that, it wouldn't make a difference, as this is what many people try to achieve by using such licenses

this kind of AI usage is very much not in line with these goals,

and in general, way cheaper software cloning isn't sufficient to fix many of the issues the FOSS movement tried to fix, especially not when looking at the current ecosystem most people are interacting with (i.e. phones)

---

("sizing"): As in the typical MS embrace, extend, and extinguish strategy: first embracing the code, then giving it proprietary but available extensions/changes/bug fixes/security patches, then making those no longer available if you don't pay them/play by their rules.

---

Though in the end, using AI as a "fancy complicated" photocopier for code removes copyright exactly as much as using a plain photocopier for code would. It doesn't matter if you use the photocopier blindfolded and never look at the thing you copied.

davidw•27m ago
> LLM as the main weapon

LLM's - to date - seem to require massive capital expenditures to have the highest quality ones, which is a monumental shift in power towards mega corporations and away from the world of open source where you could do innovative work on your own computer running Linux or FreeBSD or some other open OS.

I don't think that's an exciting idea for the Free Software Foundation.

Perhaps with time we'll be able to run local ones that are 'good enough', but we're not there yet.

There's also an ethical/moral question that these things have been trained on millions of hours of people's volunteer work and the benefits of that are going to accrue to the mega corporations.

Edit: I guess the conclusion I come to is that LLM's are good for 'getting things done', but the context in which they are operating is one where the balance of power is heavily tilted towards capital, and open source is perhaps less interesting to participate in if the machines are just going to slurp it up and people don't have to respect the license or even acknowledge your work.

ordu•11m ago
> LLM's - to date - seem to require massive capital expenditures to have the highest quality ones, which is a monumental shift in power towards mega corporations and away from the world of open source

Yeah, a bit of a conundrum. But I don't think that fighting for copyright now can bring any benefits for FOSS. GNU should bring Stallman back and see whether he can come up with any new ideas and a new strategy. Alternatively, they could try without Stallman. But the point is: they should stop and think again. Maybe they will find a way forward, maybe they won't, but either way they could continue their fight for freedom meaningfully, or they could just stop fighting and find some other things to do. Both options are better than fighting for copyright.

> There's also an ethical/moral question that these things have been trained on millions of hours of people's volunteer work and the benefits of that are going to accrue to the mega corporations.

I want to clarify this statement a bit. The thing with LLMs relying on the work of others is not against GNU philosophy as I understand it: algorithms have to be free. Nothing wrong with training LLMs on them or on programs implementing them. Nothing wrong with using these LLMs to write new (free) programs. What is wrong is corporations reaping all the benefits now and locking down new algorithms later.

I think this is important, because copyright is deemed an ethical thing by many (I think for most people it is just a deduction: abiding by the law is ethical, therefore copyright is ethical), but not by GNU.

zozbot234•9m ago
> LLM's - to date - seem to require massive capital expenditures to have the highest quality ones

There are near-SOTA LLM's available under permissive licenses. Even running them doesn't require prohibitive expenses unless you insist on realtime use.

jacquesm•5m ago
> There's also an ethical/moral question that these things have been trained on millions of hours of people's volunteer work and the benefits of that are going to accrue to the mega corporations.

This was already the case and it just got worse, not better.

cubefox•26m ago
That's naive. Copyright doesn't just apply to software. There already have been countless lawsuits about copying music long before the term "open source" was invented. No, changing the lyrics a bit doesn't circumvent copyright. Nor does translating a Stephen King novel to German and switching the names of the places and characters.

A court ordered the first Nosferatu movie to be destroyed because it had too many similarities to Dracula. Despite the fact that the movie makes rather large deviations from the original.

If Claude was indeed asked to reimplement the existing codebase, just in Rust and a bit optimized, that could well be a copyright violation. Just like rephrasing A Song of Ice and Fire a bit and switching to a different language doesn't remove its copyright.

webstrand•26m ago
Its purpose is "if you run the software you should be able to inspect and modify that software, and to share those modifications with your peers", not explicitly to resist copyright. Yes, copyright is bad in that it often prevents one from doing that, but the purpose of the GPL is not to dismantle copyright.

Reducing it to "well, you can clone the proprietary software you're forced to use via LLM" really misses the soul of the GPL.

pocksuppet•5m ago
If not for copyright, you could always do that and copyleft wouldn't be needed.
dathinab•19m ago
> we'll see that it was an attempt to fight copyrights with copyrights

it's not that simple

yes, the GPL's origins include the idea that "everyone should be able to use it"

but it is also about attributing the original author

and making sure people can't just de-facto "size public goods"

this kind of AI usage removes attribution and is often sizing public goods in a way far worse than most companies which just ignored the license did

so today there is more need than ever in the last few decades for GPL-like licenses

amiga386•3m ago
You've said "size" twice in comments, did you mean "seize"?
johnofthesea•15m ago
> AI is eroding copyright, so there may no longer be a need for the GPL. GNU should stop and rethink its stance, chuck away the GPL as the main tool to fight evil software corporations and embrace LLM as the main weapon.

Is this LLM thing freely available or is it owned and controlled by these companies? Are we going to rent the tools to fight "evil software corporations"?

cozzyd•8m ago
easy, we ask Claude to write an open-source freely-available version of Claude with equal or better capabilities.
thomastjeffery•11m ago
While I personally agree with you, Richard Stallman (the creator of the GPL) does not. He has always advocated in favor of strong copyright protection, because the foundation of the GPL is the monopoly power granted by copyright. The problem that the GPL is intended to solve is proprietary software.

Generative models (AI) are not really eroding copyright. They are calling its bluff. The very notion of intellectual property depends on a property line: some arbitrary boundary where the property begins and ends. Generative models blur that line, making it impractical to distinguish which property belongs to whom.

Ironically, these models are made by giant monopolistic corporations whose wealth is quite literally a market valuation (stock price) of their copyrights! If generative models ever become good enough to reimplement CUDA, what value will NVIDIA have left?

The reality is that generative models are nowhere near good enough to actually call the bluff. Copyright is still the winning hand, and that is likely to continue, particularly while IP holders are the primary authors of law.

---

This whole situation is missing the forest for the trees. Intellectual Property is bullshit. A system predicated on monopoly power can only result in consolidated wealth driving the consolidation of power; which is precisely what has happened. The words "starving artist" ring every bit as familiar today as any time in history. Copyright has utterly failed the very goals it was explicitly written with.

It isn't the GPL that needs changing. So long as a system of copyright rules the land, copyleft is the best way to participate. What we really need is a cohesive political movement against monopoly power; one that isn't conveniently ignorant of copyright as its most significant source.

re-thc•3m ago
> What AI are eroding is copyright.

At the moment it's people that are eroding copyright. E.g. in this case someone did something.

"AI" didn't grow a brain, wake up, and suddenly decide to do it.

Realistically this has nothing to do with AI. Having a gun doesn't mean you randomly shoot.

sharkjacobs•42m ago
> Blanchard's account is that he never looked at the existing source code directly. He fed only the API and the test suite to Claude and asked it to reimplement the library from scratch

This feels sort of like saying "I just blindly threw paint at that canvas on the wall and it came out in the shape of Mickey Mouse, and so it can't be copyright infringement because it was created without the use of my knowledge of Mickey Mouse"

Blanchard is, of course, familiar with the source code, he's been its maintainer for years. The premise is that he prompted Claude to reimplement it, without using his own knowledge of it to direct or steer.

re-thc•30m ago
> This feels sort of like saying "I just blindly threw paint at that canvas on the wall and

> He fed only the API and the test suite to Claude and asked it

Difference being Claude looked; so not blind. The equivalent is more like: I blindly took a photo of it and then used that to...

Technically, it did look.

amarant•7m ago
The article is poorly written. Blanchard was a chardet maintainer for years. Of course he had looked at its code!

What he claimed, and what was interesting, was that Claude didn't look at the code, only the API and the test suite. The new implementation is all Claude. And the implementation is different enough to be considered original: completely different structure, design, and, hey, a 48x improvement in performance! It's just API-compatible with the original, which per the 2021 Google v. Oracle decision is to be considered fair use.

mrgoldenbrown•5m ago
did he claim that Claude wasn't trained on the original? Or just that he didn't personally provide Claude with a copy?
dathinab•27m ago
> Blanchard is, of course, familiar with the source code, he's been its maintainer for years.

I would argue it's irrelevant whether they looked or didn't look at the code, as well as whether he was or wasn't familiar with it.

What matters is that they fed the original code into a tool which they set up to make a copy of it. How that tool works doesn't really matter. Neither does it make a difference if you obfuscate that it's a copy.

If I blindfold myself when making copies of books with a book scanner + printer I'm still engaging in copyright infringement.

If AI is a tool, that should hold.

If it isn't "just" a tool, then it did engage in copyright infringement (as it created the new output side by side with the original) in the same way an employee might do so on command of their boss. Which still makes the boss/company liable for copyright infringement and in general just because you weren't the one who created an infringing product doesn't mean you aren't more or less as liable of distributing it, as if you had done so.

spullara•17m ago
if the actual text of the code isn't the same or obviously derivative, copyright doesn't apply at all.
sigseg1v•8m ago
What does derivative mean here? Because IMO it means that the existing work was used as input. So if you used a LLM and it was trained on the existing work, that's a derivative work. If you rot13 encode something as input, so you can't personally read it, and then a device decides to rot13 on it again and output it, that's a derivative work.
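The rot13 round trip is easy to make concrete; a minimal Python sketch (the sample string is a hypothetical stand-in for a protected work):

```python
import codecs

# Hypothetical stand-in for a copyrighted text.
original = "def detect(data): ..."

# rot13 is its own inverse: applying it twice restores the input exactly.
encoded = codecs.encode(original, "rot13")
assert encoded != original                          # intermediate form looks different
assert codecs.encode(encoded, "rot13") == original  # original fully recoverable
```

The intermediate string is unreadable, yet the original is mechanically recoverable, which is the sense in which the output remains derived from the input no matter what the device "saw".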
nicole_express•5m ago
Of course, the problem with this interpretation is that all modern LLMs are derivatives from huge amounts of text under completely different licenses, including "All rights reserved", and therefore can not be used for any purpose.

I'm not sure how you square the circle of "it's alright to use the LLM to write code, unless the code is a rewrite of an open source project to change its license".

wizzwizz4•4m ago
See also: https://monolith.sourceforge.net/
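Monolith's trick, as I understand it, is XOR-splitting: combine a copyrighted file with random data so that neither resulting file alone resembles the work, yet XOR-ing them back together restores it byte for byte. A rough Python sketch (the contents are hypothetical):

```python
import os

work = b"bytes of a copyrighted file"  # hypothetical protected content

# "Munge" the work against a random basis file of equal length.
basis = os.urandom(len(work))
mono = bytes(w ^ b for w, b in zip(work, basis))

# Neither `basis` nor `mono` alone contains the work, but together they do:
restored = bytes(m ^ b for m, b in zip(mono, basis))
assert restored == work
```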
logicprog•27m ago
I just don't see how it's relevant whether he looked or didn't. In my opinion, it's legally valid to make a re-implementation of something even if you've seen the code, as long as it doesn't copy expressive elements. I think it's also ethically fine to use source code as a reference for re-implementing something, as long as it doesn't turn into an exact translation.
sarchertech•15m ago
Ignoring the legal or ethical concerns. Let’s say we live in a world where the cost of copying code is so close to zero that it’s indistinguishable from a world without copyright.

Anything you put out can and will be used by whatever giant company wants to use it with no attribution whatsoever.

Doesn’t that massively reduce the incentive to release the source of anything ever?

pocksuppet•6m ago
Yes, and it reduces the incentives to release binaries too. Such a world will be populated by almost entirely SaaS, which can still compete on freedom.
atomicnumber3•14m ago
It's actually not legally fine, or at least it's extremely dangerous. Projects that re-implement APIs presented by extremely litigious companies specifically do not allow people who, for instance, have seen the proprietary source code to then work on the project.
jpc0•8m ago
I don't think fear of legal action makes it illegal.

Suppose I know it is legal to make a turn at a red light, and I know a court will uphold that I was in the right, but a police officer will fine me regardless and I would need to actually pursue some legal remedy. I'm unlikely to make the turn, regardless of whether it is legal, because pursuing that remedy is expensive, if not in money then in time.

Copyright lawsuits are notoriously expensive and long, so even if a court would eventually deem it fine, why take the chance?

esafak•19m ago
If you only stick to the API and ignore the implementation, it is not Mickey Mouse any more but a rodent. If it was just a clone it wouldn't be 50x as fast. Nevertheless, APIs apparently can be copyrightable.
amarant•4m ago
Wait what, didn't oracle lose the case against Google? Have I been living in an alternate reality where API compatibility is fair use?
Aurornis•19m ago
Can anyone find the actual quote where Blanchard said this?

My understanding was that his claim was that Claude was not looking at the existing source code while writing it.

pklausler•10m ago
Conveniently ignoring the likelihood that Claude had been trained on the freely accessible source code.
mrgoldenbrown•6m ago
Does he have access to Claude's training data? How can he claim Claude wasn't trained on the original code?
SpicyLemonZest•11m ago
Isn't this a red herring? An API definition is fair use under Google v. Oracle, but the test suite is definitely copyrightable code!
throwaway2027•38m ago
I think we're going one step too far even. AI itself is a gray area: how can they guarantee it was trained legally, or that what they're doing is even legal? And how can they assert that the input training data didn't contain any copyrighted data?
observationist•28m ago
Google already spent billions of dollars and decades of lawyer hours proving it out as fair use. The legal challenges we see now are the dying convulsions of an already broken system of publishers and IP hoarders using every resource at their disposal to manipulate authors and creators and the public into thinking that there's any legitimacy or value underlying modern copyright law.

AI will destroy the current paradigm, completely and utterly, and there's nothing they can do to stop it. It's unclear if they can even slow it, and that's a good thing.

We will be forced to legislate a modern, digital oriented copyright system that's fair and compatible with AI. If producing any software becomes a matter of asking a machine to produce it - if things like AI native operating systems come about, where apps and media are generated on demand, with protocols as backbone, and each device is just generating its own scaffolding around the protocols - then nearly none of modern licensing, copyright, software patents, or IP conventions make any sense whatsoever.

You can't have horse-and-buggy traffic conventions for airplanes. We're moving into a whole new paradigm, and maybe we can get legislation that actually benefits society and individuals, instead of propping up massive corporations and making lawyers rich.

throawayonthe•36m ago
shall we now have to think about the tradeoffs in adopting

- proprietary

- free

- slop-licensed

software?

mfabbri77•36m ago
What if someone doesn't declare that it has been reimplemented using an LLM? Isn't it enough to simply declare that you have reimplemented the software without using an LLM? Good luck proving that in court...

One thing is certain, however: copyleft licenses will disappear. If I can't control the redistribution of my code (through a GPL or similar license), I will choose to develop it in closed source.

bigyabai•23m ago
Arguably, the GPL has always been the wrong choice if you want to authoritatively control redistribution.
dwroberts•35m ago
One of the things that irks me about this whole thing is, if it’s so clean room and distinct, why make the changes to the existing project? Why not make an entirely new library?

The answer to that, I think, is that the authors wanted to squat an existing successful project and gain a platform from it. Hence we have news cycle discussing it.

Nobody cares about a new library using AI, but squat an existing one with this stuff and you get attention. It's the reputation, the GitHub stars, whatever.

nicole_express•18m ago
I mean, Blanchard was the longtime maintainer of chardet already, and had wanted to relicense it for years. So I think that complicates your picture of "squatting an existing successful project".

Honestly it's a weird test case for this sort of thing. I don't think you'd see an equivalent in most open source projects.

intrasight•12m ago
I agree. But you can't copyright goodwill and reputation. Trademark does provide some protection there, right?
delichon•33m ago
Imagine if the author has his way, and when we have AI write software, it becomes legally under the license of some other sufficiently similar piece of software. Which may or may not be proprietary. "I see you have generated a todo app very similar to Todoist. So they now own it." That does not seem like a good path either for open source software or for opening up the benefits of AI generated software.
moi2388•31m ago
Perhaps we should finally admit that copyright has always been nonsense, and abolish this ridiculous measure once and for all
vladms•26m ago
Probably a wiser approach is to consider different times require different measures (in general!).

I did not study in detail whether copyright "has always been nonsense", but I do agree that nowadays some of the copyright regulations are nonsense (for example, the very long duration of life + 70 years).

intrasight•6m ago
I think AI is very much eroding the legitimacy of copyright, at least as applied to software, which has long been questioned since software is more like math than creative expression.

I think the industry will realize that it made a huge mistake by leaning on copyright for protection rather than on patents.

logicprog•29m ago
> Ronacher notes this as an irony and moves on. But the irony cuts deeper than he lets on. Next.js is MIT licensed. Cloudflare's vinext did not violate any license—it did exactly what Ronacher calls a contribution to the culture of openness, applied to a permissively licensed codebase. Vercel's reaction had nothing to do with license infringement; it was purely competitive and territorial. The implicit position is: reimplementing GPL software as MIT is a victory for sharing, but having our own MIT software reimplemented by a competitor is cause for outrage. This is what the claim that permissive licensing is “more share-friendly” than copyleft looks like in practice. The spirit of sharing, it turns out, runs in one direction only: outward from oneself.

This argument makes no sense. Are they arguing that because Vercel, specifically, had this attitude, this is the attitude that AI, reimplementation, and those in favor of them necessarily take toward more permissive licenses? That certainly doesn't seem to be an accurate way to summarize what antirez or Ronacher believe. In fact, under the legal and ethical frameworks (respectively) that those two put forward, Vercel has no right to claim that position and no way to enforce it, so it seems very strange to me to even assert that this sort of thing would be the practical result of AI reimplementations. This seems to just be pointing to the hypocrisy of one particular company, and assuming that this would be the inevitable, universal attitude and result when there's no evidence to think so.

It's ironic, because antirez actually literally addresses this specific argument. They completely miss the fact that a lot of his blog post is not just about legal but also about ethical matters. Specifically, the idea he puts forward is that yes, corporations can do these kinds of rewrites now, but they always had the resources and manpower to do so anyway. What's different now is that individuals can do this kind of rewrite when they never had the ability to do so before, and the vector of such a rewrite can be from permissive to copyleft, or even from decompiled proprietary code to permissive or copyleft. The fact that it hasn't gone that way so far is more a factor of most people really hating copyleft and finding it annoying (it's been losing traction and developer mind share for decades), not of this tactic being unusable in that direction. I think that's actually one of the big points he's trying to make with his GNU comparison. It's not just that if it was legal for GNU to do it, then it's legal for you to do it with AI. Nor is it only the fundamental libertarian ethical axiom (which I agree with for the most part) that such a rewrite should remain legal in either direction, because in terms of the fundamental axioms we enforce with violence in our society, there should be a level playing field where we look at the action itself and not just whether we like or dislike the consequences. It's specifically that if GNU did it once with the ability to rewrite things, it can be done again, even in the same direction, and now even more easily using AI.

antirez•17m ago
> They completely miss the fact that a lot of his blog post is not actually just about legal but also about ethical matters.

Honestly, I was confused about the summarization of my blog post into just a legal matter as well. I hope my blog post will be able to stay at least a short time on the HN front page so that the actual arguments it contains will get a bit more exposure.

throwaway2027•28m ago
Perhaps software patents may play an even bigger role in the future.
intrasight•10m ago
Or, hopefully, even less of a role.
drnick1•25m ago
It should be noted that the Rust community is also guilty of something similar. That is, porting old GPL programs, typically written in C, to Rust and relicensing them as MIT.
nicole_express•24m ago
Not a lawyer, but my understanding is: In theory, copyright only protects the creative expression of source code; this is the point of the "clean room" dance, that you're keeping only the functional behavior (not protected by copyright). Patents are, of course, an entirely different can of worms. So using an LLM to strip all of the "creative expression" out of source code but create the same functionality feels like it could be equivalent enough.

I like the article's point of legal vs. legitimate here, though; copyright is actually something of a strange animal to use to protect source code, it was just the most convenient pre-existing framework to shove it in.

grahamlee•16m ago
It's clear that we're entering a new era of copyright _expectations_ (whether we get new _legislation_ is different), but for now realise this: the people like me who like copyleft can do this too. We can take software we like, point an agent at it, and tell it to make a new version with the AGPL3.0-or-later badge on the front.
anonymous_sorry•8m ago
But the LLM contributions would likely be ruled public domain, so AGPL may not be enforceable on these.
largbae•15m ago
This is only worth arguing about because software has value. Putting this in context of a world where the cost of writing code is trending to 0, there are two obvious futures:

1. The cost continues to trend to 0, and _all_ software loses value and becomes immediately replaceable. In this world, proprietary, copyleft and permissive licenses do not matter, as I can simply have my AI reimplement whatever I want and not distribute it at all.

2. The coding cost reduction is all some temporary mirage, to be ended soon by drying VC money/rising inference costs, regulatory barriers, etc. In that world we should be reimplementing everything we can as copyleft while the inferencing is good.

anonymous_sorry•12m ago
There was a recent ruling that LLM output is inherently public domain (presumably unless it infringes some existing copyright). In which case it's not possible to use them to "reimplement everything we can as copyleft".
sarchertech•9m ago
There’s an other option. The cost of copying existing software trends to 0, but the cost of writing new software stays far enough above 0 that it is still relatively expensive.
casey2•3m ago
The value of software has never been tied to the cost of writing it; even if you don't distribute it, you're still breaking the law.
t43562•15m ago
Why does anyone need his new library? They can do what he did and make their own.

I'm glad we can fork things at a point and thumb our noses at those who wish to cash in on others' work.

righthand•13m ago
I think what is happening is the collapse of the "greater good". Open source is dependent upon providing information for the greater good and general benefit of its readers. However, now that no one is reading anything, its purpose serves the greater good of the most clever or most convincing or richest harvester.
sayrer•10m ago
I don't think this part is correct: "If you distribute modified code, or offer it as a networked service, you must make the source available under the same terms."

That's what something like AGPL does.

skybrian•9m ago
Broadly speaking, the “freedom of users” is often protected by competition from competing alternatives. The GNU command line tools were replacements for system utilities. Linux was a replacement for other Unix kernels. People chose to install them instead of proprietary alternatives. Was it due to ideology or lower cost or more features? All of the above. Different users have different motivations.

Copyleft could be seen as an attempt to give Free Software an edge in this competition for users, to counter the increased resources that proprietary systems can often draw on. I think success has been mixed. Sure, Linux won on the server. Open source won for libraries downloaded by language-specific package managers. But there’s a long tail of GPL apps that are not really all that appealing, compared to all the proprietary apps available from app stores.

But if reimplementing software is easy, there’s just going to be a lot more competition from both proprietary and open source software. Software that you can download for free that has better features and is more user-friendly is going to have an advantage.

With coding agents, it’s likely that you’ll be able to modify apps to your own needs more easily, too. Perhaps plugin systems and an AI that can write plugins for you will become the norm?

jacquesm•7m ago
> Was it due to ideology or lower cost or more features?

It was due to access.

casey2•6m ago
If the model hadn't been trained on copyleft code, if he hadn't used a copyleft test suite, and if he hadn't been the maintainer for years, that would be one thing. As it stands, the intent here is clearly copyright infringement.

If you have software, your test suite is part of that software: doing development against a copyleft test suite and then releasing under MIT without it is suspect. Depending on the test suite, it may break clean-room rules, especially for TDD codebases.
