
MIT Non-AI License

17•dumindunuwan•4w ago
Don't we have to ask for permission before feeding someone's years of work into an AI?

MIT NON-AI License

Copyright (c) 2025-2026 NAME

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

1. The Software and any derivative works may not be used for the purposes of training, fine-tuning, or validating artificial intelligence models or machine learning algorithms without prior written permission from the copyright holders.

2. The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

Comments

toomuchtodo•4w ago
What if instead of a license most won’t respect, you include a poison pill in the repo or other code storage to poison the model?

https://news.ycombinator.com/item?id=45529587

https://news.ycombinator.com/item?id=45533842

dumindunuwan•4w ago
This should be about getting permission from open-source developers before feeding their years of work into AI. I don't think we should believe what Anthropic, OpenAI, Meta, or Google tell us.

We should move to local LLMs.

HumanOstrich•4w ago
How do local LLMs help?
01092026•4w ago
Interesting perspective. I think you have a right; what's yours is yours to do with as you please - you know?

But can I give you another "viewpoint"? I guess it's like, "Wow, my code, my work, what came from my brain, my fingers" - it essentially lives forever, if you think about it - it becomes embedded and compressed inside weights/tokens. Like - part of you is there.

I guess it's cool. For me it's just knowing that these superintelligent things deep down actually know who I am - my code is in their architecture - and it gives me a feeling of honor in some way. You know?

Just my take.

tylerchilds•4w ago
I don’t always say AI pilled but

You deserve to be recognized beyond the false religion of the singularity.

tylerchilds•3w ago
Hilarious to get downvoted for defending the concept of

Literature

Without references and citations... never mind.

Go read Animal Farm

nomel•3w ago
Your comment did no such thing, in a way that would be recognizable by anyone else. It probably would have been a good comment if you had expanded on that, though, in some meaningful way.
tylerchilds•3w ago
Valid point. I shouldn’t have met condescension with condescension.

I don’t think I would have said anything if the opening statement wasn’t a “you do you, but”

Thanks for taking the time to call me out politely.

To be objective, we can ask the creator of tailwind how much honor he feels with his business just being weights in a model in someone else’s business

https://github.com/tailwindlabs/tailwindcss.com/pull/2388#is...

https://www.businessinsider.com/tailwind-engineer-layoffs-ai...

https://dev.to/kniraj/tailwind-css-lays-off-75-of-engineerin...

https://news.ycombinator.com/item?id=46527950

Going full hypertext computer history, this is why Ted Nelson believed in microtransactions on transclusions.

In modern terms: when AI bills by the token, those tokens should also get paid out to the source materials.

The business model is fundamentally broken, which is why that's not happening, and why the main business use case is its militarization.

tylerchilds•3w ago
I use ai daily but reciting the talking points that killed what open source used to mean and using them to further separate original authors from the impact of their work

That’s ai pilled

I write code under the mit license

I know the risk

Helping humans still makes it worth it

And technically these AI companies should have a /licenses route that lists every MIT piece of code their model was trained on.

That's literally the only expectation I have from anyone as an active author using the MIT license: getting cited.

I think the legal AI defense is that the models themselves are a bastardized form of dynamic linking. I say the models are statically linked though, so they need to spill their sources.
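
The "/licenses route" idea above is easy to picture. As a minimal sketch (all names hypothetical; no real provider exposes such an endpoint), the manifest it could serve might look like:

```python
import json

# Hypothetical sketch of a "/licenses" manifest a model provider could serve:
# one entry per MIT-licensed project in the training set, carrying the
# copyright/permission notice the MIT license requires to accompany copies.
def build_licenses_manifest(projects):
    """projects: iterable of (name, notice) tuples."""
    return json.dumps(
        [
            {"project": name, "license": "MIT", "notice": notice}
            for name, notice in sorted(projects)
        ],
        indent=2,
    )

manifest = build_licenses_manifest(
    [
        ("plan98", "Copyright (c) Tyler Childs"),
        ("example-lib", "Copyright (c) Example Author"),
    ]
)
print(manifest)
```

Whether the MIT notice requirement actually reaches model weights is exactly the static-vs-dynamic-linking question this comment raises; the sketch only shows what disclosure could look like, not that it is legally required.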

tylerchilds•3w ago
That’d be my question to the person I disrespected:

Why should I, as someone who is (not hypothetically) giving back in code, continue to do so when the social contract has been broken, where the minimal expectation has always been: say my name?

nomel•3w ago
> minimal expectation has been: Say my name?

MIT license specifically does not require public attribution for derivative works. You should be using a different license if that is your goal.

tylerchilds•3w ago
I don’t want to insult you by pasting the full text here but this is the required bit

“The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.”

Which is here in my operating system

https://github.com/tylerchilds/plan98/blob/d27d975bf931a7d80...

I literally came up with a unique sdk for all “my elves” such that I can in fact see people in court for mishandling the software supply chain.

There’s a lot of software licensing misinformation out there and including my name and email with the rest of the license text is such a simple thing to misunderstand.

I’m sorry for any and all infractions you’ve committed across all MIT authors to date.

I’m not really planning to take anyone to court, but if you really believe what you’ve said to me here and you’ve been writing code that follows those beliefs

I’m not a lawyer, but you should probably consult a lawyer.

tylerchilds•3w ago
I literally rewrote how I write software post-AI to AI-pill the AI, such that, when these models produce ASTs that match my signature, I do have a legal defense.
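
As a sketch of what "ASTs that match my signature" could mean in practice, here is a hypothetical structural fingerprint (not the commenter's actual tooling, just one way to make the idea concrete): reduce code to its node-type shape, ignoring names and literals, and hash it.

```python
import ast
import hashlib

# Hypothetical "AST signature": parse Python source, keep only the sequence
# of node types (identifiers and literal values discarded), and hash that
# shape. Two snippets with identical structure then share a fingerprint
# even after renaming, which is the kind of match being alluded to.
def ast_fingerprint(source: str) -> str:
    tree = ast.parse(source)
    shape = " ".join(type(node).__name__ for node in ast.walk(tree))
    return hashlib.sha256(shape.encode()).hexdigest()

a = ast_fingerprint("def f(x):\n    return x + 1")
b = ast_fingerprint("def g(y):\n    return y + 2")  # renamed, same shape
print(a == b)
```

Whether such a fingerprint would hold up as evidence is, of course, the open legal question.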
tylerchilds•3w ago
Full circle:

We all deserve to be recognized beyond the false religion of the singularity.

In practice, that’s a bibliography. That’s what the MIT license guarantees— sources: cited, aka, literature in action.

tylerchilds•3w ago
Plagiarism is the crux of what MIT protects

I do legally waive my right, by using it, to know every instance of it in the wild.

But even in those private instances, legally, it cannot be presented to a superior without the copyright notice and the license text.

Just because the product is private doesn’t erase the name of the author.

There’s a saying “can you separate the art from the artist?”

And the punchline to sillyz.computer is: that’s the only thing banned.

nomel•3w ago
> I’m sorry for any and all infractions you’ve committed across all MIT authors to date.

Oops.

I don't think this applies to AI though [1].

IANAL, I am also not smart.

[1] https://www.nbcnews.com/tech/tech-news/federal-judge-rules-c...

tylerchilds•3w ago
That’s about the inputs going in and even in the first paragraph disclaims other courts may side differently

We’re still not in the legal territory of the outputs on the other side, which is what’s actually more interesting to me.

The technical political maneuvering of the two sides of the accelerationist movement is the fair-use bit, which is not even the point I'm debating.

Even there, they are discussing complete bodies of work as source material, which is exactly my point. My stuff can go in, but they still have to cite that I’m in there, which does matter on the other side.

I don't think anyone deserves to be just weights and algorithms in a dark and shuttered library.

They want this to be a legal laundering device and that’s the bit I’m hung up on.

altairprime•4w ago
Licenses have no bearing on fair use, or anywhere else the law permits them to be ignored.
nomel•4w ago
Reference? A lot has changed within the last couple years.
altairprime•4w ago
From a U.S. standpoint: Licensing is a function of copyright. A work not subject to copyright cannot be licensed productively as-is, as the public domain quality of the work is a trivial and conclusive defense against a licensor’s claims of copyright violation. Fair use is not subject to copyright. Since licensing enforcement is only possible with an applicable copyright, enforcement cannot be completed against fair uses, as copyright law is not applicable to fair uses and therefore licensing enforcement has no legal basis. However, a judgment may overturn a defense of fair use brought against a defendant in a licensing enforcement claim, which would then subject the defendant’s use to copyright law and thus to license enforcement.

It’s midnight now, so you’re on your own to dig up and review specific instances of relevant case law, or to contrast with non-U.S. laws. Licensing above refers to i.e. LICENSE files of the specific sort that this post is about ("MIT Non-AI License"); other definitions of licensing, as well as e.g. DMCA exceptions, exist that might be of interest for you to explore further. I believe there’s been a handful of cases related to AI and fair use this past year, but as with all such defenses, unique circumstances are common enough that I hesitate to suggest any future outcome as 100% certain without much more case law than AI has today. (I am not your lawyer, this is not legal advice.)

dumindunuwan•4w ago
The AI Act will be fully applicable from 2 August 2026.

Providers of GPAI models must respect Text and Data Mining (TDM) opt-outs.

2.1 Legal Basis: Article 53(1)(c) AI Act and Directive (EU) 2019/790

The Copyright Chapter of the Code directly addresses one of the most contentious legal questions in AI governance: the use of copyrighted material in training GPAI models and the risk of infringing outputs. Article 53(1)(c) AI Act requires GPAI providers to “identify and respect copyright protection and rights reservations” within their datasets. This obligation complements the framework of Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market (DSM Directive). Notably, Article 4(3) DSM Directive allows rightsholders to exclude their works from text and data mining (TDM) operations via machine-readable opt-outs.

https://www.ddg.fr/actualite/the-european-unions-code-of-pra...
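
For context, the machine-readable TDM opt-outs Article 4(3) refers to are most often expressed at the crawl layer today. One common, if still informal, convention is a robots.txt rule (the crawler names below are real; whether this satisfies the Directive's "machine-readable" bar is still being worked out):

```text
# robots.txt: reserve this site's content against AI training crawls
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

Other proposed channels include the TDMRep W3C community specification and HTTP headers; none of them is yet a settled legal standard.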

altairprime•4w ago
Well, in the eyes of the EU, the entire 'fair use' thing doesn't exist there at all (per the EU's own JURI, which I think is roughly the equivalent of the U.S.'s Office of the Attorney General, with similar duties around defining the canonical interpretations of the laws).

https://www.europarl.europa.eu/RegData/etudes/STUD/2025/7740...

Two relevant bits, dug out from the 175-page whole:

> Although the Act tries to address this by extending obligations to any provider placing a GPAI model on the EU market, the extraterritorial enforcement of these obligations remains highly uncertain due to the territorial nature of copyright law and the practical difficulty of pursuing infringement claims when training occurs under foreign legal standards, such as U.S. fair use.

and:

> Finally, it is important to clarify that the current EU framework provides a closed list of exceptions and does not recognise a general fair use defence. As a result, AI-generated outputs that include protected expression without a valid exception remain unlawful.

It seems to be dated the same month as DDG's analysis, July 2025, so I would expect the MIT Non-AI License that we're discussing here to be much more defensible in the EU than in the U.S. — as long as one studies that full 175-page "Generative AI and Copyright" analysis and ensures that it addresses the salient points necessary to apply and enforce in EU copyright terms. (Queued for my someday-future :)

nomel•4w ago
For "AI", does that include some of the more advanced search indexing and auto complete tools?

Related, would you be in violation if you hosted this in a public GitHub repo, since it's in the TOS that they use source for training AI?

utopiah•4w ago
I applaud the effort, but honestly I see two cases:

- VC-funded SF startups with an ethos of "do whatever is necessary at first, then scale, then lawyer up once you're a unicorn and shit starts to hit the fan" - the actors this might try to prevent... and who will blissfully ignore it

- actually mindful actors, e.g. public researchers, non-profits, etc., who genuinely understand the point and will respect it, but whom, ironically enough, you might want to support by providing them with more resources, including your code

So... yes, the intent is good, but honestly I wouldn't trust the first category of actors to respect it. Technically, what they "should" do, because it's safe and efficient, is crawl only the most popular compatible licenses, e.g. MIT, and ignore the rest. That's safe and pragmatic. Again, what I expect them to do (just my hunch) is take EVERYTHING, get called out, apologize, change nothing, get brought to court, apologize again, and only do something if the financial repercussions are bigger than the expected alternative. So probably still do nothing.

HumanOstrich•4w ago
If you want to add restrictions to the MIT license, just don't use the MIT license or name it that way. Your change makes it no longer an open or permissive license so it's bizarre to keep the name. It would be like creating the MIT But-You-Have-To-Pay License.
dumindunuwan•4w ago
This is what any AI agent says
HumanOstrich•4w ago
Yes, I'm a secret AI agent here to try and stop your powerful new licensing idea. /s
dumindunuwan•4w ago
I meant that I asked Gemini before and it said the same thing
throwawayqqq11•4w ago
Then let's call this special license something like:

No-you-still-dont-have-to-pay-but-any-AI-use-is-restricted License. This way, everyone knows it's not as free as the MIT license and has absolutely no relation to it. /s

Of course you can specialize existing licenses with limited paragraphs and reflect that in their names...

HumanOstrich•4w ago
I think you're missing the point of the MIT License and its history. If you want a proprietary license, "specializing" the MIT License for that is silly. My grandmother could specialize as a bicycle if I added wheels to her.
throwawayqqq11•4w ago
Yes, the MIT license ought to be as broad, free, and compatible as possible, and is IMO targeted to counter copyleft licenses, but this permissiveness can also be a problem, so specializing it into something like a tagged Creative Commons license can be a reasonable effort. It depends on the problem/topic of restriction and how narrow or well defined it is.
bitwize•4w ago
Doesn't actually count as open source. Per the OSD, you cannot restrict the purpose for which people use the software and still be open source.

If you want to release your code as actual open source but legally restrict what AI companies do with it, use a copyleft license like the GPL. They could still use it for training, but the product of such training may itself fall under the GPL being a derivative work, and the AI corps don't want to touch that legal quagmire. (The GPL continues to be Stallman's brilliant legal aikido at work. Stallman is like that one guy from Final Fantasy Tactics Advance who gives you the power to manipulate the laws to work to your advantage.)

Honestly, it may be time to abandon open source as a concept. There are source-available strategies that cause less social harm than open source, including Coraline Ada Ehmke's "ethical source" and the Sentry project's "fair source" (https://fair.io) models that we can draw inspiration from.

Mic92•4w ago
Well, the thing is, you have a copyright that you can license. However, from how it currently looks, fine-tuning/training is not copying.
Gathering6678•4w ago
How do you propose that someone who uses this license enforce these clauses?
dumindunuwan•4w ago
Article 4(3) of the Digital Single Market (DSM) Directive establishes a legal bridge between AI governance and copyright law.

https://www.ddg.fr/actualite/the-european-unions-code-of-pra...

Gathering6678•3w ago
I meant the enforceability of such a clause: to the extent of my limited understanding of the law, you would need to at least appear to prove that someone has breached the agreement by, for example, using your code to train AI. I am not sure how that is possible.
thatthatis•3w ago
Very ironically, this violates the MIT trademark in a quixotic pursuit of intellectual-property Luddism.
linkdd•3w ago
> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software

Key words are:

  - permission is [...] granted
  - free of charge
  - without restriction
  - use, copy, …
Then:

> may not be used for the purposes of […]

The license contradicts itself.

> Don't we have to ask for permission before feeding someone's years of work into an AI?

That's the point of an OpenSource license, to give permission.

This kind of stuff makes me think very few people really understand what OpenSource is about. The very same people will fall back to licenses such as the BSL as soon as people/companies use the permissions they were given, and then complain that "no one wants to pay for the thing I did for free that nobody asked for".

dumindunuwan•3w ago
I understand these points. As someone who truly loves open source, I can see open-source projects becoming just free training material for AI. After training LLMs on open-source projects, AI may one day build far superior software, and that software may not be free and may not be replaceable by any non-AI software project. We all know that day is not far off, and by that time all open-source software might be considered legacy, as no individual contributor will be able to implement things at the speed of AI. What you are protecting is not only a legacy system built on decade-old requirements, but also the death of the purpose for which people build free software.

What we have to focus on is why we created free software, not word-by-word terms that don't fulfill the requirements of this and future time periods.

linkdd•3w ago
You can't say you love opensource and be mad that users are using the freedom you granted.

OpenSource projects are not becoming free training material for AI, AI companies are using a freedom OpenSource projects granted.

The claim that AI can build far superior software is dubious and I don't believe it one second. And even if it were true, that does not change anything.

With or without AI, permissive licenses (MIT, BSD, ISC, ...) have always allowed the code to be used and redistributed in non-opensource software. If you don't want that, use the GPL or a derivative. If you don't believe the GPL would be enforceable on derivative works produced by AI, don't release your code as opensource.

OpenSource is essentially an ideology, that software should be free of use, and transparent, and freely shareable, without restriction. If you don't buy into that ideology, it's fine, but don't claim to love OpenSource when you don't. Just like a person who eats fish should not claim to be vegan.

AI will not be the end of OpenSource, firstly because it's a dead-end technology, it has already peaked years ago and is becoming worse with each new model. It does not have the ability to build complex software beyond a CRUD app (would you use a kernel that was entirely vibecoded? would you trust it the way you trust the Linux kernel?). Secondly, because OpenSource does not discriminate who gets to enjoy the freedom you granted.

You decided to "work for free" when you decided to distribute as OpenSource. If you don't want to work for free, maybe OpenSource is not for you.

xign•3w ago
The whole point of an open-source license is that it's a legal document that can be enforced and has legal meaning. It's not just a feel-good article. Your argument is like telling a client you're drafting a contract for, "oh yeah, don't worry about the word-by-word terms in the contract, wink".

Also, this "non-AI" license is plainly not open source, nor is it permissive. You can't really say you are a fan of open source when you use a license like this. The whole point of the MIT license is that you just take it, no strings attached. You can use the software for good or for evil. It's not the license's job to decide.

There is nothing wrong with not liking open source, btw. The largest tech companies in the world all keep their most critical software behind closed doors. I just really dislike it when people engage in double-speak and go in for this open-source clout chasing. This is also why all these hipster startups (MongoDB, Redis, etc.) ended up enshittifying their open-source products, IMO: culturally we are all chasing this "we ♥ open source" meme without thinking about whether it makes sense.

If people say they "truly love open source", they should mean it.

insane_dreamer•3w ago
this shouldn't have been flagged (regardless of whether you agree with the argument made by OP or not)