X said it would give $1M to a user who had previously shared racist posts

https://www.nbcnews.com/tech/internet/x-pays-1-million-prize-creator-history-racist-posts-rcna257768
1•doener•1m ago•0 comments

155M US land parcel boundaries

https://www.kaggle.com/datasets/landrecordsus/us-parcel-layer
2•tjwebbnorfolk•5m ago•0 comments

Private Inference

https://confer.to/blog/2026/01/private-inference/
1•jbegley•9m ago•0 comments

Font Rendering from First Principles

https://mccloskeybr.com/articles/font_rendering.html
1•krapp•12m ago•0 comments

Show HN: Seedance 2.0 AI video generator for creators and ecommerce

https://seedance-2.net
1•dallen97•16m ago•0 comments

Wally: A fun, reliable voice assistant in the shape of a penguin

https://github.com/JLW-7/Wally
1•PaulHoule•17m ago•0 comments

Rewriting Pycparser with the Help of an LLM

https://eli.thegreenplace.net/2026/rewriting-pycparser-with-the-help-of-an-llm/
1•y1n0•19m ago•0 comments

Lobsters Vibecoding Challenge

https://gist.github.com/MostAwesomeDude/bb8cbfd005a33f5dd262d1f20a63a693
1•tolerance•19m ago•0 comments

E-Commerce vs. Social Commerce

https://moondala.one/
1•HamoodBahzar•20m ago•1 comments

Avoiding Modern C++ – Anton Mikhailov [video]

https://www.youtube.com/watch?v=ShSGHb65f3M
2•linkdd•21m ago•0 comments

Show HN: AegisMind–AI system with 12 brain regions modeled on human neuroscience

https://www.aegismind.app
2•aegismind_app•25m ago•1 comments

Zig – Package Management Workflow Enhancements

https://ziglang.org/devlog/2026/#2026-02-06
1•Retro_Dev•27m ago•0 comments

AI-powered text correction for macOS

https://taipo.app/
1•neuling•30m ago•1 comments

AppSecMaster – Learn Application Security with hands on challenges

https://www.appsecmaster.net/en
1•aqeisi•31m ago•1 comments

Fibonacci Number Certificates

https://www.johndcook.com/blog/2026/02/05/fibonacci-certificate/
1•y1n0•33m ago•0 comments

AI Overviews are killing the web search, and there's nothing we can do about it

https://www.neowin.net/editorials/ai-overviews-are-killing-the-web-search-and-theres-nothing-we-c...
3•bundie•38m ago•1 comments

City skylines need an upgrade in the face of climate stress

https://theconversation.com/city-skylines-need-an-upgrade-in-the-face-of-climate-stress-267763
3•gnabgib•39m ago•0 comments

1979: The Model World of Robert Symes [video]

https://www.youtube.com/watch?v=HmDxmxhrGDc
1•xqcgrek2•43m ago•0 comments

Satellites Have a Lot of Room

https://www.johndcook.com/blog/2026/02/02/satellites-have-a-lot-of-room/
2•y1n0•44m ago•0 comments

1980s Farm Crisis

https://en.wikipedia.org/wiki/1980s_farm_crisis
4•calebhwin•44m ago•1 comments

Show HN: FSID - Identifier for files and directories (like ISBN for Books)

https://github.com/skorotkiewicz/fsid
1•modinfo•49m ago•0 comments

Show HN: Holy Grail: Open-Source Autonomous Development Agent

https://github.com/dakotalock/holygrailopensource
1•Moriarty2026•56m ago•1 comments

Show HN: Minecraft Creeper meets 90s Tamagotchi

https://github.com/danielbrendel/krepagotchi-game
1•foxiel•1h ago•1 comments

Show HN: Termiteam – Control center for multiple AI agent terminals

https://github.com/NetanelBaruch/termiteam
1•Netanelbaruch•1h ago•0 comments

The only U.S. particle collider shuts down

https://www.sciencenews.org/article/particle-collider-shuts-down-brookhaven
2•rolph•1h ago•1 comments

Ask HN: Why do purchased B2B email lists still have such poor deliverability?

1•solarisos•1h ago•3 comments

Show HN: Remotion directory (videos and prompts)

https://www.remotion.directory/
1•rokbenko•1h ago•0 comments

Portable C Compiler

https://en.wikipedia.org/wiki/Portable_C_Compiler
2•guerrilla•1h ago•0 comments

Show HN: Kokki – A "Dual-Core" System Prompt to Reduce LLM Hallucinations

1•Ginsabo•1h ago•0 comments

Software Engineering Transformation 2026

https://mfranc.com/blog/ai-2026/
1•michal-franc•1h ago•0 comments

Tell HN: We've come a long way since GPT-3 days

3•behnamoh•9mo ago
I remember the old days when the only open-weight model out there was BLOOM, a 176B-parameter model WITHOUT QUANTIZATION that wasn't comparable to GPT-3 but still gave us hope that the future would be bright!

I remember when the local AI community was just a few thousand enthusiasts who were curious about these new language models. We used to sit on the sidelines and watch OpenAI make strides with their giant models, and our wish was to bring at least some of that power to our measly little machines, locally.

Then Meta's Llama-1 leak happened, and it opened Pandora's box for AI. Was it better than GPT-3.5? Not really, but it kick-started the push toward small, capable models. Llama.cpp was a turning point: people figured out how to run LLMs on the CPU.

Then the community came up with GGML quants (a format later superseded by GGUF), making models even more accessible to the masses. Several companies joined the race to AGI: Mistral, with their Mistral-7B and Mixtral models, really brought more performance to small models and opened our eyes to the power of MoE (mixture of experts).
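The core idea behind those quants is simple enough to sketch: store weights in a few bits each, with one float scale per block. This is a toy symmetric 4-bit scheme for illustration only, not the actual GGML/GGUF format (the block size and the symmetric [-7, 7] range are my assumptions):

```python
import numpy as np

def quantize_blocks(weights, block_size=32):
    """Toy symmetric 4-bit block quantization: one float scale per block."""
    blocks = weights.reshape(-1, block_size)
    # Per-block scale maps the largest magnitude onto the 4-bit range [-7, 7].
    scales = np.abs(blocks).max(axis=1, keepdims=True) / 7.0
    scales[scales == 0] = 1.0  # avoid division by zero on all-zero blocks
    q = np.clip(np.round(blocks / scales), -7, 7).astype(np.int8)
    return q, scales

def dequantize_blocks(q, scales):
    return (q.astype(np.float32) * scales).reshape(-1)

rng = np.random.default_rng(0)
w = rng.standard_normal(1024).astype(np.float32)
q, s = quantize_blocks(w)
w_hat = dequantize_blocks(q, s)
# 4 bits per weight (plus one scale per block) instead of 32 bits:
# roughly 8x smaller, at the cost of a small reconstruction error.
print(np.abs(w - w_hat).max())
```

The real formats pack two 4-bit values per byte and use more elaborate block layouts, but the trade is the same: a large memory saving for a bounded per-block error.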

Many models and finetunes kept popping up. TheBloke tirelessly provided quants of all of them. Then one day they went silent and we never heard from them again (I hope they're OK).

You could tell this was mostly an enthusiasts' hobby by looking at the names of the projects! The one that was really out there was "oobabooga." The thing was actually called "Text Generation Web UI," but everyone kept calling it ooba or oobabooga (its creator's username).

Then came the greed... Companies figured out there was potential here, so they built new language models for their own bottom-line reasons. It didn't matter to us, since we kept getting good models for free (although some licenses were restrictive, and we ignored those models).

When we found out about LoRA and QLoRA, it was a game changer. So many people finetuned models for various purposes. I kept asking: do you guys really use these for role-playing? It turns out yes, many people liked the idea of talking to various AI personas. Soon people figured out how to bypass guardrails with prompt-injection attacks and other techniques.

Now, three years later, we have dozens of open-weight models. I say open-WEIGHT because I think I've only seen one or two truly open-SOURCE models. I've seen many open-source tools developed for and around these models, so many wrappers, so many apps. Most are abandoned now. I wonder if their developers realized they were in high demand and could have been paid for their hard work, had they not released everything out in the open.

I remember the GPT-4 era: so many papers and models started appearing in my feed. It was so overwhelming that I started to think, "is this what the singularity feels like?" I know we're nowhere near the singularity, but the pace of advancement in this field, and the need to keep yourself updated at all times, has truly been amazing! OpenAI used to say they didn't open-source GPT-3 because it was "too dangerous" for society. We now have far more capable open-weight models that make GPT-3 look like a toy, and guess what: no harm came to society. Business as usual.

A question we kept getting was: "can this 70B model run on my 3090?" Clearly the appeal of running LLMs locally was great, as GPU prices made obvious. I remain hopeful that Nvidia's monopoly will collapse and we'll get more competitive prices and products from AMD, Intel, Apple, and others.
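The back-of-the-envelope math behind that question is easy to spell out. This counts the weights only; the KV cache and activation overhead are ignored, so real requirements run higher:

```python
def weight_memory_gb(n_params, bits_per_weight):
    """Approximate memory for the model weights alone, in gigabytes."""
    return n_params * bits_per_weight / 8 / 1e9

# A 70B model against the 24 GB of an RTX 3090:
for bits in (16, 8, 4, 3):
    gb = weight_memory_gb(70e9, bits)
    fits = "fits" if gb <= 24 else "does not fit"
    print(f"{bits}-bit: {gb:.0f} GB -> {fits} in 24 GB")
```

Even aggressive 3-bit quantization leaves a 70B model above 24 GB, which is why the usual answer involved partial CPU offload or a second GPU.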

I appreciate everyone who taught me something new about LLMs and everything related to them. It's been a journey.