frontpage.

Start all of your commands with a comma

https://rhodesmill.org/brandon/2009/commands-with-comma/
208•theblazehen•2d ago•62 comments

OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
685•klaussilveira•15h ago•204 comments

The Waymo World Model

https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-frontier-for-autonomous-driving-simula...
959•xnx•20h ago•553 comments

Unseen Footage of Atari Battlezone Arcade Cabinet Production

https://arcadeblogger.com/2026/02/02/unseen-footage-of-atari-battlezone-cabinet-production/
65•videotopia•4d ago•3 comments

How we made geo joins 400× faster with H3 indexes

https://floedb.ai/blog/how-we-made-geo-joins-400-faster-with-h3-indexes
126•matheusalmeida•2d ago•35 comments

Jeffrey Snover: "Welcome to the Room"

https://www.jsnover.com/blog/2026/02/01/welcome-to-the-room/
28•kaonwarb•3d ago•23 comments

Vocal Guide – belt sing without killing yourself

https://jesperordrup.github.io/vocal-guide/
44•jesperordrup•5h ago•23 comments

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
236•isitcontent•15h ago•26 comments

Monty: A minimal, secure Python interpreter written in Rust for use by AI

https://github.com/pydantic/monty
230•dmpetrov•15h ago•122 comments

Show HN: I spent 4 years building a UI design tool with only the features I use

https://vecti.com
334•vecti•17h ago•146 comments

Where did all the starships go?

https://www.datawrapper.de/blog/science-fiction-decline
26•speckx•3d ago•14 comments

Hackers (1995) Animated Experience

https://hackers-1995.vercel.app/
499•todsacerdoti•23h ago•244 comments

Sheldon Brown's Bicycle Technical Info

https://www.sheldonbrown.com/
384•ostacke•21h ago•97 comments

ga68, the GNU Algol 68 Compiler – FOSDEM 2026 [video]

https://fosdem.org/2026/schedule/event/PEXRTN-ga68-intro/
7•matt_d•3d ago•2 comments

Microsoft open-sources LiteBox, a security-focused library OS

https://github.com/microsoft/litebox
360•aktau•21h ago•183 comments

Show HN: If you lose your memory, how to regain access to your computer?

https://eljojo.github.io/rememory/
295•eljojo•18h ago•186 comments

An Update on Heroku

https://www.heroku.com/blog/an-update-on-heroku/
420•lstoll•21h ago•280 comments

PC Floppy Copy Protection: Vault Prolok

https://martypc.blogspot.com/2024/09/pc-floppy-copy-protection-vault-prolok.html
66•kmm•5d ago•10 comments

Dark Alley Mathematics

https://blog.szczepan.org/blog/three-points/
95•quibono•4d ago•22 comments

Was Benoit Mandelbrot a hedgehog or a fox?

https://arxiv.org/abs/2602.01122
21•bikenaga•3d ago•11 comments

How to effectively write quality code with AI

https://heidenstedt.org/posts/2026/how-to-effectively-write-quality-code-with-ai/
262•i5heu•18h ago•210 comments

Delimited Continuations vs. Lwt for Threads

https://mirageos.org/blog/delimcc-vs-lwt
33•romes•4d ago•3 comments

Female Asian Elephant Calf Born at the Smithsonian National Zoo

https://www.si.edu/newsdesk/releases/female-asian-elephant-calf-born-smithsonians-national-zoo-an...
38•gmays•10h ago•13 comments

Introducing the Developer Knowledge API and MCP Server

https://developers.googleblog.com/introducing-the-developer-knowledge-api-and-mcp-server/
61•gfortaine•12h ago•26 comments

I now assume that all ads on Apple news are scams

https://kirkville.com/i-now-assume-that-all-ads-on-apple-news-are-scams/
1074•cdrnsf•1d ago•460 comments

Understanding Neural Network, Visually

https://visualrambling.space/neural-network/
294•surprisetalk•3d ago•44 comments

I spent 5 years in DevOps – Solutions engineering gave me what I was missing

https://infisical.com/blog/devops-to-solutions-engineering
152•vmatsiiako•20h ago•72 comments

The AI boom is causing shortages everywhere else

https://www.washingtonpost.com/technology/2026/02/07/ai-spending-economy-shortages/
13•1vuio0pswjnm7•1h ago•0 comments

Why I Joined OpenAI

https://www.brendangregg.com/blog/2026-02-07/why-i-joined-openai.html
158•SerCe•11h ago•144 comments

Learning from context is harder than we thought

https://hy.tencent.com/research/100025?langVersion=en
187•limoce•3d ago•103 comments

New analog chip capable of outperforming top-end GPUs by as much as 1000x

https://www.livescience.com/technology/computing/china-solves-century-old-problem-with-new-analog-chip-that-is-1-000-times-faster-than-high-end-nvidia-gpus
79•mrbluecoat•3mo ago
Study: https://www.nature.com/articles/s41928-025-01477-0

Comments

alexnewman•3mo ago
What’s this good for?
andrewstuart•3mo ago
Fear
falseprofit•3mo ago
Computing, they propose
teruakohatu•3mo ago
Faster than an H100 for solving 128x128 matrices. But it’s not clear to me how they tested this, code is only available on request.

> We have described a high-precision and scalable analogue matrix equation solver. The solver involves low-precision matrix operations, which are suited well to RRAM-based computing. The matrix operations were implemented with a foundry-developed 40-nm 1T1R RRAM array with 3-bit resolution. Bit-slicing was used to guarantee the high precision. Scalability was addressed through the BlockAMC algorithm, which was experimentally demonstrated. A 16 × 16 matrix inversion problem was solved with the BlockAMC algorithm with 24-bit fixed-point precision. The analogue solver was also applied to the detection process in massive MIMO systems and showed identical BER performance within only three iterative cycles compared with digital counterparts for 128 × 8 systems with 256-QAM modulation.
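
A toy sketch of the bit-slicing idea as I read it from that abstract (not the authors' code, which is available only on request; the slice width and count here are made up): split a higher-precision integer matrix into 3-bit slices, run each slice through the low-precision analog multiply, and recombine the partial products digitally with power-of-two weights.

    import numpy as np

    # Hypothetical parameters: 4 slices x 3 bits = 12-bit weights.
    BITS = 3
    SLICES = 4

    def slice_matrix(A):
        # Split a non-negative integer matrix into 3-bit slices, LSB first.
        mask = (1 << BITS) - 1
        return [(A >> (s * BITS)) & mask for s in range(SLICES)]

    def analog_matvec(slice_, x):
        # Stand-in for one low-precision multiply on the RRAM array.
        return slice_ @ x

    rng = np.random.default_rng(1)
    A = rng.integers(0, 1 << (BITS * SLICES), size=(8, 8))
    x = rng.integers(0, 16, size=8)

    # Recombine per-slice partial products with power-of-two weights.
    y = sum(analog_matvec(s, x) << (i * BITS)
            for i, s in enumerate(slice_matrix(A)))

    assert np.array_equal(y, A @ x)  # matches the full-precision product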

alyxya•3mo ago
This looks like one of many ideas for more efficient compute chips for machine learning. I'm waiting for the day some chip gets mass-produced and works at scale for some large model with sufficient reliability; until then, I don't think there's anything particularly newsworthy here. I do think some alternative computing paradigm to the GPU will eventually succeed, maybe within a decade, but the analog chip in the article only seems to be a research prototype for now.
drnick1•3mo ago
Seems a bit too good to be true.
gnarlouse•3mo ago
Huge if true, room temperature semiconductor if false
makapuf•3mo ago
Semi or supra conductor ?
generuso•3mo ago
The idea was always appealing, but the implementation has always remained challenging.

For over a decade, "Mythic AI" was making accelerator chips with analog multipliers based on research by Laura Fick and coworkers. They raised $165M and produced actual hardware, but at the end of 2022 they almost went bankrupt, and very little has been heard from them since.

Much earlier, the legendary chip designers Federico Faggin and Carver Mead founded Synaptics with the idea of making neuromorphic chips that would be fast and power-efficient by harnessing analog computation. Carver Mead published a book on it in 1989, "Analog VLSI and Neural Systems", but making working chips turned out to be too hard, and Synaptics successfully pivoted to touchpads and later many other types of hardware.

Of course, the concept can be traced back to something even older and more legendary: Frank Rosenblatt's "Perceptron", the original machine learning system from the 1950s. It implemented the weights of the neural network as variable resistors that were adjusted by little motors during training. Multiplication was simply input voltage times the conductance of the resistor, producing a current -- which is what all the newer systems are also trying to exploit.
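
That voltage-times-conductance trick is the whole idea in miniature; a minimal NumPy sketch of the physics (illustrative values, not any particular chip):

    import numpy as np

    # Weights are stored as conductances G[i][j]; driving the rows with
    # voltages V[i] makes each column sum its currents, so the column
    # current vector is I[j] = sum_i V[i] * G[i][j] (Ohm + Kirchhoff).
    # The matrix-vector product falls out of the circuit "for free".
    rng = np.random.default_rng(0)
    G = rng.uniform(1e-6, 1e-4, size=(4, 3))  # conductances in siemens
    V = rng.uniform(0.0, 0.5, size=4)         # input voltages in volts

    I = V @ G  # column currents = analog dot products
    print(I)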

rasz•3mo ago
I know of only one real-world successful product using analog computation in place of an expensive high-end micro. It was the first proper optical mouse (no dedicated special mousepads), designed and built by HP->Agilent->Avago and released by Microsoft in 1999 as the IntelliMouse Optical. https://gizmodo.com/20-years-ago-microsoft-changed-how-we-mo... Afaik Microsoft bought one year of exclusivity for the sensor. The Avago HDNS-2000 chip did all the heavy lifting in the analog domain.

Travis Blalock Oral History https://www.youtube.com/watch?v=wmqa9XJED-Q https://archive.computerhistory.org/resources/access/text/20...:

"each array element had nearest neighbor connectivity so you would calculate nine correlations, an autocorrelation and eight cross-correlations, with each of your eight nearest neighbors, the diagonals and the perpendicular, and then you could interpolate in correlation space where the best fit was."

"And the reason we did difference squared instead of multiplication is because in the analog domain I could implement a difference-squared circuit with six transistors and so I was like “Okay, six transistors. I can’t do multiplication that cheaply so sold, difference squared, that’s how we’re going to do it.”

"little chip running in the 0.8 micron CMOS could do the equivalent operations per second to 1-1/2 giga operations per second and it was doing this for under 200 milliwatts, nothing you could have approached at that time in the digital domain."

Extra Oral History with inventor of the sensor Gary Gordon: https://www.youtube.com/watch?v=TxxoWhCzIeU

physarum_salad•3mo ago
The optical mouse is a great example. There are also lots of pre-90s examples, of course, such as in military applications.

One of the reasons analog fails to compete is that all computers are ultimately physical computers. Digital logic is built on one of the greatest analog components ever discovered, so when you do analog AI you are really competing with the physics of the transistor. Digital computation is the complex icing on top of an analog cake.

smartbit•3mo ago
The idea of analog neural networks is appealing. I bought "Analog VLSI and Neural Systems" in 1989 and still have it as a trophy on my bookshelves. My gut feeling says analog neural networks will be a thing one day, if only for their considerably lower power consumption.

I'm not saying that life is analog -- DNA is two bits. IMHO life is a mix of analog and digital.

pk-protect-ai•3mo ago
It is very difficult to scale digital-analog hybrids because of the number of DAC and ADC components required.
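
A back-of-envelope sketch of that converter problem (my illustration, all parameters hypothetical): an N x N crossbar needs roughly one DAC per input row and one ADC per output column, and reading an N-term dot product of b-bit values exactly needs about 2b + log2(N) bits of ADC resolution, so both converter count and converter precision grow with the array.

    import math

    def adc_bits_exact(n, input_bits, weight_bits):
        # ADC resolution needed to read an n-term dot product of
        # input_bits x weight_bits products without clipping or rounding.
        return input_bits + weight_bits + math.ceil(math.log2(n))

    for n in (16, 128, 1024):
        print(f"N={n:5d}: ~{n} DACs, ~{n} ADCs, "
              f"{adc_bits_exact(n, 4, 4)}-bit ADC for exact readout")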
ConteMascetti71•3mo ago
If the signal path is all analog, why not use analog multiplier cells (operational amplifiers)?
Archit3ch•3mo ago
Now put it in a guitar pedal!
smitty1e•3mo ago
Wo Fat has you covered: "Analog Man" => https://open.spotify.com/track/6KcM6et6Pn6UIna1o8Vl07?si=qFu...
xeonmc•3mo ago
Wonderful, can’t wait to run Crysis with this chip.
rapjr9•3mo ago
This group has had some success turning machine learning algorithms into low power analog chips:

https://sites.dartmouth.edu/odame/

Not the same as general purpose training type computations though.

vivzkestrel•3mo ago
But what do we do about bottleneck operating systems like Windows 11? You can give them a chip 10000x faster, but they find ways to add more telemetry and more bloat, rendering those gains meaningless. Let us talk about this from the perspective of a gamer, a guy who depends solely on Windows for Visual C++, the .NET SDK, etc. (versions of these go back all the way to the 2000s). We need an OS capable of running games all the way from good old Quake 2 to modern titles, but the GPU isn't the bottleneck anymore.
physarum_salad•3mo ago
Device-to-device variability is not considered? That is a huge problem in analog computing.
hossbeast•3mo ago
Can it run doom?
slater•3mo ago
yes, but Vimeo videos will still chug on it