frontpage.

Show HN: LocalGPT – A local-first AI assistant in Rust with persistent memory

https://github.com/localgpt-app/localgpt
144•yi_wang•5h ago•45 comments

Haskell for all: Beyond agentic coding

https://haskellforall.com/2026/02/beyond-agentic-coding
67•RebelPotato•4h ago•16 comments

Bye Bye Humanity: The Potential AMOC Collapse

https://thatjoescott.com/2026/02/03/bye-bye-humanity-the-potential-amoc-collapse/
50•rolph•3h ago•38 comments

SectorC: A C Compiler in 512 bytes (2023)

https://xorvoid.com/sectorc.html
262•valyala•13h ago•51 comments

Total surface area required to fuel the world with solar (2009)

https://landartgenerator.org/blagi/archives/127
29•robtherobber•4d ago•21 comments

Software factories and the agentic moment

https://factory.strongdm.ai/
205•mellosouls•15h ago•355 comments

Speed up responses with fast mode

https://code.claude.com/docs/en/fast-mode
169•surprisetalk•12h ago•163 comments

LLMs as the new high level language

https://federicopereiro.com/llm-high/
72•swah•4d ago•125 comments

Brookhaven Lab's RHIC concludes 25-year run with final collisions

https://www.hpcwire.com/off-the-wire/brookhaven-labs-rhic-concludes-25-year-run-with-final-collis...
73•gnufx•11h ago•59 comments

Hoot: Scheme on WebAssembly

https://www.spritely.institute/hoot/
182•AlexeyBrin•18h ago•35 comments

Stories from 25 Years of Software Development

https://susam.net/twenty-five-years-of-computing.html
174•vinhnx•15h ago•17 comments

Vocal Guide – belt sing without killing yourself

https://jesperordrup.github.io/vocal-guide/
324•jesperordrup•23h ago•97 comments

Why there is no official statement from Substack about the data leak

https://techcrunch.com/2026/02/05/substack-confirms-data-breach-affecting-email-addresses-and-pho...
22•witnessme•2h ago•6 comments

First Proof

https://arxiv.org/abs/2602.05192
135•samasblack•15h ago•81 comments

Vouch

https://twitter.com/mitchellh/status/2020252149117313349
73•chwtutha•3h ago•17 comments

Wood Gas Vehicles: Firewood in the Fuel Tank (2010)

https://solar.lowtechmagazine.com/2010/01/wood-gas-vehicles-firewood-in-the-fuel-tank/
32•Rygian•2d ago•8 comments

Show HN: I saw this cool navigation reveal, so I made a simple HTML+CSS version

https://github.com/Momciloo/fun-with-clip-path
83•momciloo•12h ago•17 comments

The Architecture of Open Source Applications (Volume 1) Berkeley DB

https://aosabook.org/en/v1/bdb.html
6•grep_it•5d ago•0 comments

Al Lowe on model trains, funny deaths and working with Disney

https://spillhistorie.no/2026/02/06/interview-with-sierra-veteran-al-lowe/
106•thelok•14h ago•24 comments

Start all of your commands with a comma (2009)

https://rhodesmill.org/brandon/2009/commands-with-comma/
586•theblazehen•3d ago•212 comments

Show HN: A luma dependent chroma compression algorithm (image compression)

https://www.bitsnbites.eu/a-spatial-domain-variable-block-size-luma-dependent-chroma-compression-...
40•mbitsnbites•3d ago•5 comments

FDA intends to take action against non-FDA-approved GLP-1 drugs

https://www.fda.gov/news-events/press-announcements/fda-intends-take-action-against-non-fda-appro...
112•randycupertino•8h ago•238 comments

The AI boom is causing shortages everywhere else

https://www.washingtonpost.com/technology/2026/02/07/ai-spending-economy-shortages/
310•1vuio0pswjnm7•19h ago•494 comments

Learning from context is harder than we thought

https://hy.tencent.com/research/100025?langVersion=en
234•limoce•4d ago•125 comments

Where did all the starships go?

https://www.datawrapper.de/blog/science-fiction-decline
158•speckx•4d ago•242 comments

OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
906•klaussilveira•1d ago•277 comments

Microsoft account bugs locked me out of Notepad – Are thin clients ruining PCs?

https://www.windowscentral.com/microsoft/windows-11/windows-locked-me-out-of-notepad-is-the-thin-...
147•josephcsible•10h ago•186 comments

Selection rather than prediction

https://voratiq.com/blog/selection-rather-than-prediction/
35•languid-photic•4d ago•16 comments

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
304•isitcontent•1d ago•39 comments

An Update on Heroku

https://www.heroku.com/blog/an-update-on-heroku/
497•lstoll•1d ago•331 comments

New analog chip capable of outperforming top-end GPUs by as much as 1000x

https://www.livescience.com/technology/computing/china-solves-century-old-problem-with-new-analog-chip-that-is-1-000-times-faster-than-high-end-nvidia-gpus
79•mrbluecoat•3mo ago
Study: https://www.nature.com/articles/s41928-025-01477-0

Comments

alexnewman•3mo ago
What’s this good for?
andrewstuart•3mo ago
Fear
falseprofit•3mo ago
Computing, they propose
teruakohatu•3mo ago
Faster than an H100 for solving 128x128 matrices. But it’s not clear to me how they tested this; the code is only available on request.

> We have described a high-precision and scalable analogue matrix equation solver. The solver involves low-precision matrix operations, which are suited well to RRAM-based computing. The matrix operations were implemented with a foundry-developed 40-nm 1T1R RRAM array with 3-bit resolution. Bit-slicing was used to guarantee the high precision. Scalability was addressed through the BlockAMC algorithm, which was experimentally demonstrated. A 16 × 16 matrix inversion problem was solved with the BlockAMC algorithm with 24-bit fixed-point precision. The analogue solver was also applied to the detection process in massive MIMO systems and showed identical BER performance within only three iterative cycles compared with digital counterparts for 128 × 8 systems with 256-QAM modulation.
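The BlockAMC code itself is only available on request, so here is just a rough numerical sketch (my own illustration, not the paper's bit-slicing or BlockAMC algorithm) of the general principle the abstract describes: a crude, noisy inner solver can still produce a precise answer if a digital outer loop keeps refining the residual, in the spirit of classic iterative refinement.

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((128, 128)) + 128 * np.eye(128)   # well-conditioned test matrix
    b = rng.standard_normal(128)

    # Stand-in for the analog step: an approximate inverse applied with only a few
    # percent of per-element accuracy, loosely mimicking a low-precision RRAM array.
    M = np.linalg.inv(A).astype(np.float16).astype(np.float64)
    def analog_solve(r):
        return (M @ r) * (1 + rng.normal(0, 0.05, r.shape))

    x = np.zeros_like(b)
    for _ in range(3):                 # the paper reports ~3 iterative cycles for MIMO detection
        r = b - A @ x                  # residual computed digitally, at full precision
        x += analog_solve(r)           # crude analog-style correction
    print(np.linalg.norm(A @ x - b) / np.linalg.norm(b))   # relative residual drops by orders of magnitude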

alyxya•3mo ago
This looks like one of many ideas for more efficient compute chips for machine learning. I'm waiting for the day some chip gets mass produced and works at scale for some large model and with sufficient reliability, but until then, I don't think there's anything particularly newsworthy here. I do think it'll eventually happen at some point maybe within a decade, but surely some alternative computing paradigm to the GPU will succeed. The analog chip in the article only seems to be a research prototype for now.
drnick1•3mo ago
Seems a bit too good to be true.
gnarlouse•3mo ago
Huge if true, room temperature semiconductor if false
makapuf•3mo ago
Semi- or supraconductor?
generuso•3mo ago
The idea was always appealing, but the implementation has always remained challenging.

For over a decade, "Mythic AI" was making accelerator chips with analog multipliers based on research by Laura Fick and coworkers. They raised $165M and produced actual hardware, but at the end of 2022 they nearly went bankrupt, and very little has been heard from them since.

Much earlier, the legendary chip designers Federico Faggin and Carver Mead founded Synaptics with the idea of making neuromorphic chips that would be fast and power efficient by harnessing analog computation. Carver Mead published a book on that in 1989, "Analog VLSI and Neural Systems", but making working chips turned out to be too hard, and Synaptics successfully pivoted to touchpads and later many other types of hardware.

Of course, the concept can be traced back even further, to Frank Rosenblatt's still more legendary "Perceptron" -- the original machine learning system from the 1950s. It implemented the weights of the neural network as variable resistors that were adjusted by little motors during training. Multiplication was simply input voltage times the conductance of the resistor producing the current -- which is what all the newer systems are also trying to exploit.
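The "voltage times conductance" trick is just Ohm's law plus Kirchhoff's current law. A toy sketch (mine, not Rosenblatt's wiring) of how a resistive crossbar gets a matrix-vector product essentially for free:

    import numpy as np

    # Each cell multiplies its input voltage by its conductance (i = G * v, Ohm's law);
    # each column wire sums the cell currents (Kirchhoff's current law).
    rng = np.random.default_rng(0)
    G = rng.uniform(1e-6, 1e-4, size=(8, 16))   # conductances in siemens, acting as weights
    v = rng.uniform(0.0, 0.2, size=16)          # input voltages on the rows
    i_out = G @ v                               # column currents = weighted sums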

rasz•3mo ago
I know of only one real-world successful product using analog computation in place of an expensive high-end micro. It was the first proper optical mouse (no dedicated special mousepad required), designed and built by HP->Agilent->Avago and released by Microsoft in 1999 as the IntelliMouse Optical. https://gizmodo.com/20-years-ago-microsoft-changed-how-we-mo... Afaik Microsoft bought one year of exclusivity for the sensor. The Avago HDNS-2000 chip did all the heavy lifting in the analog domain.

Travis Blalock Oral History https://www.youtube.com/watch?v=wmqa9XJED-Q https://archive.computerhistory.org/resources/access/text/20...:

"each array element had nearest neighbor connectivity so you would calculate nine correlations, an autocorrelation and eight cross-correlations, with each of your eight nearest neighbors, the diagonals and the perpendicular, and then you could interpolate in correlation space where the best fit was."

"And the reason we did difference squared instead of multiplication is because in the analog domain I could implement a difference-squared circuit with six transistors and so I was like “Okay, six transistors. I can’t do multiplication that cheaply so sold, difference squared, that’s how we’re going to do it.”

"little chip running in the 0.8 micron CMOS could do the equivalent operations per second to 1-1/2 giga operations per second and it was doing this for under 200 milliwatts, nothing you could have approached at that time in the digital domain."

Extra Oral History with inventor of the sensor Gary Gordon: https://www.youtube.com/watch?v=TxxoWhCzIeU

physarum_salad•3mo ago
The optical mouse is a great example. There are lots of pre-90s examples too, of course, such as in military applications.

One of the reasons analog computing fails to compete is that all computers are, in the end, physical computers. Digital is still tethered to one of the greatest analog components ever discovered, so when you do analog AI you are really competing with the physics of the transistor. The digital computation is the complex icing on top of an analog cake.

smartbit•3mo ago
The idea of analog neural networks is appealing. I bought Analog VLSI and Neural Systems in 1989 and still have it as a trophy on my bookshelves. My gut feeling says one day analog neural networks will be a thing, if only because of their considerably lower power consumption.

I’m not saying that life is analog; DNA is two bits per base. IMHO life is a mix of analog and digital.

pk-protect-ai•3mo ago
It is very difficult to scale digital-analog hybrids because of the number of DAC and ADC components required.
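A back-of-envelope illustration of that point (the energy figures are assumptions, not from the thread): an N x N crossbar needs on the order of N DACs to drive the rows and N ADCs to read the columns, and the converters quickly dominate area and energy.

    # Illustrative only: assumed per-conversion energies, not measured figures.
    N = 1024
    dac_energy_pj, adc_energy_pj = 0.5, 5.0   # assumed pJ per 8-bit conversion
    energy_per_matvec_nj = N * (dac_energy_pj + adc_energy_pj) / 1e3
    print(f"{N} DACs + {N} ADCs, ~{energy_per_matvec_nj:.1f} nJ of conversion per matrix-vector product")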
ConteMascetti71•3mo ago
If the signal path is all analog, why not use analog multiplier cells (operational amplifiers)?
Archit3ch•3mo ago
Now put it in a guitar pedal!
smitty1e•3mo ago
Wo Fat has you covered: "Analog Man" => https://open.spotify.com/track/6KcM6et6Pn6UIna1o8Vl07?si=qFu...
xeonmc•3mo ago
Wonderful, can’t wait to run Crysis with this chip.
rapjr9•3mo ago
This group has had some success turning machine learning algorithms into low power analog chips:

https://sites.dartmouth.edu/odame/

Not the same as general purpose training type computations though.

vivzkestrel•3mo ago
But what do we do about bottleneck operating systems like Windows 11? You can give them a chip 10,000x faster, but they find a way to add more telemetry and more bloat, rendering those gains meaningless. Look at it from the perspective of a gamer, someone who depends solely on Windows for Visual C++, the .NET SDK, etc. (versions of these go back all the way to the 2000s). We need an OS capable of running games all the way from good old Quake 2 to modern titles, but the GPU isn't the bottleneck anymore.
physarum_salad•3mo ago
Device-to-device variability is not considered? This is a huge problem in analog computing.
hossbeast•3mo ago
Can it run doom?
slater•3mo ago
yes, but Vimeo videos will still chug on it