Show HN: I'm an airline pilot – I built interactive graphs/globes of my flights

https://jameshard.ing/pilot
1075•jamesharding•13h ago•163 comments

Normalizing Flows Are Capable Generative Models

https://machinelearning.apple.com/research/normalizing-flows
87•danboarder•6h ago•6 comments

Learn OCaml

https://ocaml-sf.org/learn-ocaml-public/#activity=exercises
69•smartmic•6h ago•20 comments

James Webb Space Telescope Reveals Its First Direct Image of an Exoplanet

https://www.smithsonianmag.com/smart-news/james-webb-space-telescope-reveals-its-first-direct-image-discovery-of-an-exoplanet-180986886/
131•divbzero•9h ago•60 comments

SymbolicAI: A neuro-symbolic perspective on LLMs

https://github.com/ExtensityAI/symbolicai
120•futurisold•8h ago•35 comments

Structuring Arrays with Algebraic Shapes

https://dl.acm.org/doi/abs/10.1145/3736112.3736141
65•todsacerdoti•6h ago•5 comments

C compiler for Web Assembly (c4wa)

https://github.com/kign/c4wa
10•90s_dev•3d ago•0 comments

Reinforcement learning, explained with a minimum of math and jargon

https://www.understandingai.org/p/reinforcement-learning-explained
51•JnBrymn•3d ago•1 comment

Multi-Stage Programming with Splice Variables

https://tsung-ju.org/icfp25/
16•matt_d•3h ago•1 comment

Qwen VLo: From "Understanding" the World to "Depicting" It

https://qwenlm.github.io/blog/qwen-vlo/
170•lnyan•12h ago•50 comments

10 Years of Pomological Watercolors

https://parkerhiggins.net/2025/04/10-years-of-pomological-watercolors/
171•fanf2•12h ago•28 comments

nimbme – Nim bare-metal environment

https://github.com/mikra01/nimbme
44•michaelsbradley•8h ago•9 comments

bootc-image-builder: Build your entire OS from a Containerfile

https://github.com/osbuild/bootc-image-builder
32•twelvenmonkeys•3d ago•7 comments

Transmitting data via ultrasound without any special equipment

https://halcy.de/blog/2025/06/27/transmitting-data-via-ultrasound-without-any-special-equipment/
98•todsacerdoti•9h ago•31 comments

Theoretical Analysis of Positional Encodings in Transformer Models

https://arxiv.org/abs/2506.06398
15•PaulHoule•4h ago•1 comment

Facebook is starting to feed its AI with private, unpublished photos

https://www.theverge.com/meta/694685/meta-ai-camera-roll
67•pier25•2h ago•42 comments

Rust in the Linux kernel: part 2

https://lwn.net/SubscriberLink/1025232/fbb2d90d084368e3/
71•chmaynard•4h ago•2 comments

Spark AI (YC W24) is hiring a full-stack engineer in SF (founding team)

https://www.ycombinator.com/companies/spark/jobs/kDeJlPK-software-engineer-full-stack-founding-team
1•juliawu•5h ago

New Process Uses Microbes to Create Valuable Materials from Urine

https://newscenter.lbl.gov/2025/06/17/new-process-uses-microbes-to-create-valuable-materials-from-urine/
26•gmays•8h ago•5 comments

The Journey of Bypassing Ubuntu's Unprivileged Namespace Restriction

https://u1f383.github.io/linux/2025/06/26/the-journey-of-bypassing-ubuntus-unprivileged-namespace-restriction.html
14•Bogdanp•5h ago•1 comment

Weird Expressions in Rust

https://www.wakunguma.com/blog/rust-weird-expr
142•lukastyrychtr•11h ago•111 comments

A Brief History of Children Sent Through the Mail (2016)

https://www.smithsonianmag.com/smart-news/brief-history-children-sent-through-mail-180959372/
84•m-hodges•6h ago•75 comments

Whitesmiths C compiler: One of the earliest commercial C compilers available

https://github.com/hansake/Whitesmiths-C-compiler
96•todsacerdoti•4d ago•31 comments

Glass nanostructures reflect nearly all visible light, challenging assumptions

https://phys.org/news/2025-06-glass-nanostructures-visible-photonics-assumptions.html
26•bookofjoe•3d ago•4 comments

Does a Focus on Royalty Obscure British History?

https://www.historytoday.com/archive/head-head/does-focus-royalty-obscure-british-history
16•pepys•3d ago•5 comments

A New Kind of Computer (April 2025)

https://lightmatter.co/blog/a-new-kind-of-computer/
41•gkolli•3d ago•17 comments

Parameterized types in C using the new tag compatibility rule

https://nullprogram.com/blog/2025/06/26/
130•ingve•21h ago•64 comments

Slightly better named character reference tokenization than Chrome, Safari, FF

https://www.ryanliptak.com/blog/better-named-character-reference-tokenization/
48•todsacerdoti•1d ago•8 comments

PJ5 TTL CPU

https://pj5cpu.wordpress.com/
81•doener•19h ago•2 comments

Project Vend: Can Claude run a small shop? (And why does that matter?)

https://www.anthropic.com/research/project-vend-1
203•gk1•10h ago•86 comments

A New Kind of Computer (April 2025)

https://lightmatter.co/blog/a-new-kind-of-computer/
41•gkolli•3d ago

Comments

croemer•6h ago
I stopped reading after "Soon, you will not be able to afford your computer. Consumer GPUs are already prohibitively expensive."
kevin_thibedeau•6h ago
This is always a hilarious take. If you inflation-adjust a 386 PC from the early '90s, when 486s were on the market, you'd find they ranged in excess of $3,000, and the 486s were in the $5,000 zone. Computers are incredibly cheap now. What isn't cheap is the bleeding edge, a place fewer and fewer people have to be at, which leads to lower demand and higher prices to compensate.
ge96•5h ago
It is crazy that you can buy a used laptop for $15 and do something meaningful with it, like writing code (meaningful as in making money).

I used to have this weird obsession with doing this: buying old Chromebooks and putting Linux on them. With 4GB of RAM they were still useful, but I realize that nowadays, for "ideal" computing, 16GB seems to be the minimum for RAM.

ge96•4h ago
It's like the black MacBook from 2007: I know its tech is outdated, but I want it.
TedDallas•5h ago
It was kind of that way in the early days of high-end personal computing. I remember seeing an ad in the early '90s for a 486 laptop that cost $6,000. Historically, prices have always gone down; you just have to wait. SOTA is always going to go for a premium.
ghusto•5h ago
That irked me too. "Bleeding edge consumer GPUs are ...", sure, but wait 6 months and you have it at a fraction of the cost.

It's like saying "cars are already prohibitively expensive" whilst looking at Ferraris.

Animats•5h ago
That's related more to Nvidia's discovery that they could get away with huge margins, and to China's GPU projects for graphics being years behind.[1]

[1] https://www.msn.com/en-in/money/news/china-s-first-gaming-gp...

Anduia•6h ago
> Critically, this processor achieves accuracies approaching those of conventional 32-bit floating-point digital systems “out-of-the-box,” without relying on advanced methods such as fine-tuning or quantization-aware training.

Hmm... what? So it is not accurate?

btilly•5h ago
It's an analog system, which means that accuracy is naturally limited.

However, a single analog math operation requires about the same energy as a single bit flip in a digital computer, and it takes a lot of bit flips to do a single floating-point operation. So a digital calculation can be approximated with far less energy and hardware. And neural nets don't need digital precision to produce useful results.
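That precision point can be sketched numerically. A toy numpy example (the 8-bit resolution and 1% read-noise figures are illustrative assumptions, not numbers from the article):

```python
import numpy as np

# Toy illustration: a neural-net-style matrix-vector product
# tolerates the limited precision of an analog substrate.
rng = np.random.default_rng(0)
W = rng.normal(size=(10, 64))   # hypothetical layer weights
x = rng.normal(size=64)         # hypothetical activations

exact = W @ x                   # full digital precision

def quantize(a, bits=8):
    """Snap values to ~`bits` of resolution, mimicking a DAC."""
    step = np.max(np.abs(a)) / (2 ** (bits - 1) - 1)
    return np.round(a / step) * step

# Model the analog MAC: quantized operands plus a little read noise.
noisy = quantize(W) @ quantize(x)
noisy += rng.normal(scale=0.01 * np.abs(noisy).max(), size=noisy.shape)

rel_err = np.abs(noisy - exact).max() / np.abs(exact).max()
print(f"worst-case relative error: {rel_err:.3f}")  # small: a few percent
```

The outputs land within a few percent of the full-precision result, which is typically harmless for a network's final decision.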

B1FF_PSUVM•5h ago
> neural nets don't need digital precision to produce useful results.

The point - as shown by the original implementation...

bee_rider•4h ago
It seems weirdly backwards. They don't apply techniques like quantization-aware training to increase the accuracy of the coprocessor, right? (I mean, that's nonsense.) They use those techniques to let them get away with less accurate coprocessors, I thought.

I think they are just saying the coprocessor is pretty accurate, so they don't need to use these advanced techniques.

btilly•6h ago
This computing paradigm was already covered three years ago by Veritasium in https://www.youtube.com/watch?v=GVsUOuSjvcg.

Maybe not the specific photonic system that they are describing, which I'm sure has some significant improvements over what existed then. But the idea is the same: use analog approximations of existing neural-net AI models to run them far more cheaply, with far less energy.

Whether or not this system is the one that wins out, I'm very sure that AI run on analog systems will have a very important role to play in the future. It will enable technologies like guiding autonomous robots with AI models running on hardware inside the robot.

boznz•5h ago
Weirdly complex to read yet light on key technical details. My TL;DR (as an old, clueless electronics engineer): the compute part is photonic/analog (lasers and waveguides), yet we still need 50 billion transistors for the (I guess non-compute) parts such as ADCs, I/O, memory, etc. The bottom line is 65 TOPS for <80 W, with the optical processing part consuming 1.65 W and the "helper electronics" consuming the rest, so scaling the optical processing should not hit the thermal bottlenecks of a solely transistor-based processor. Parallelism of the optical part, using different wavelengths of light as threads, may also be possible. Nothing about problems, costs, or whether the helper electronics can eventually use photonics too.

I remember a TV programme in the UK from the '70s (Tomorrow's World, I think) that talked about this, so I am guessing silicon was just more cost-effective until now. Still, taking it at face value, I would say it is quite an exciting technology.
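Taking the figures in the comment above at face value, the efficiency split is easy to quantify (a back-of-envelope sketch using only the quoted numbers):

```python
# Back-of-envelope using only the figures quoted above:
# 65 TOPS total, <80 W board power, 1.65 W for the optical compute.
tops = 65.0
total_w = 80.0      # upper bound on whole-board power
optical_w = 1.65    # optical processing alone

print(tops / total_w)    # ~0.81 TOPS/W system-wide
print(tops / optical_w)  # ~39 TOPS/W for the optical part alone
```

The roughly 50x gap between the two ratios is the commenter's point: the helper electronics, not the optics, dominate the power budget.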

quantadev•5h ago
In 25 years we'll have #GlassModels: a "chip" that is a passive device (just a complex lens) made only of glass or graphene, which can do an "AI inference" simply by shining the "input tokens" (i.e. arrays of photons) through it. In other words, the "numeric value" at one MLP "neuron input" will be the amplitude of the light (the number of simultaneous photons).

All addition, multiplication, and tanh functions will be done by photon superposition/interference effects, and it will consume zero power (since it's only a complex "lens").

It will probably do parallel computations where each photon frequency range will not interfere with other ranges, allowing multiple "inferences" to be "Shining Thru" simultaneously.

This design will completely solve the energy crisis, and each inference will take only the time it takes light to travel a centimeter, i.e. essentially instantaneous.
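One caveat worth illustrating: superposition and interference in a lossless passive element are linear effects, so mathematically the "lens" can only apply an energy-conserving (unitary) transform; the tanh-style nonlinearity has to come from somewhere else. A minimal numpy sketch, with a random unitary as a stand-in for the hypothetical lens's transfer matrix:

```python
import numpy as np

# Passive, lossless optics can only apply a linear, energy-conserving
# transform to the input amplitudes -- mathematically, a unitary matrix.
rng = np.random.default_rng(1)
M = rng.normal(size=(8, 8)) + 1j * rng.normal(size=(8, 8))
U, _ = np.linalg.qr(M)           # random unitary: stand-in for the "lens"

x = rng.normal(size=8) + 1j * rng.normal(size=8)  # input light amplitudes
y = U @ x                        # what the glass computes, "for free"

# Energy in equals energy out: zero power draw, but also no gain and
# no tanh -- the nonlinearity must come from a separate (active) stage.
print(np.isclose(np.linalg.norm(x), np.linalg.norm(y)))  # True
```

This is why existing photonic accelerators pair the passive optics with electronic or electro-optic stages for nonlinearities and readout.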

gcanyon•2h ago
For years I've been fascinated by those little solar-powered calculators. In a weird way, they're devices that enable us to cast hand shadows to do arithmetic.
quantadev•2h ago
Look up "Analog Optical Computing". There was a breakthrough just last week where optical computing researchers were able to use photon interference effects to do mathematical operations purely in analog! That means no 0s and 1s, just pure optics. Paste all that into Gemini to learn more.
Animats•5h ago
Interesting. Questions, the Nature paper being expensively paywalled:

- Is the analog computation actually done with light? What's the actual compute element like? Do they have an analog photonic multiplier? Those exist, and have been scaling up for a while.[1] The announcement isn't clear on how much compute is photonic. There are still a lot of digital components involved. Is it worth it to go D/A, generate light, do some photonic operations, go A/D, and put the bits back into memory? That's been the classic problem with photonic computing. Memory is really hard, and without memory, pretty soon you have to go back to a domain where you can store results. Pure photonic systems do exist, such as fiber optic cable amplifiers, but they are memoryless.

- If all this works, is loss of repeatability going to be a problem?

[1] https://ieeexplore.ieee.org/document/10484797