Hmm... what? So it is not accurate?
However, a single analog math operation requires about the same energy as a single bit flip in a digital computer, and it takes a lot of bit flips to do a single floating-point operation. So a digital calculation can be approximated with far less energy and hardware. And neural nets don't need digital precision to produce useful results.
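A rough back-of-the-envelope in Python of what that gap could look like; the per-operation energy figures here are assumed, order-of-magnitude numbers for illustration, not measurements from any particular system:

```python
# Back-of-the-envelope: energy per token for a matrix-multiply-dominated model,
# digital FP32 vs. an analog MAC costing roughly one bit flip's worth of energy.
# Both per-operation energies below are assumptions for illustration only.

DIGITAL_FP32_MAC_J = 4e-12   # assumed: a few picojoules per 32-bit float multiply-accumulate
ANALOG_MAC_J       = 1e-15   # assumed: femtojoule-scale, comparable to a single bit flip

PARAMS = 7e9                 # a 7B-parameter model
macs_per_token = PARAMS      # roughly one MAC per weight per generated token

digital_j = macs_per_token * DIGITAL_FP32_MAC_J
analog_j  = macs_per_token * ANALOG_MAC_J

print(f"digital: {digital_j * 1e3:.1f} mJ/token")
print(f"analog:  {analog_j * 1e3:.4f} mJ/token")
print(f"ratio:   {digital_j / analog_j:.0f}x (under these assumptions)")
```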
The point - as shown by the original implementation...
I think they are just saying the coprocessor is pretty accurate, so they don’t need to use these advanced techniques.
Maybe not the specific photonic system they are describing, which I'm sure has some significant improvements over what existed then. But the core idea is the same: using analog approximations of existing neural net AI models so we can run them far more cheaply, with far less energy.
Whether or not this system is the one that wins out, I'm very sure that AI run on analog hardware will have a very important role to play in the future. It will enable things like guiding autonomous robots with AI models running on hardware inside the robot itself.
I remember a TV program in the UK from the '70s (Tomorrow's World, I think) that talked about this, so I am guessing silicon was just more cost-effective until now. Still, taking it at face value, I would say it is quite an exciting technology.
All addition, multiplication, and tanh functions will be done by photon superposition/interference effects, and it will consume zero power (since it's only a complex "lens").
It will probably do parallel computations where each photon frequency range will not interfere with other ranges, allowing multiple "inferences" to be "Shining Thru" simultaneously.
This design will completely solve the energy crisis, and each inference will take the same time as it takes light to travel a centimeter, i.e. essentially instantaneous.
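Setting aside the zero-power claim, the underlying point — that a neural net layer only has to survive some analog imprecision, not reproduce the digital result exactly — is easy to sketch. A toy NumPy simulation, where the optical matrix-vector product is modeled as the exact product plus noise at a made-up level:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy "layer": y = tanh(W @ x). In a photonic system the matrix-vector
# product would be done by interference; here it is modeled as the exact
# product plus additive noise at an assumed (made-up) relative level.
W = rng.normal(size=(256, 256)) / 16.0
x = rng.normal(size=256)

exact = np.tanh(W @ x)

def noisy_analog_layer(W, x, relative_noise=0.01):
    y = W @ x
    y = y + relative_noise * np.abs(y).mean() * rng.normal(size=y.shape)
    return np.tanh(y)

approx = noisy_analog_layer(W, x)
print("mean |error|:", np.mean(np.abs(approx - exact)))  # small compared to tanh's [-1, 1] range
```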
Has anyone built a physical ASIC that embeds a full model yet?
- Is the analog computation actually done with light? What's the actual compute element like? Do they have an analog photonic multiplier? Those exist, and have been scaling up for a while.[1] The announcement isn't clear on how much compute is photonic. There are still a lot of digital components involved. Is it worth it to go D/A, generate light, do some photonic operations, go A/D, and put the bits back into memory? That's been the classic problem with photonic computing. Memory is really hard, and without memory, pretty soon you have to go back to a domain where you can store results. Pure photonic systems do exist, such as fiber optic cable amplifiers, but they are memoryless.
- If all this works, is loss of repeatability going to be a problem?
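A minimal sketch of those two questions together, assuming 8-bit DACs, a 12-bit ADC, and a made-up analog noise level — the interesting part is that the result comes back close to the digital answer but not bit-identical from run to run:

```python
import numpy as np

rng = np.random.default_rng(1)

def quantize(v, bits, full_scale=1.0):
    """Idealized DAC/ADC: clip to the full-scale range, round to 2**bits levels."""
    half = 2 ** bits / 2 - 1
    v = np.clip(v, -full_scale, full_scale)
    return np.round(v / full_scale * half) / half * full_scale

def analog_dot(w, x, dac_bits=8, adc_bits=12, adc_range=32.0, noise=0.05):
    """Dot product 'through the analog domain': DAC in, noisy accumulate, ADC out."""
    w_a = quantize(w, dac_bits)              # DAC: digital weights -> drive levels
    x_a = quantize(x, dac_bits)              # DAC: digital activations -> optical signal
    y = w_a @ x_a + noise * rng.normal()     # the photonic part, modeled as exact + assumed noise
    return quantize(y, adc_bits, adc_range)  # ADC back into bits so the result can be stored

w = rng.uniform(-1, 1, 512)
x = rng.uniform(-1, 1, 512)

print("exact digital:", w @ x)
print("analog runs:  ", [round(float(analog_dot(w, x)), 3) for _ in range(5)])
```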
When you press Win+E, Windows opens an Explorer window (in both versions).
In XP this happens in the span of a single video frame.
In Windows 10, first nothing happens, then a big white rectangle, then you get to watch all the UI elements get painted in one by one.
The really impressive part is that this was before they rewrote Explorer as an Electron app! I think it might actually be faster now that it's an Electron app.
ge96•7mo ago
I used to have this weird obsession with doing this: buying old Chromebooks and putting Linux on them. With 4GB of RAM they were still useful, but I realize that nowadays 16GB seems to be the minimum RAM for "ideal" computing.
ghusto•7mo ago
It's like saying "cars are already prohibitively expensive" whilst looking at Ferraris.
imiric•7mo ago
That's demonstrably false. The RTX 4090 was released in 2022 with an MSRP of $1,600. Today you'd be hard-pressed to find one below $3K that isn't a scam.
The reality is that NVIDIA is taking advantage of their market dominance to increase their markup with every generation of products[1], even when accounting for inflation and price-to-performance. The 50 series is even more egregious, since it delivers a marginal performance increase, yet the marketing relies heavily on frame generation. The trickling supply and scalpers are doing the rest.
AMD and Intel have a more reasonable pricing strategy, but they don't compete at the higher end.
[1]: https://www.digitaltrends.com/computing/nvidias-pricing-stra...
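For what it's worth, the inflation part of that claim is quick to sanity-check; the ~10% cumulative CPI figure for 2022 to now is an approximation, so treat it as an assumption:

```python
# Quick sanity check on the RTX 4090 numbers above.
msrp_2022 = 1600        # launch MSRP
street_today = 3000     # typical current listing, per the comment above
cpi_2022_to_now = 1.10  # assumed cumulative inflation since late 2022

adjusted_msrp = msrp_2022 * cpi_2022_to_now
print(f"inflation-adjusted MSRP: ~${adjusted_msrp:,.0f}")
print(f"street price premium:    ~{street_today / adjusted_msrp:.1f}x the adjusted MSRP")
```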
Animats•7mo ago
[1] https://www.msn.com/en-in/money/news/china-s-first-gaming-gp...