Hmm... what? So it is not accurate?
However, a single analog math operation requires the same energy as a single bit flip in a digital computer, and it takes a lot of bit flips to do a single floating-point operation. So a digital calculation can be approximated in analog with far less energy and hardware. And neural nets don't need digital precision to produce useful results.
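A quick numpy sketch of that precision point (the 8-bit quantizer and the sizes are just illustrative assumptions, not anyone's actual hardware): quantize both the weights and the input to 8 bits and the matrix product barely moves.

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.normal(size=(256, 256))   # stand-in weight matrix
    x = rng.normal(size=256)          # stand-in input vector

    def quantize(a, bits=8):
        # Symmetric uniform quantizer: snap values onto 2**bits levels.
        scale = np.abs(a).max() / (2 ** (bits - 1) - 1)
        return np.round(a / scale) * scale

    y_full = W @ x                    # full-precision reference
    y_q = quantize(W) @ quantize(x)   # low-precision, "analog-ish" version

    rel_err = np.linalg.norm(y_full - y_q) / np.linalg.norm(y_full)
    print(f"relative error at 8 bits: {rel_err:.4f}")  # ~1%: fine for a neural net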
The point - as shown by the original implementation...
I think they are just saying the coprocessor is pretty accurate, so they don’t need to use these advanced techniques.
Maybe not the specific photonic system they are describing, which I'm sure has some significant improvements over what existed then. But the idea of using analog approximations of existing neural-net AI models would let us run those models far more cheaply, with far less energy.
Whether or not this system is the one that wins out, I'm very sure that AI run on an analog system will have a very important role to play in the future. It will enable technologies like guiding autonomous robots with AI models running on hardware inside the robot.
I remember a TV programme in the UK from the '70s (Tomorrow's World, I think) that talked about this, so I'm guessing silicon was just more cost-effective until now. Still, taking it at face value, I'd say it's quite an exciting technology.
All addition, multiplication, and tanh functions will be done by photon superposition/interference effects, and it will consume zero power (since it's only a complex "lens").
It will probably do parallel computations where each photon frequency range will not interfere with other ranges, allowing multiple "inferences" to be "Shining Thru" simultaneously.
This design will completely solve the energy crisis, and each inference will take the time it takes light to travel a centimeter, i.e. essentially instantaneous.
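If you want to poke at that intuition, here's a toy numpy model of the wavelength-parallel part (all my own assumptions, not anyone's actual design): the "lens" is one fixed transmission matrix, each wavelength carries its own input vector, and one batched matmul plays the role of all the channels passing through at once.

    import numpy as np

    rng = np.random.default_rng(1)
    n_wavelengths = 8    # independent "colors" sharing the same optics
    dim = 64

    # The fixed optical element: one transmission matrix seen by every channel.
    lens = rng.normal(size=(dim, dim)) / np.sqrt(dim)

    # One input vector per wavelength, all "shining through" simultaneously.
    inputs = rng.normal(size=(n_wavelengths, dim))
    outputs = np.tanh(inputs @ lens.T)   # one batched pass = 8 parallel inferences
    print(outputs.shape)                 # (8, 64)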
- Is the analog computation actually done with light? What's the actual compute element like? Do they have an analog photonic multiplier? Those exist, and have been scaling up for a while.[1] The announcement isn't clear on how much compute is photonic. There are still a lot of digital components involved. Is it worth it to go D/A, generate light, do some photonic operations, go A/D, and put the bits back into memory? That's been the classic problem with photonic computing. Memory is really hard, and without memory, pretty soon you have to go back to a domain where you can store results. Pure photonic systems do exist, such as fiber optic cable amplifiers, but they are memoryless.
- If all this works, is loss of repeatability going to be a problem? (A toy sketch of both questions follows.)
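To make both questions concrete, here's a hypothetical round-trip model in numpy (my guesses about the stages, not their architecture): DAC-quantize the input, do the "photonic" matmul with a little analog noise, ADC-quantize the result back into memory, then run it twice and watch the answers disagree.

    import numpy as np

    rng = np.random.default_rng()

    def quantize(a, bits):
        # Uniform quantizer standing in for the DAC/ADC stages.
        scale = np.abs(a).max() / (2 ** (bits - 1) - 1)
        return np.round(a / scale) * scale

    def photonic_matmul(W, x, dac_bits=8, adc_bits=8, noise=2e-2):
        x_analog = quantize(x, dac_bits)     # D/A: bits -> light intensities
        y_analog = W @ x_analog              # the photonic multiply
        y_analog = y_analog + noise * rng.normal(size=y_analog.shape)  # analog drift
        return quantize(y_analog, adc_bits)  # A/D: back to bits so you can store it

    W = rng.normal(size=(64, 64)) / 8
    x = rng.normal(size=64)
    y1 = photonic_matmul(W, x)
    y2 = photonic_matmul(W, x)
    print(np.max(np.abs(y1 - y2)))  # nonzero: same input, slightly different answers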
ge96•5h ago
I used to have this weird obsession with doing this: buying old Chromebooks and putting Linux on them. With 4GB of RAM they were still useful, but I realize that nowadays 16GB seems to be the minimum RAM for "ideal" computing.
ghusto•5h ago
It's like saying "cars are already prohibitively expensive" whilst looking at Ferraris.
Animats•5h ago
[1] https://www.msn.com/en-in/money/news/china-s-first-gaming-gp...