The key to this efficiency lies in the mode of operation: conventional neural networks compute over every input at every step, whether or not anything has changed, whereas SNN neurons fire only when relevant events occur.
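To make the event-driven point concrete, here is a minimal Python sketch (my own illustration, not SpiNNcloud's code or any particular chip's behavior): a leaky integrate-and-fire neuron whose state is updated only when a presynaptic spike actually arrives, so sparse activity translates directly into fewer operations.

```python
import numpy as np

# Illustrative sketch only: an event-driven leaky integrate-and-fire neuron.
# Work is done only when an input spike arrives, unlike a dense layer that
# multiplies every input at every time step. Parameter values are arbitrary.

class LIFNeuron:
    def __init__(self, tau=20.0, threshold=1.0):
        self.tau = tau            # membrane time constant (ms)
        self.threshold = threshold
        self.v = 0.0              # membrane potential
        self.last_t = 0.0         # time of the last update (ms)

    def on_spike(self, t, weight):
        """Called only when a presynaptic spike occurs at time t (ms)."""
        # Decay the potential for the elapsed interval, then integrate
        # the incoming spike's weight.
        self.v *= np.exp(-(t - self.last_t) / self.tau)
        self.v += weight
        self.last_t = t
        if self.v >= self.threshold:   # fire and reset
            self.v = 0.0
            return True                # emit an output spike
        return False

# Between spikes the neuron does nothing, so sparse input means
# proportionally little computation.
neuron = LIFNeuron()
for t, w in [(5.0, 0.6), (8.0, 0.7), (40.0, 0.3)]:
    if neuron.on_spike(t, w):
        print(f"output spike at t={t} ms")
```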
Classic AI models, from ChatGPT to image generators, are "frozen" knowledge stores. If a company wants to adapt such a model to its own data, it has to retrain it, an energy- and time-intensive process. SNNs, by contrast, can adjust their weights on the fly without interrupting operation.
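To illustrate what "adjusting weights on the fly" can look like, here is a toy pair-based STDP (spike-timing-dependent plasticity) rule in Python. This is a common online learning rule for SNNs, not necessarily the one SpiNNcloud uses, and the constants are placeholders: each pre/post spike pair nudges the synaptic weight during normal operation, with no separate retraining pass.

```python
import numpy as np

# Illustrative sketch only: pair-based STDP applied as spikes stream in.
# a_plus, a_minus, and tau are placeholder values, not from the article.

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    dt = t_post - t_pre
    if dt > 0:     # pre before post: strengthen the synapse (potentiation)
        w += a_plus * np.exp(-dt / tau)
    else:          # post before or with pre: weaken it (depression)
        w -= a_minus * np.exp(dt / tau)
    return float(np.clip(w, 0.0, 1.0))

# The weight drifts toward the observed spike pattern while the network
# keeps running; there is no frozen/retrain cycle.
w = 0.5
for t_pre, t_post in [(10.0, 12.0), (30.0, 28.0), (50.0, 55.0)]:
    w = stdp_update(w, t_pre, t_post)
    print(f"weight -> {w:.3f}")
```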
Nvidia’s GPUs currently dominate the AI market much as Intel’s x86 processors once dominated the PC sector. Their architecture is optimized for raw parallel throughput, however, not for energy efficiency. Neuromorphic systems such as SpiNNcloud’s are not yet a substitute for these compute giants, but in certain domains they pose a strategic threat.