“The new device is built from arrays of resistive random-access memory (RRAM) cells… The team was able to combine the speed of analog computation with the accuracy normally associated with digital processing. Crucially, the chip was manufactured using a commercial production process, meaning it could potentially be mass-produced.”
Article is based on this paper: https://www.nature.com/articles/s41928-025-01477-0



This was bound to happen. Neural networks are inherently analog processes; simulating them digitally is massively expensive in terms of hardware and power.
The digital domain is good for exact computation; analog is better suited to the approximate computation that neural networks require.
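To make the exact-vs-approximate point concrete, here is a minimal sketch (my own illustration, not the chip's actual behavior) of an "analog" dot product where each stored weight is perturbed by a small device-variation error, compared against the exact digital result:

```python
import random

random.seed(0)  # deterministic for the example

def digital_dot(w, x):
    # exact digital multiply-accumulate
    return sum(wi * xi for wi, xi in zip(w, x))

def analog_dot(w, x, sigma=0.02):
    # hypothetical analog MAC: each "conductance" deviates ~2% from the
    # stored weight (an assumed device-variation model, not the paper's)
    return sum(wi * (1 + random.gauss(0, sigma)) * xi for wi, xi in zip(w, x))

w = [0.5, 1.2, 0.8, 0.1]
x = [1.0, 0.5, 0.3, 2.0]

exact = digital_dot(w, x)    # 1.54
approx = analog_dot(w, x)
err = abs(approx - exact) / abs(exact)
print(f"exact={exact:.4f} analog={approx:.4f} relative error={err:.3%}")
```

A couple of percent of noise per multiply barely moves the accumulated sum, which is why noisy analog hardware can be "good enough" for inference while being useless for exact arithmetic.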
That, and the way companies have been building AI: they've done very little to optimize compute, instead pushing to get the research out faster, because that's what's expected in this bubble. I fully expect future research to find plenty of ways to optimize these major models.
But also, R&D has been focused almost entirely on digital chips, so I would not be at all surprised if there were performance and/or efficiency gains to be had in certain workloads by shifting to analog circuits.
You might benefit from watching Hinton's lecture; much of it details the technical reasons why digital is much, much better than analog for intelligent systems.
BTW, that is the opposite of what he set out to prove. He says the facts forced him to change his mind.
https://m.youtube.com/watch?v=IkdziSLYzHw
Thank you for the link, it was very interesting.
Even though analogue neural networks have the drawback that you can't copy the neuron weights out (at least currently; the tech may evolve to allow it), they can still have use cases in lower-powered edge devices.
I think we’ll probably end up with hybrid designs, using digital for most parts except the calculations.
For current LLMs there would be a massive gain in energy efficiency if analogue computing were used. Much of the current energy cost comes from simulating what is effectively analogue processing on digital hardware. A lot is lost in that conversion, or "emulation", of the analogue process.
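As a back-of-envelope illustration of the potential gain, here is a sketch using order-of-magnitude per-operation energy figures. Both per-MAC numbers and the model size are my own assumptions for illustration, not measurements from the article or the Nature paper:

```python
# ILLUSTRATIVE assumptions (order of magnitude only):
DIGITAL_MAC_PJ = 0.2   # assumed energy for an 8-bit digital multiply-accumulate
ANALOG_MAC_PJ = 0.01   # assumed energy for an in-memory analog MAC

# Rough rule of thumb: ~2 MACs per parameter per generated token,
# for a hypothetical 7B-parameter model.
macs_per_token = 2 * 7e9

digital_j = macs_per_token * DIGITAL_MAC_PJ * 1e-12  # joules per token
analog_j = macs_per_token * ANALOG_MAC_PJ * 1e-12
print(f"digital: {digital_j:.4f} J/token, analog: {analog_j:.5f} J/token")
print(f"ratio: {digital_j / analog_j:.0f}x")
```

Under these assumed numbers the ratio is simply the per-MAC energy ratio (about 20x here); the real-world figure depends heavily on precision, process node, and data movement, which this sketch ignores.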
I wish researchers like Hinton would stick to discussing the tech. Anytime he says anything about linguistics or human intelligence he sounds like a CS major smugly raising his hand in Phil 101 to a symphony of collective groans.
Hinton is a good computer scientist (with an infinitesimally narrow field of expertise). But the guy is philosophically illiterate.
That’s a good point. The model weights could be voltage levels instead of digital representations. Lots of audio tech uses analog for better fidelity. I also read that there’s a startup using particle beams for lithography. Exciting times.
what audio tech uses analog for better fidelity?
Vinyl records, analog tube amplifiers, a good pair of speakers 🤌
Honestly though, digital compression is so good now that it probably sounds the same.
speakers are analog devices by nature.
The other two are used for the distortions they introduce, so quite literally lower fidelity. Whether some people like those distortions is irrelevant.
You want high fidelity: lossless digital audio formats.
Yeah, I get very good sound out of Class D amplifiers. They’re cheap, they’re energy efficient, and they usually pack in features for digital formats because that’s easy to do.
At least one Nobel Laureate has exactly the opposite opinion (see the Hinton lecture above)