“The new device is built from arrays of resistive random-access memory (RRAM) cells… The team was able to combine the speed of analog computation with the accuracy normally associated with digital processing. Crucially, the chip was manufactured using a commercial production process, meaning it could potentially be mass-produced.”
Article is based on this paper: https://www.nature.com/articles/s41928-025-01477-0
sounds like bullshit.
This is already a thing; there’s a US lab doing this.
(x) Doubt.
Same here. I’ll wait to see real-life calculations done by such circuits. They won’t be able to, e.g., do a simple float addition without losing/mangling a bunch of digits.
But maybe the analog precision is sufficient for AI, which is an imprecise matter from the start.
Wouldn’t analog be a lot more precise?
Accurate, though, that’s a different story…
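For what it’s worth, the usual way analog matrix hardware claims “digital” accuracy is iterative refinement: let the fast-but-noisy analog block produce a rough answer, then correct it with a cheap digital residual step. Here’s a minimal numpy sketch of the idea; the noise model and all the numbers are my own assumptions, not anything from the paper:

```python
# Rough sketch of mixed-precision iterative refinement, NOT the paper's
# actual method: the "analog" solver is modelled as an exact solve plus
# random per-element noise, which is purely my assumption.
import numpy as np

rng = np.random.default_rng(0)

def analog_solve(A, r, noise=1e-2):
    """Stand-in for a fast but imprecise analog solve of A x = r."""
    x = np.linalg.solve(A, r)
    return x * (1 + noise * rng.standard_normal(x.shape))

def refine(A, b, iters=20):
    """Recover near-digital accuracy by digitally correcting the analog result."""
    x = np.zeros_like(b)
    for _ in range(iters):
        r = b - A @ x               # residual, computed digitally
        x = x + analog_solve(A, r)  # cheap approximate correction
    return x

A = rng.standard_normal((16, 16)) + 16 * np.eye(16)  # well-conditioned 16x16 system
b = rng.standard_normal(16)
x = refine(A, b)
print("relative residual:", np.linalg.norm(A @ x - b) / np.linalg.norm(b))
```

The point is that a couple of digits of per-solve precision can still end up near float64 accuracy, as long as you’re willing to run a few correction passes.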
Ahh yeah, and we should 1. Believe this exists 2. Believe that China doesn’t think technology of this caliber is a matter of national security
For the love of Christ this thumbnail is triggering, lol
Why? It’s a standard socket in SMOBO design (sandwich motherboard).
Just push ever so slightly more when you hear the crunching sounds.
Then apply thermal paste generously
1000x!
Is this like medical articles about major cancer discoveries?
Yes, except the bullshit cancer discoveries are always in Israel, and the bullshit chip designs are in China.
1000x yes!
It uses 1% of the energy but is still 1000x faster than our current fastest cards? Yeah, I’m calling bullshit. It’s either a one-off, bullshit, or the next industrial revolution.
EDIT: Also, why do articles insist on using ##x less? You can just say it uses 1% of the energy. It’s so much easier to understand.
Coming from China, more like a one-off BS claim with nothing to back it up.
I mean it’s like the 10th time I’m reading about THE breakthrough in Chinese chip production on Lemmy, so let’s just say I’m not holding my breath LoL.
Yeah it’s like reading about North American battery science. Like yeah ok cool, see you in 30 years when you’re maybe production ready
https://www.nature.com/articles/s41928-025-01477-0
Here’s the paper published in Nature.
However, it’s worth noting that Nature has had to retract studies before:
https://en.wikipedia.org/wiki/Nature_(journal)#Retractions
From 2000 to 2001, a series of five fraudulent papers by Jan Hendrik Schön was published in Nature. The papers, about semiconductors, were revealed to contain falsified data and other scientific fraud. In 2003, Nature retracted the papers. The Schön scandal was not limited to Nature; other prominent journals, such as Science and Physical Review, also retracted papers by Schön.
Not saying that we shouldn’t trust anything published in scientific journals, but yes, we should wait until more studies that replicate these results exist before jumping to conclusions.
But it only does 16x16 matrix inversion.
Oh noes, how could that -possibly- scale?
To a billion-parameter matrix inverter? Probably not too hard, though maybe not at those speeds.
To a GPU, or even just the functions used in GenAI? We don’t even know if those are possible with analog computers to begin with.
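On scaling: a fixed 16×16 inversion primitive can, at least in principle, be composed into larger inverses with the blockwise (Schur-complement) formula; whether the paper’s hardware actually does that, I don’t know. A toy numpy sketch, using np.linalg.inv as a stand-in for the 16×16 analog block:

```python
# Toy sketch: invert a larger matrix using only a fixed-size inversion
# primitive, via the 2x2-block / Schur-complement formula. Whether the chip
# composes its 16x16 core this way is my assumption, not the paper's claim.
import numpy as np

BLOCK = 16  # size of the "hardware" inversion primitive

def primitive_inv(M):
    """Stand-in for the analog 16x16 inverter."""
    assert M.shape[0] <= BLOCK
    return np.linalg.inv(M)

def block_inv(M):
    n = M.shape[0]
    if n <= BLOCK:
        return primitive_inv(M)
    k = n // 2
    A, B = M[:k, :k], M[:k, k:]
    C, D = M[k:, :k], M[k:, k:]
    Ai = block_inv(A)                  # recurse on the top-left block
    S = D - C @ Ai @ B                 # Schur complement of A
    Si = block_inv(S)
    top_left = Ai + Ai @ B @ Si @ C @ Ai
    top_right = -Ai @ B @ Si
    bottom_left = -Si @ C @ Ai
    return np.block([[top_left, top_right], [bottom_left, Si]])

rng = np.random.default_rng(1)
n = 128
M = rng.standard_normal((n, n)) + n * np.eye(n)  # keep it well-conditioned
print("inverse error:", np.linalg.norm(block_inv(M) @ M - np.eye(n)))
```

The hard part isn’t the algebra, it’s keeping analog errors from compounding across the levels of recursion, which is presumably where the digital correction passes come in.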
It’s a weird damn lie if it is.
And the death of the American economy if it isn’t, fingers crossed.
As someone with a 401k I really hope it isn’t.
The economy crashing won’t hurt billionaires but will kill the middle class.
If anything the economy crashing will allow the 0.1% to buy up anything they haven’t gotten already.
And now you see why they want to crash the economy.
What middle class? 🤔
The one so worried about their 401Ks they won’t risk the ire of the rich.
This seems like promising technology, but the figures they are providing are almost certainly fiction.
This has all the hallmarks of a team of researchers looking to score an R&D budget.
This was bound to happen. Neural networks are inherently analog processes; simulating them digitally is massively expensive in terms of hardware and power.
The digital domain is good for exact computation; analog is better for approximate computation, which is all neural networks need.
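That intuition is easy to check in software: inject, say, 1% multiplicative noise into the weights of a small network and see how much the outputs move. A rough numpy sketch; the 1% figure and the toy network are my own assumptions, not numbers from the article:

```python
# Rough sketch: how much does a toy network's output move if every weight
# gets ~1% multiplicative noise, a crude stand-in for analog error?
# The noise level and the network itself are assumptions, not from the article.
import numpy as np

rng = np.random.default_rng(42)

def mlp(x, weights):
    h = x
    for W in weights[:-1]:
        h = np.maximum(0.0, h @ W)   # ReLU hidden layers
    return h @ weights[-1]

dims = [64, 256, 256, 10]
weights = [rng.standard_normal((a, b)) / np.sqrt(a) for a, b in zip(dims, dims[1:])]
x = rng.standard_normal((100, dims[0]))

clean = mlp(x, weights)
noisy = mlp(x, [W * (1 + 0.01 * rng.standard_normal(W.shape)) for W in weights])

drift = np.linalg.norm(noisy - clean) / np.linalg.norm(clean)
print(f"relative output drift at 1% weight noise: {drift:.3%}")
# Typically on the order of a percent or two: rough analog-style arithmetic
# is tolerable here in a way it never would be for exact float math.
```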
You might benefit from watching Hinton’s lecture; much of it details the technical reasons why digital is much, much better than analog for intelligent systems.
BTW, that is the opposite of what he set out to prove. He says the facts forced him to change his mind.
“much of it details the technical reasons why digital is much, much better than analog for intelligent systems”
For current LLMs there would be a massive gain in energy efficiency if analogue computing were used. Much of the current energy cost comes from simulating what is effectively analogue processing on digital hardware. A lot is lost in the conversion, or “emulation”, of analogue.
I wish researchers like Hinton would stick to discussing the tech. Anytime he says anything about linguistics or human intelligence he sounds like a CS major smugly raising his hand in Phil 101 to a symphony of collective groans.
Hinton is a good computer scientist (with an infinitesimally narrow field of expertise). But the guy is philosophically illiterate.
That’s a good point. The model weights could be voltage levels instead of digital representations. Lots of audio tech uses analog for better fidelity. I also read that there’s a startup using particle beams for lithography. Exciting times.
What audio tech uses analog for better fidelity?
Vinyl records, analog tube amplifiers, a good pair of speakers 🤌
Honestly though digital compression now is so good it probably sounds the same.
At least one Nobel Laureate has exactly the opposite opinion (see the Hinton lecture above)
This is not a new line of research, in the sense that this is not the only place looking into mixed analog/digital computers. There have been articles on it for at least a year, I think, and when digital was taking over there was a lot of discussion about it being inferior to analog, so I bet the idea of combining the two has been thrown around since digital became a thing.
Who is China? Why is it so smart?
Edit: I removed a ChatGPT-generated summary; I had thought it could be useful.
Anyway, just have a good day.
This comment violates rule 8 of the community. Please get your AI generated garbage out of here.
In that case I’m editing it. I’m sorry for my mistake, I thought it would be useful to a point. That’s why I said it was AI.
I appreciate that you wanted to help people even if it didn’t land how you intended. :)
No one is reading that.
That’s fine. Just have a good day :)
It was a decent summary; I was replying when you pulled it. Analog has its strengths (the first computers were analog, but electronics was much cruder 70 years ago) and it’s definitely a better fit for neural nets. Bound to happen.
Nice thorough commentary. The LiveScience article did a better job of describing it for people with no background in this stuff.
The original computers were analog. They were fast, but electronics was -so crude- at the time, it had to evolve a lot … and has in the last half-century.