“The new device is built from arrays of resistive random-access memory (RRAM) cells… The team was able to combine the speed of analog computation with the accuracy normally associated with digital processing. Crucially, the chip was manufactured using a commercial production process, meaning it could potentially be mass-produced.”

The article is based on this paper: https://www.nature.com/articles/s41928-025-01477-0

  • NutWrench@lemmy.world · ↑26 ↓3 · 2 hours ago

    Look, it’s one of those articles again. The bi-monthly “China invents earth-shattering technology breakthrough that we never hear about again.”

    “1000x faster?” Learn to lie better. Real technological improvements are almost always incremental, like “10-20% faster, bigger, stronger.” Not 1000 freaking times faster. You lie like a child. Or like Trump.

    • notarobot@lemmy.zip · ↑5 ↓2 · 2 hours ago

      It can be 1000x faster because it’s analog. Analog circuits take very little time to compute things. We don’t generally use them because it’s very hard to get the same result twice, and updating them is also hard.

    • Treczoks@lemmy.world · ↑20 ↓1 · 11 hours ago

      Same here. I’m waiting to see real-life calculations done by such circuits. They won’t be able to do, e.g., a simple float addition without losing/mangling a bunch of digits.

      But maybe the analog precision is sufficient for AI, which is an imprecise matter from the start.

        • Treczoks@lemmy.world · ↑7 ↓1 · 4 hours ago

          No, it wouldn’t, because you cannot make it reproducible at that scale.

          Normal analog hardware, e.g. audio, tops out at about 16 bits of precision. If you go individually tuned, high-end, and expensive (studio equipment), you get maybe 24 bits. That is eons from the 52-bit mantissa precision of a double float.
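
          As a rough check on those numbers, here is a back-of-envelope sketch in Python using the standard ~6.02 dB-per-bit rule of thumb (the bit depths are the ones from this comment, nothing from the chip itself):

          ```python
          import math

          # Dynamic range of an ideal N-bit representation: 20*log10(2**N) ≈ 6.02*N dB
          def dynamic_range_db(bits: int) -> float:
              return 20 * math.log10(2 ** bits)

          for label, bits in [("consumer audio analog", 16),
                              ("high-end studio analog", 24),
                              ("double-float mantissa", 52)]:
              print(f"{label}: {bits} bits ≈ {dynamic_range_db(bits):.0f} dB")

          # consumer audio analog: 16 bits ≈ 96 dB
          # high-end studio analog: 24 bits ≈ 144 dB
          # double-float mantissa: 52 bits ≈ 313 dB
          ```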

        • Limonene@lemmy.world · ↑8 ↓1 · 5 hours ago

          The maximum theoretical precision of an analog computer is limited by the charge of an electron, about 10^-19 coulombs. A normal analog computer runs at a few milliamps, for a second at most. So the maximum theoretical precision is about 10^16 steps, or 53 bits, which is the same as the 53-bit significand of a double-precision (64-bit) float. I believe 80-bit floats are standard in desktop computers.

          In practice, just getting a good 24-bit ADC is expensive, and 12-bit or 16-bit ADCs are way more common. Analog computers aren’t solving anything that can’t be done faster by digitally simulating an analog computer.
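
          The electron-counting bound above works out as follows (just the comment’s own arithmetic restated in Python; the milliamp-for-a-second figures are the commenter’s assumptions):

          ```python
          import math

          ELECTRON_CHARGE = 1.602e-19  # coulombs

          # A few milliamps flowing for about one second moves ~1e-3 coulombs.
          total_charge = 1e-3 * 1.0  # amps * seconds

          # The number of discrete electrons in that charge bounds the resolution.
          n_electrons = total_charge / ELECTRON_CHARGE
          print(f"{n_electrons:.2e} electrons")         # ~6.24e+15
          print(f"~{math.log2(n_electrons):.1f} bits")  # ~52.5 bits, double-float territory
          ```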

            • turmacar@lemmy.world · ↑3 · 3 hours ago

              Every operation your computer does, from displaying images on a screen to securely connecting to your bank.

              It’s an interesting advancement, and it will be neat if something comes of it down the line. The chances of it yielding a meaningful product in the next decade are close to zero.

            • Limonene@lemmy.world · ↑3 ↓1 · 3 hours ago

              They used to use analog computers to solve differential equations, back when every transistor was expensive (relays and tubes even more so) and clock rates were measured in kilohertz. There’s no practical purpose for them now.

              For number theory and RSA cryptography, you need even more precision: multiple machine integers are combined to get 4096-bit precision (see the sketch below).

              If you’re asking about the 24-bit ADC, I think that’s usually used for high-end audio recording.
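
              A minimal illustration of that multi-word precision, using Python’s built-in arbitrary-precision integers (toy values only; a real RSA modulus is the product of two primes, which this is not):

              ```python
              import secrets

              # Python ints are arbitrary precision: internally they are arrays of
              # fixed-size machine words combined, as described above.
              n = secrets.randbits(4096) | (1 << 4095) | 1  # toy odd 4096-bit modulus
              m = secrets.randbits(256)                     # toy "message"
              e = 65537                                     # a common RSA public exponent

              c = pow(m, e, n)  # modular exponentiation carried out at 4096-bit precision
              print(n.bit_length())          # 4096
              print(c.bit_length() <= 4096)  # True
              ```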

  • AItoothbrush@lemmy.zip · ↑15 · 10 hours ago

    Ah yeah, and we should 1. believe this exists, and 2. believe that China doesn’t consider technology of this caliber a matter of national security.

  • Alexstarfire@lemmy.world · ↑218 ↓5 · edited · 21 hours ago

    It uses 1% of the energy but is still 1000x faster than our current fastest cards? Yeah, I’m calling bullshit. It’s either a one-off, bullshit, or the next industrial revolution.

    EDIT: Also, why do articles insist on saying “##x less”? You can just say it uses 1% of the energy. It’s so much easier to understand.
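
    (The conversion being asked for is a one-liner; the 100x here is just the headline claim restated:)

    ```python
    def times_less_to_percent(factor: float) -> float:
        """'100x less energy' means 1/factor of the original, i.e. 1%."""
        return 100.0 / factor

    print(times_less_to_percent(100))  # 1.0 (% of the original energy)
    ```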

    • Cethin@lemmy.zip · ↑3 · 3 hours ago

      I would imagine there’s a kernel of truth to it. It’s probably correct, but for one rarely used operation, or something like that. It’s not a total revolution; it’s something that could be included to speed up a very particular task. It’s like how GPUs are much better at matrix math than the CPU, so we often have one in addition to the CPU, which can handle all tasks but isn’t as fast at those particular ones.

    • CosmoNova@lemmy.world · ↑60 ↓1 · 17 hours ago

      I mean, it’s like the 10th time I’m reading about THE breakthrough in Chinese chip production on Lemmy, so let’s just say I’m not holding my breath lol.

      • 4am@lemmy.zip · ↑23 ↓1 · 16 hours ago

        Yeah, it’s like reading about North American battery science. Like, yeah, OK, cool, see you in 30 years when you’re maybe production-ready.

    • Snot Flickerman@lemmy.blahaj.zone · ↑26 ↓3 · edited · 17 hours ago

      https://www.nature.com/articles/s41928-025-01477-0

      Here’s the paper published in Nature.

      However, it’s worth noting that Nature has had to retract studies before:

      https://en.wikipedia.org/wiki/Nature_(journal)#Retractions

      From 2000 to 2001, a series of five fraudulent papers by Jan Hendrik Schön was published in Nature. The papers, about semiconductors, were revealed to contain falsified data and other scientific fraud. In 2003, Nature retracted the papers. The Schön scandal was not limited to Nature; other prominent journals, such as Science and Physical Review, also retracted papers by Schön.

      Not saying that we shouldn’t trust anything published in scientific journals, but yes, we should wait for more studies that replicate these results before jumping to conclusions.

      • turdcollector69@lemmy.world · ↑27 ↓1 · 19 hours ago

        As someone with a 401k, I really hope it isn’t.

        The economy crashing won’t hurt billionaires but will kill the middle class.

        If anything, the economy crashing will allow the 0.1% to buy up anything they haven’t gotten already.

        • BreakerSwitch@lemmy.world · ↑2 · 3 hours ago

          Yeah, this is literally what happened in 2008. Economic instability stopped banks from lending to would-be individual home buyers, but corpos eagerly bought up everything they could at a 20% price cut.

          • MajorasTerribleFate@lemmy.zip · ↑2 · 2 hours ago

            Economic instability is generally better for the people who can weather the storm, i.e. those with resources to spare, because (as you say) they can buy assets on the cheap when the less fortunate run out of cash to survive on and have to liquidate.

            It’s long periods of stability that seem to let the lower classes build up a little. Yet another reason why war and strife benefit the rich.

          • Zorque@lemmy.world · ↑12 ↓1 · 15 hours ago

            The one so worried about their 401(k) that they won’t risk the ire of the rich.

  • Godort@lemmy.ca · ↑57 · 20 hours ago

    This seems like promising technology, but the figures they are providing are almost certainly fiction.

    This has all the hallmarks of a team of researchers looking to score an R&D budget.

  • Quazatron@lemmy.world · ↑62 ↓3 · 20 hours ago

    This was bound to happen. Neural networks are inherently analog processes; simulating them digitally is massively expensive in terms of hardware and power.

    The digital domain is good for exact computation; analog is better for the approximate computation that neural networks require.
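
    The general idea can be sketched numerically. This is a toy simulation of a resistive crossbar, not the paper’s actual design: weights are stored as conductances, Ohm’s law does the multiplies, Kirchhoff’s current law does the sums, and a made-up 2% device variation shows why the results aren’t bit-reproducible:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Weights as a conductance matrix (siemens); inputs as voltages (volts).
    G = rng.uniform(1e-6, 1e-4, size=(4, 8))  # 4 output rows, 8 input columns
    v = rng.uniform(0.0, 0.2, size=8)

    exact = G @ v  # what an exact digital multiply-accumulate would give

    def analog_mvm(G, v, device_noise=0.02):
        """One analog read: each cell's conductance drifts slightly, so
        repeated evaluations of the same product differ a little."""
        G_noisy = G * (1 + device_noise * rng.standard_normal(G.shape))
        return G_noisy @ v  # per-row current sums (Kirchhoff's current law)

    print(exact)
    print(analog_mvm(G, v))  # close to exact, but not bit-identical run to run
    ```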

    • Trainguyrom@reddthat.com · ↑4 · 5 hours ago

      That, and the way companies have been building AI, they’ve done very little to optimize compute, instead trying to get the research out faster, because that’s what’s expected in this bubble. I absolutely expect future research to find plenty of ways to optimize these major models.

      But also, R&D has been entirely focused on digital chips, so I would not be at all surprised if there were performance and/or efficiency gains to be had in certain workloads by shifting to analog circuits.

      • Quazatron@lemmy.world · ↑1 · 5 hours ago

        Thank you for the link; it was very interesting.

        Even though analogue neural networks have the drawback that you can’t copy the neuron weights (currently; the tech may evolve to allow it), they can still have use cases in lower-powered edge devices.

        I think we’ll probably end up with hybrid designs, using digital for most parts except the calculations.

      • CeeBee_Eh@lemmy.world · ↑5 ↓1 · 14 hours ago

        “much of it details technical reasons why digital is much much better than analog for intelligent systems”

        For current LLMs there would be a massive gain in energy efficiency if analogue computing were used. Much of the current energy cost comes from simulating what is effectively analogue processing on digital hardware. There’s a lot lost in the conversion, or “emulation”, of analogue.

      • yeahiknow3@lemmy.dbzer0.com · ↑4 ↓4 · edited · 15 hours ago

        I wish researchers like Hinton would stick to discussing the tech. Anytime he says anything about linguistics or human intelligence he sounds like a CS major smugly raising his hand in Phil 101 to a symphony of collective groans.

        Hinton is a good computer scientist (with an infinitesimally narrow field of expertise). But the guy is philosophically illiterate.

    • bulwark@lemmy.world · ↑16 · 19 hours ago

      That’s a good point. The model weights could be voltage levels instead of digital representations. Lots of audio tech uses analog for better fidelity. I also read that there’s a startup using particle beams for lithography. Exciting times.

        • bulwark@lemmy.world · ↑4 ↓1 · edited · 13 hours ago

          Vinyl records, analog tube amplifiers, a good pair of speakers 🤌

          Honestly, though, digital compression now is so good it probably sounds the same.

          • vrighter@discuss.tchncs.de · ↑5 · 4 hours ago

            Speakers are analog devices by nature.

            The other two are used for the distortions they introduce, so quite literally lower fidelity. Whether some people like those distortions is irrelevant.

            If you want high fidelity: lossless digital audio formats.

            • aesthelete@lemmy.world · ↑1 · edited · 26 minutes ago

              Yeah, I get very good sound out of Class D amplifiers. They’re cheap, they’re energy-efficient, and they usually pack in features for digital formats because it’s easy to do.

  • HubertManne@piefed.social · ↑11 · 20 hours ago

    This is not a new line of research, in the sense that this is not the only place looking into mixed analog/digital computers. There have been articles on it for at least a year, I think, and when digital was taking over there was a lot of discussion about it being inferior to analog, so I bet the idea of combining the two has been thrown around since digital became a thing.

  • Melobol@lemmy.ml · ↑12 ↓37 · edited · 20 hours ago

    Edit: I removed a ChatGPT-generated summary that I had thought could be useful.
    Anyway, just have a good day.

    • Trainguyrom@reddthat.com · ↑2 · edited · 5 hours ago

      The article is like 5 paragraphs, not even a single sheet of paper if printed (with the unneeded images and ads excluded, of course). Why does it need a summary‽

      • Melobol@lemmy.ml · ↑1 · 1 hour ago

        The summary was for the paper the article was based on. It also put the content in easier-to-understand language.

      • Melobol@lemmy.ml · ↑15 ↓2 · 20 hours ago

        In that case, I’m editing it. I’m sorry for my mistake; I thought it would be useful to a point. That’s why I said it was AI.

    • bcovertigo@lemmy.world · ↑15 ↓2 · 18 hours ago

      I appreciate that you wanted to help people even if it didn’t land how you intended. :)

    • kalkulat@lemmy.worldOP · ↑8 ↓1 · 18 hours ago

      It was a decent summary; I was replying when you pulled it. Analog has its strengths (the first computers were analog, but electronics was much cruder 70 years ago), and it is definitely a better fit for neural nets. Bound to happen.

    • kalkulat@lemmy.worldOP · ↑5 ↓1 · 20 hours ago

      Nice thorough commentary. The LiveScience article did a better job of describing it for people with no background in this stuff.

      The original computers were analog. They were fast, but electronics was so crude at the time that it had to evolve a lot … and it has, over the last half-century.