“The new device is built from arrays of resistive random-access memory (RRAM) cells… The team was able to combine the speed of analog computation with the accuracy normally associated with digital processing. Crucially, the chip was manufactured using a commercial production process, meaning it could potentially be mass-produced.”

Article is based on this paper: https://www.nature.com/articles/s41928-025-01477-0
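
The gist of how an RRAM array computes, speaking generally about resistive crossbars rather than the paper’s specific circuit: weights are stored as cell conductances, inputs are applied as voltages, and Ohm’s law plus Kirchhoff’s current law make every output line sum its products in parallel. A minimal numerical sketch (the conductance and voltage ranges below are made up for illustration):

```python
# General crossbar idea, not the paper's circuit: weights live as conductances G,
# inputs arrive as voltages V, and each row wire sums its currents, so I = G @ V
# falls out of Ohm's law and Kirchhoff's current law.
import numpy as np

rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 8))  # cell conductances in siemens (assumed range)
V = rng.uniform(0.0, 0.2, size=8)         # input voltages in volts (assumed range)

I = G @ V  # output currents: the whole matrix-vector product in one parallel step

print(I)                        # what the ADCs would read out on a real chip
print(G.shape[0] * G.shape[1])  # the 32 multiply-accumulates a digital chip would do instead
```

The whole product settles in one analog step instead of rows × cols digital multiply-accumulates, which is where claims about speed and energy generally come from.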

    • Treczoks@lemmy.world · +11 · 5 hours ago

      Same here. I’m waiting to see real-life calculations done by such circuits. They won’t be able to do, e.g., a simple float addition without losing or mangling a bunch of digits (toy sketch below).

      But maybe the analog precision is sufficient for AI, which is an imprecise matter from the start.
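
      A toy illustration of the precision point, assuming an analog stage with roughly 8 effective bits and a little noise (both numbers are guesses for illustration, not figures from the paper):

      ```python
      # Add two numbers through a quantized, slightly noisy "analog" stage
      # (~8 effective bits assumed) and compare with plain float64.
      import numpy as np

      rng = np.random.default_rng(1)

      def analog(x, bits=8, full_scale=1.0, noise=1e-3):
          step = 2 * full_scale / (2 ** bits)            # quantization step of the stage
          return np.round(x / step) * step + rng.normal(0.0, noise)

      a, b = 0.123456789, 0.987654321
      print(f"{a + b:.12f}")                             # float64 keeps ~15-16 significant digits
      print(f"{analog(analog(a) + analog(b)):.12f}")     # only a couple of decimal places survive
      ```

      A couple of good decimal places is hopeless for general float math, but it is roughly the territory where 8-bit quantized neural-net inference already lives.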

  • AItoothbrush@lemmy.zip · +9 · 4 hours ago

    Ahh yeah, and we should 1. believe this exists, and 2. believe that China doesn’t consider technology of this caliber a matter of national security

  • Alexstarfire@lemmy.world · +181/−5 · edited · 15 hours ago

    It uses 1% of the energy but is still 1000x faster than our current fastest cards? Yeah, I’m calling bullshit. It’s either a one-off, bullshit, or the next industrial revolution.

    EDIT: Also, why do articles insist on using ##x less? You can just say it uses 1% of the energy. It’s so much easier to understand.

    • CosmoNova@lemmy.world · +47/−1 · 12 hours ago

      I mean it’s like the 10th time I’m reading about THE breakthrough in Chinese chip production on Lemmy, so let’s just say I’m not holding my breath, LoL.

      • 4am@lemmy.zip · +21/−1 · 10 hours ago

        Yeah, it’s like reading about North American battery science. Like yeah ok cool, see you in 30 years when you’re maybe production-ready.

    • Snot Flickerman@lemmy.blahaj.zone · +17/−3 · edited · 12 hours ago

      https://www.nature.com/articles/s41928-025-01477-0

      Here’s the paper published in Nature.

      However, it’s worth noting that Nature has had to retract studies before:

      https://en.wikipedia.org/wiki/Nature_(journal)#Retractions

      From 2000 to 2001, a series of five fraudulent papers by Jan Hendrik Schön was published in Nature. The papers, about semiconductors, were revealed to contain falsified data and other scientific fraud. In 2003, Nature retracted the papers. The Schön scandal was not limited to Nature; other prominent journals, such as Science and Physical Review, also retracted papers by Schön.

      Not saying that we shouldn’t trust anything published in scientific journals, but yes, we should wait until more studies that replicate these results exist before jumping to conclusions.

        • TheBlackLounge@lemmy.zip · +2 · 3 hours ago

          To a billion-parameter matrix inverter? Probably not too hard, maybe not at those speeds.

          To a GPU, or even just the functions used in GenAI? We don’t even know if those are possible with analog computers to begin with.
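
          For the matrix-inverter case there is at least one standard numerical trick for getting digital-grade answers out of a sloppy analog solver: iterative refinement, where the analog block only ever solves for a correction and a cheap digital matrix-vector product measures the remaining error. Whether that is what the paper actually does isn’t settled here; the sketch below just shows the trick itself, with a quantized np.linalg.solve standing in for the analog block:

          ```python
          # Iterative refinement: low-precision "analog" solves plus digital residuals.
          import numpy as np

          rng = np.random.default_rng(2)
          n = 64
          A = np.eye(n) + 0.1 * rng.standard_normal((n, n))  # well-conditioned test matrix
          b = rng.standard_normal(n)

          def analog_solve(A, r, bits=6):
              # stand-in for a low-precision analog matrix-inversion block
              x = np.linalg.solve(A, r)
              step = 2 * (np.max(np.abs(x)) + 1e-30) / (2 ** bits)
              return np.round(x / step) * step       # answer quantized to ~6 bits

          x = analog_solve(A, b)
          for _ in range(10):
              r = b - A @ x               # residual, computed digitally (one cheap MVM)
              x = x + analog_solve(A, r)  # analog block only solves for the correction

          print(np.max(np.abs(b - A @ x)))  # shrinks toward float64 accuracy
          ```

          Each pass cuts the residual by roughly the analog block’s relative precision, so a handful of iterations recovers near-float64 accuracy on a well-conditioned matrix.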

  • Godort@lemmy.ca · +50 · 14 hours ago

    This seems like promising technology, but the figures they are providing are almost certainly fiction.

    This has all the hallmarks of a team of researchers looking to score an R&D budget.

  • Quazatron@lemmy.world · +55/−2 · 14 hours ago

    This was bound to happen. Neural networks are inherently analog processes; simulating them digitally is massively expensive in terms of hardware and power.

    The digital domain is good for exact computation; analog is better for the approximate computation that neural networks need (toy check below).
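
    A quick toy check of that claim: perturb a small random network’s weights by about 1% (a crude stand-in for analog error; the network and the 1% figure are both made up) and count how often the predicted class actually changes.

    ```python
    # How often does ~1% weight noise flip the argmax of a tiny random MLP?
    import numpy as np

    rng = np.random.default_rng(3)
    W1, W2 = rng.standard_normal((32, 16)), rng.standard_normal((10, 32))

    def predict(x, W1, W2):
        h = np.maximum(W1 @ x, 0.0)     # ReLU hidden layer
        return np.argmax(W2 @ h)        # predicted class

    flips, trials = 0, 1000
    for _ in range(trials):
        x = rng.standard_normal(16)
        nW1 = W1 * (1 + 0.01 * rng.standard_normal(W1.shape))
        nW2 = W2 * (1 + 0.01 * rng.standard_normal(W2.shape))
        flips += predict(x, W1, W2) != predict(x, nW1, nW2)

    print(flips / trials)  # usually only a small fraction of predictions flip
    ```

    Exact float arithmetic cares about every digit; an argmax over logits mostly doesn’t.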

      • CeeBee_Eh@lemmy.world · +5/−1 · 9 hours ago

        much of it details technical reasons why digital is much much better than analog for intelligent systems

        For current LLMs there would be a massive gain in energy efficiency if analogue computing were used. Much of the current energy cost comes from simulating what is effectively analogue processing on digital hardware. There’s a lot lost in the conversion, or “emulation”, of analogue.
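
        A back-of-envelope way to see where that overhead sits, counting operations only (no energy figures, an idealized crossbar, and a 4096×4096 layer chosen purely as an example):

        ```python
        # Operation counts for one matrix-vector product of an n x m layer:
        # conventional digital pipeline vs an idealized in-memory analog crossbar.
        n = m = 4096

        digital = {
            "multiply-accumulates": n * m,                # ~16.8 million
            "weight values fetched from memory": n * m,   # the costly data movement
        }
        analog = {
            "DAC conversions (inputs in)": m,             # 4096
            "ADC conversions (outputs out)": n,           # 4096
            "weight values fetched from memory": 0,       # weights sit in the RRAM cells
        }

        print(digital)
        print(analog)
        ```

        The digital side pays for millions of multiply-accumulates and, worse, millions of weight fetches; an in-memory analog array concentrates its cost in the DAC/ADC conversions at its edges.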

      • yeahiknow3@lemmy.dbzer0.com · +4/−3 · edited · 9 hours ago

        I wish researchers like Hinton would stick to discussing the tech. Anytime he says anything about linguistics or human intelligence he sounds like a CS major smugly raising his hand in Phil 101 to a symphony of collective groans.

        Hinton is a good computer scientist (with an infinitesimally narrow field of expertise). But the guy is philosophically illiterate.

    • bulwark@lemmy.world · +16 · 14 hours ago

      That’s a good point. The model weights could be voltage levels instead of digital representations. Lots of audio tech uses analog for better fidelity. I also read that there’s a startup using particle beams for lithography. Exciting times.
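
      One caveat on that: in RRAM crossbars the weights are usually stored as cell conductances (the voltages are the inputs), and each cell offers only a limited number of programmable levels. A rough look at what squeezing float weights into, say, 16 levels does to them (the 16 is an assumption for illustration, not a figure from the paper):

      ```python
      # Snap a batch of float weights onto 16 evenly spaced "cell" levels.
      import numpy as np

      rng = np.random.default_rng(4)
      w = rng.standard_normal(10_000) * 0.05      # stand-in for small NN weights

      levels = np.linspace(w.min(), w.max(), 16)  # 16 programmable states per cell (assumed)
      w_cell = levels[np.abs(w[:, None] - levels).argmin(axis=1)]

      print(np.mean(np.abs(w - w_cell)))   # average shift per stored weight
      print(np.max(np.abs(w - w_cell)))    # worst case: about half a level spacing
      ```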

        • bulwark@lemmy.world · +3/−1 · edited · 8 hours ago

          Vinyl records, analog tube amplifiers, a good pair of speakers 🤌

          Honestly though digital compression now is so good it probably sounds the same.

  • HubertManne@piefed.social · +10 · 14 hours ago

    This is not a new line of research, in the sense that this isn’t the only place looking into mixed analog/digital computers. There have been articles on it for at least a year, I think, and back when digital was taking over there was a lot of discussion about it being inferior to analog, so I bet the idea of combining the two has been kicked around ever since digital became a thing.

  • Melobol@lemmy.ml · +12/−33 · edited · 14 hours ago

    Edit: I removed a ChatGPT-generated summary that I thought could have been useful.
    Anyway, just have a good day.

      • Melobol@lemmy.ml · +13/−2 · 14 hours ago

        In that case I’m editing it. I’m sorry for my mistake; I thought it would be useful to a point. That’s why I said it was AI.

    • bcovertigo@lemmy.world · +14/−2 · 13 hours ago

      I appreciate that you wanted to help people even if it didn’t land how you intended. :)

    • kalkulat@lemmy.world (OP) · +8/−1 · 13 hours ago

      It was a decent summary; I was replying when you pulled it. Analog has its strengths (the first computers were analog, but electronics was much cruder 70 years ago) and it’s definitely a better fit for neural nets. Bound to happen.

    • kalkulat@lemmy.world (OP) · +5/−1 · 14 hours ago

      Nice thorough commentary. The LiveScience article did a better job of describing it for people with no background in this stuff.

      The original computers were analog. They were fast, but electronics was -so crude- at the time that it had to evolve a lot … and it has, over the last half-century.