
Researchers have developed a new kind of nanoelectronic device that could dramatically cut the energy consumed by artificial intelligence hardware by mimicking the human brain.

The researchers, led by the University of Cambridge, developed a form of hafnium oxide that acts as a highly stable, low‑energy ‘memristor’ — a component designed to mimic the efficient way neurons are connected in the brain.

  • Zak@lemmy.world · 2 days ago

    could dramatically cut the energy consumed by artificial intelligence hardware

    Decreasing the cost of using a resource almost always results in more use of that resource.

    Laboratory tests showed the devices could reliably endure tens of thousands of switching cycles

    That’s not very many when GPUs perform trillions of operations per second.
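The endurance mismatch is easy to quantify. A rough back-of-envelope sketch, with the write rate (1 MHz) chosen purely for illustration, shows why tens of thousands of cycles is a problem if the cells are rewritten like working memory, but not if they are written once:

```python
# Back-of-envelope endurance estimate (illustrative numbers only).
# Assumption: a memristor cell endures ~1e4 switching (write) cycles,
# per the "tens of thousands" figure quoted above.
endurance_cycles = 1e4

# Hypothetical write rate if the cell were updated continuously
# during computation (an assumed figure, not from the article):
write_rate_hz = 1e6  # 1 MHz

lifetime_s = endurance_cycles / write_rate_hz
print(f"Lifetime at 1 MHz writes: {lifetime_s:.3f} s")  # prints 0.010 s

# By contrast, if weights are written once and only read thereafter,
# reads consume no switching cycles, so endurance is far less of an issue.
```

Under those assumptions the device would wear out in a hundredth of a second if switched constantly, which is the crux of the objection.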

    • ryannathans@aussie.zone · 2 days ago

      It’d probably be far more appropriate for an analogue system where it isn’t being constantly switched, but instead has the model burned onto it
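      The "burned on" idea above can be sketched in a few lines: weights are stored once as fixed conductances in a crossbar, and inference is an analogue matrix-vector multiply via Ohm's and Kirchhoff's laws, i.e. pure reads with no switching. All names and values below are illustrative assumptions, not the article's design:

      ```python
      import numpy as np

      rng = np.random.default_rng(0)
      # Conductances (siemens), written once when the model is "burned on".
      G = rng.uniform(1e-6, 1e-4, size=(4, 3))  # 4 input rows x 3 output columns

      def crossbar_mvm(G, v):
          """Read-only inference: voltages v drive the rows; each column's
          output current is I_j = sum_i G[i, j] * v[i], i.e. G^T @ v."""
          return G.T @ v

      v = np.array([0.1, 0.2, 0.0, 0.3])   # input voltages (volts)
      currents = crossbar_mvm(G, v)         # no switching cycles consumed
      ```

      Because the multiply-accumulate happens in the analogue domain, the endurance limit only applies when the model itself is rewritten.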

      • very_well_lost@lemmy.world · 2 days ago

        This seems like such a glaringly obvious solution to lowering inference cost that surely there must be some fundamental flaw in it… otherwise all of the big AI firms would be doing it, right?

        Right…?

        • ryannathans@aussie.zone · 2 days ago

          It takes a while for new technology to become available in ASICs; we still don’t have purpose-designed silicon for AI. Pretty much all AI workloads still run on repurposed GPUs with scaled-up tensor cores