• mustlane@lemmy.zip · 10 hours ago (edited)

    > With AI the GPUs have no value after 6 years

    What? GPUs don’t age. They might become technologically outdated, but they don’t just… die. The silicon chip itself doesn’t care about age.

    • elgordino@fedia.io · 8 hours ago

      It’s not that they don’t technically work. It’s just that they’re no longer efficient compared to newer versions that can do more with less power. So to remain competitive you need to upgrade; otherwise your cost to execute a model is too high.
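      A toy example of that trade-off (a sketch; the specs and electricity price are invented for illustration, not real figures):

      ```python
      # Energy cost per unit of compute for an older vs. a newer GPU.
      # All specs and the electricity price are made-up example values.
      ELECTRICITY_USD_PER_KWH = 0.10

      gpus = {
          "older GPU": {"tflops": 100, "watts": 400},
          "newer GPU": {"tflops": 300, "watts": 500},
      }

      for name, spec in gpus.items():
          kwh_per_hour = spec["watts"] / 1000  # energy drawn over one hour
          usd_per_hour = kwh_per_hour * ELECTRICITY_USD_PER_KWH
          print(f"{name}: ${usd_per_hour / spec['tflops']:.6f} per TFLOP-hour")

      # older GPU: $0.000400 per TFLOP-hour
      # newer GPU: $0.000167 per TFLOP-hour
      ```

      In this made-up example the older card still works fine, but every unit of compute costs about 2.4× more in electricity, which is the “no longer competitive” part.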

      Hyperscalers used to write GPUs down to zero value after three years; over the last couple of years they’ve all increased this to six.
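      The write-down itself is plain straight-line depreciation, so stretching the schedule from three years to six halves the annual expense (a sketch with a hypothetical purchase price):

      ```python
      # Straight-line write-down of a GPU to zero residual value.
      # The $30,000 purchase price is a hypothetical example.
      price_usd = 30_000

      for schedule_years in (3, 6):
          annual_expense = price_usd / schedule_years
          print(f"{schedule_years}-year schedule: ${annual_expense:,.0f} per year")

      # 3-year schedule: $10,000 per year
      # 6-year schedule: $5,000 per year
      ```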

    • MonkderVierte@lemmy.zip · 4 hours ago (edited)

      But transistors break after what, 100,000 cycles? GPUs can get “used up”. And if your computing center has twice the running costs due to old, less efficient hardware, it isn’t competitive.

      Edit: it looks like transistors can partially recover with sleep cycles.

      Edit 2: the cycle-count figure was for flash storage. It looks like it’s higher for compute chips?

      • Orygin@sh.itjust.works · 7 hours ago

        I doubt transistors on a GPU die break after 100k cycles; they run at gigahertz frequencies, so some switch billions of times a second.
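        Back-of-the-envelope (assuming a 2 GHz clock, a hypothetical but typical order of magnitude):

        ```python
        # How long 100,000 switching cycles would last at GPU clock speed.
        CLOCK_HZ = 2e9      # assumed 2 GHz clock (hypothetical)
        CYCLES = 100_000    # the endurance figure from the comment above

        lifetime_s = CYCLES / CLOCK_HZ
        print(f"{CYCLES:,} cycles elapse in {lifetime_s * 1e6:.0f} microseconds")

        # 100,000 cycles elapse in 50 microseconds
        ```

        So if logic transistors really wore out after 100k cycles, a GPU would die within its first millisecond of operation. The 100k figure is a flash-memory program/erase endurance number, as the edit above notes.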

    • embed_me@programming.dev · 9 hours ago

      I’m not an expert, but I was under the impression that electronic components (including silicon chips and their internals) age and give out on a timescale of decades.