You’re not productive if you don’t use a lot of AI, says guy who makes all of his money selling AI hardware

  • REDACTED@infosec.pub · 25 points · 6 hours ago

    Something I don’t understand: AI coding is mostly useful for common code, snippets, the easy stuff. What Nvidia does (drivers, optimization, chip design, etc.) is something I imagine there’s close to zero AI training data for, so what can they realistically even use it for so much?

    • scytale@piefed.zip · 8 points · 5 hours ago

      I’m gonna take a guess that a big portion of it is infrastructure-as-code, the operations side and not product development itself. I work in the operations side of things and we never touch the product at all, but we deal with a lot of code due to how backend infrastructure is built and maintained now, especially if you’re in the cloud.
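To make that concrete, here’s a minimal sketch of the kind of repetitive infrastructure-as-code boilerplate operations teams churn out, which is exactly where AI assistants tend to get used. The service names and settings below are invented for illustration, not anything Nvidia actually runs:

```python
import json

# Hypothetical example: repetitive per-service deployment config,
# the sort of boilerplate commonly generated with AI assistance.
# All names and values here are made up for illustration.
services = ["api", "worker", "scheduler"]

def make_service_config(name: str) -> dict:
    """Build one service's deployment config (illustrative only)."""
    return {
        "name": name,
        "replicas": 2,
        "health_check": {"path": f"/{name}/healthz", "interval_s": 30},
        "autoscaling": {"min": 2, "max": 10, "cpu_target": 0.6},
    }

config = {"services": [make_service_config(s) for s in services]}
print(json.dumps(config, indent=2))
```

Nearly identical stanzas repeated across dozens of services is why this side of the house generates so much code despite never touching the product.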

    • givesomefucks@lemmy.world · 8 points · 5 hours ago

      “so what can they realistically even use it for so much?”

      Burn money on AI tokens so it looks like AI could be profitable some day, so people keep investing in AI companies that can then buy Nvidia chips…

      You’re thinking of it like “how can AI make a better product”

      They’re looking at it as “how can we sell more chips”

      Two very different questions with very different answers.

      It’s a house of cards, and Nvidia can’t afford to acknowledge that no one wants AI or knows how to make it profitable.

    • NinjaTurtle@feddit.online · 4 points · edited · 6 hours ago

      My guess is code review and troubleshooting. Not really sure; I’ve only used it for code templates and for troubleshooting ideas to look into.

      The most use I’ve found is rewriting documents in a specific way, but only after I write them first. Then I go back and forth, just to keep the tone consistent.

    • frongt@lemmy.zip · 2 points · 5 hours ago

      There’s plenty of driver code available. All of Linux and BSD, plus whatever internal stuff they have. Optimization is pretty generic.

      Chip design, maybe not, but I imagine you could train an AI on the principles, generate a bunch of candidates, then benchmark them in simulation.
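The “generate candidates, then benchmark in simulation” idea can be sketched as a toy random search. Everything here is invented for illustration: the design parameters, their ranges, and the scoring function are stand-ins, not real chip-design metrics or anything Nvidia is known to use:

```python
import random

# Toy sketch of generate-and-benchmark: sample random candidate
# "designs", score each with a stand-in simulation, keep the best.
# Parameter names and the scoring formula are purely illustrative.
random.seed(0)

def random_candidate() -> dict:
    """Sample one made-up design point."""
    return {
        "cache_kb": random.choice([256, 512, 1024]),
        "pipeline_depth": random.randint(8, 20),
        "clock_ghz": round(random.uniform(1.0, 3.0), 2),
    }

def simulate_score(c: dict) -> float:
    """Stand-in for a simulated benchmark; higher is better.
    Pretends deeper pipelines help up to a sweet spot, then hurt."""
    depth_penalty = abs(c["pipeline_depth"] - 14)
    return c["clock_ghz"] * (c["cache_kb"] ** 0.5) - depth_penalty

candidates = [random_candidate() for _ in range(50)]
best = max(candidates, key=simulate_score)
print(best)
```

In a real flow the scoring step would be an expensive hardware simulation and the generator a trained model rather than `random.choice`, but the loop structure is the same.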