• Dran@lemmy.world · 7 hours ago

    Inference is dirt cheap in comparison. Hundreds to thousands of concurrent users can be served by hardware costing in the high-thousands to low-ten-thousands.

    Training those same foundation models takes weeks to months of time on tens to hundreds of millions of dollars' worth of hardware.

    • AbouBenAdhem@lemmy.world · 6 hours ago

      Yeah, but in theory you only need to train once, while inference costs are ongoing and scale up with usage.

      I guess it’s ultimately a business decision for AI companies to weigh how often retraining is worth the cost.
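
      The trade-off being weighed here can be sketched with a toy amortization calculation. All numbers below are made up purely for illustration (the thread only gives order-of-magnitude ranges); the point is just that a one-time training cost shrinks per query as usage grows, while the marginal inference cost stays flat.

      ```python
      # Toy sketch: amortized cost per query = training cost spread over all
      # queries, plus a flat marginal inference cost. Numbers are hypothetical.

      TRAINING_COST = 50_000_000        # hypothetical one-time training spend ($)
      INFERENCE_COST_PER_QUERY = 0.002  # hypothetical marginal cost per query ($)

      def cost_per_query(total_queries: int) -> float:
          """Average cost per query once training is amortized over usage."""
          return TRAINING_COST / total_queries + INFERENCE_COST_PER_QUERY

      for q in (10**6, 10**8, 10**10):
          print(f"{q:>14,} queries -> ${cost_per_query(q):.4f}/query")
      ```

      At low volume the (sunk) training cost dominates; at high volume nearly all ongoing spend is inference, which is why retraining frequency becomes the business lever.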

      • JGrffn@lemmy.world · 5 hours ago

        Yeah, the thing is I don’t think they ever stop training. At this point I’d assume they have multiple training pipelines to try different shit out, just queued up to hit the big farms as soon as the last models are done training.

        Resting isn’t a thing in capitalism.