• pipi1234@lemmy.world
    14 hours ago

    My guess is they are using the Netflix playbook all over again.

    Get you hooked on the extreme convenience, much like a drug addict, and then pump up the price or flood every prompt with ads.

    That’s my best case.

    Worst case is that, alongside the rising adoption, they will start surreptitiously but effectively modifying general knowledge, thought and behaviour in ways even the best marketer would blush at.

    • Tanoh@lemmy.world
      10 hours ago

      Get you hooked on the extreme convenience, much like a drug addict, and then pump up the price or flood every prompt with ads.

      There is a big difference between “normal” SaaS and LLMs.

      In a normal SaaS you get a lot of benefit from being at scale. Going from 10000 to 1000000 users is not that much harder than going from 1000 to 10000. Once you have your scaling set up you can just add more servers and/or data centers. But most importantly, the cost per user goes waaay down.

      With AI it just doesn’t scale like that: the 500000th user will most likely cost about as much as the 5th. So I don’t think doing a Netflix/Spotify/etc is going to work unless they can somehow make it a lot cheaper per user. OpenAI fails to turn a profit even on their most expensive tiers.

      Edit: to clarify, obviously you get some small benefits from being at scale. Better negotiating power, already having server racks, etc. But a traditional SaaS gets those same benefits as well, plus so much more that an LLM service doesn’t, because the cost per user doesn’t drop.
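
      To put very rough numbers on that difference, here is a quick Python sketch; every figure in it is a made-up assumption, nothing from any real provider:

      # Back-of-the-envelope comparison; every number here is an illustrative assumption.
      def saas_cost_per_user(users, fixed_monthly=50_000, per_user=0.05):
          # Traditional SaaS: a big fixed infrastructure bill amortised over users,
          # plus a tiny marginal cost per user.
          return fixed_monthly / users + per_user

      def llm_cost_per_user(users, fixed_monthly=50_000, prompts_per_user=1_000,
                            gpu_seconds_per_prompt=2, gpu_second_cost=0.001):
          # LLM service: every prompt burns GPU time, so the marginal cost per user
          # barely shrinks no matter how many users you add.
          inference = prompts_per_user * gpu_seconds_per_prompt * gpu_second_cost
          return fixed_monthly / users + inference

      for n in (10_000, 100_000, 1_000_000):
          print(f"{n:>9} users   SaaS: {saas_cost_per_user(n):6.2f}/user   "
                f"LLM: {llm_cost_per_user(n):6.2f}/user")

      With those assumptions the SaaS figure collapses towards the marginal cost as the user base grows, while the LLM figure stays pinned near the per-prompt inference cost.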

      • pipi1234@lemmy.world
        6 hours ago

        Is this correct? I was under the impression that the most expensive part of an LLM is the training, and once that’s done the cost of running a prompt is negligible.

        I get your point that this last part doesn’t scale well, but the far larger cost of training must get very diluted if they distribute it across a large user base.
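
        One way to sanity-check that intuition with made-up numbers (every figure below is an assumption, not a real figure from any provider):

        # Toy amortisation check; all numbers are made-up assumptions.
        training_cost = 100_000_000   # one-off training run
        paying_users = 10_000_000     # user base the cost is spread over
        months = 12                   # amortisation window
        training_per_user_month = training_cost / paying_users / months
        print(f"training share:  {training_per_user_month:.2f} per user per month")

        # Inference never amortises: each prompt costs GPU time, every month.
        prompts_per_user_month = 1_000
        cost_per_prompt = 0.002       # assumed serving cost per prompt
        print(f"inference share: {prompts_per_user_month * cost_per_prompt:.2f} per user per month")

        With those figures the training share does get diluted to well under a dollar per user per month; whether the recurring inference bill stays bigger than that is exactly the question.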

        • cazssiew@lemmy.world
          5 hours ago

          I agree, scaling users isn’t the issue; what is, is the never-ending chase after the mirage that is AGI. They’ll throw every processing cycle they can muster at that fever dream, and that’s the financial black hole.

          • pipi1234@lemmy.world
            2 hours ago

            Yes, but don’t underestimate the power of centralisation.

            Six months ago you could set up a server for running a decent local LLM for under 800 (see the rough sketch at the end of this comment).

            By increasing the hardware demands and pushing the price of hardware up, they are effectively gatekeeping access to LLMs.

            I think the plan is that we will need to rely on these companies for compute power and LLM services, and then they can do all sorts of nefarious things.
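
            For reference, by “a decent local LLM” I mean something like a quantised 7B model. A minimal sketch of running one, assuming the llama-cpp-python bindings and a GGUF model file you have already downloaded (the path below is a placeholder, not a recommendation):

            # Minimal local-inference sketch; assumes llama-cpp-python is installed
            # and a quantised GGUF model sits at the (placeholder) path below.
            from llama_cpp import Llama

            llm = Llama(model_path="./models/some-7b-model.Q4_K_M.gguf", n_ctx=4096)
            out = llm("Explain in one paragraph why inference costs matter:", max_tokens=200)
            print(out["choices"][0]["text"])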

    • Postimo@lemmy.zip
      15 hours ago

      then pump up the price or flood every prompt with ads.

      “Sure, I found that document you needed, and with it I also found this great new game I know you’ll love: Raid: Shadow Legends. It’s a free-to-pla…”

      I cannot wait for companies to spend 300 dollars per user per month for this convenience.