• ch00f@lemmy.world
    2 hours ago

    Well, not off to a great start.

    To be clear, I think getting an LLM to run locally at all is super cool, but saying “go self-hosted” sort of glosses over the fact that getting a local LLM to do anything close to what ChatGPT can do is a very expensive hobby.