cross-posted from: https://lemmy.world/post/43768262

Some may have believed they were against AI being used for war. It turns out they just don’t want it making the final kill decision.

The argument from their supporters is that AI in the military was inevitable, so their position is a reasonable one.

  • partial_accumen@lemmy.world · 4 hours ago

    Depends on your definition of “good-ish”. Do you mean:

    • performance/accuracy?
    • ethical origin?
    • ethical ongoing operation?
    • privacy/future data harvesting concerns?

    Running one locally on your own hardware would likely reach “good-ish” with some sacrifices in performance/accuracy (unless you’ve got a lot of expensive hardware to run very large models). As for ethical origins, there are a few small models trained on public-domain/non-stolen content, but their capabilities are far more limited.

    • criticon@lemmy.ca · 3 hours ago

      I mean good-ish in the lesser-evil sense. I don’t expect any of them to be 100% ethical, but some are a lot worse than others.

      I don’t really have a computer capable of running a local AI. I have an i3 laptop from around 6 years ago with 12 GB of RAM and integrated graphics.

      • partial_accumen@lemmy.world · 3 hours ago

        > I mean good-ish in the lesser-evil type of thing. I don’t expect any of those to be 100% ethical but there are some that are a lot worse than others

        Ethics are subjective. “Good-ish” to you may mean you’re fine if it’s trained on copyrighted works as long as it wasn’t done with electricity from diesel generators belching exhaust into the local Memphis atmosphere (I’m looking at you, Grok). Llama doesn’t do the diesel generator thing, but it’s a product of the Facebook corporation. So is that “good-ish” to you or not? I don’t know. That’s up to you.

        Your i3 laptop with 12 GB of system RAM can absolutely run a local LLM. This is where that “performance/accuracy” question I raised comes in: it won’t be very fast, and you won’t be able to run the largest models, but if your needs are light, light models exist. Give this a read
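
        To put rough numbers on what “light models” means here, a back-of-envelope sketch: quantized weights take roughly (parameters × bits per weight ÷ 8) bytes, plus some runtime overhead. The flat 1 GB overhead and the 6 GB RAM budget below are my illustrative assumptions, not benchmarks:

        ```python
        # Back-of-envelope check: can a quantized model fit in a given RAM budget?
        # The overhead constant and budget are illustrative assumptions, not measurements.

        def model_ram_gb(params_billions: float, bits_per_weight: int, overhead_gb: float = 1.0) -> float:
            """Approximate RAM needed: quantized weights plus a flat runtime/KV-cache overhead."""
            weight_gb = params_billions * 1e9 * bits_per_weight / 8 / 1e9
            return weight_gb + overhead_gb

        # Assume ~6 GB usable on a 12 GB machine, leaving room for the OS and apps.
        ram_budget_gb = 6.0

        for name, params_b in [("1B", 1.0), ("3B", 3.0), ("7B", 7.0), ("13B", 13.0)]:
            need = model_ram_gb(params_b, bits_per_weight=4)  # 4-bit quantization
            print(f"{name} model: ~{need:.1f} GB -> {'fits' if need <= ram_budget_gb else 'too big'}")
        ```

        By this rough estimate, 4-bit models up to around 7B parameters fit in that budget, which matches the general advice that small quantized models are the ones worth trying on older laptops.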