• XLE@piefed.social
    2 hours ago

    Come on, don’t be so dishonest. Compare similar things. This “tool” is designed to create humanlike realtime communication, and it’s run by a billionaire rapist who could just as easily have groomed the killer himself (thanks to it being a black-box “live service”, we don’t know where the grooming came from, do we?).

    I remember your previous comment from another thread:

    Vulnerable people don’t get to outsource responsibility.

    But apparently billionaires do.

    • iegod@lemmy.zip
      40 minutes ago

      The tool isn’t sentient; it operates on learned weights and produces output that mimics its training set. LLMs are pretty impressive at what they can output, but it would be dishonest to attribute human qualities to them. There are decades of implementations of various AI techniques attempting, with varying degrees of success, to achieve the same. It is on the technical basis, and the technical basis alone, that we should be carefully considering legal constraints.

      How much a CEO is worth, how trustworthy they are, what circles they run in, shouldn’t be part of that consideration.

      That doesn’t mean I think Altman isn’t a turd who can suck a fat one.

      • XLE@piefed.social
        20 minutes ago

        Like I said, it is built to be human*like*. Of course it’s not human or sentient, but Sam Altman sells ChatGPT with humanizing language, ascribes human attributes to it, and personally subsidized the grooming of people into suicide and homicide.