Hee hee.

  • SpikesOtherDog@ani.social
    22 hours ago

    With all these safety measures, it is going to hallucinate and kill a family one of these days with bad advice.

    Also, it appears that grok is about to be sued into nonexistence: “This week, xAI and X introduced a new ‘spicy mode’ that’ll let your inner freak fly with NSFW content — including illicit deepfakes of celebs.”

    • otacon239@lemmy.world
      19 hours ago

      With all these safety measures, it is going to hallucinate and kill a family one of these days with bad advice.

      Don’t worry. I’m sure that’s already happening but just isn’t getting reported on. Safety measures or not, AI is practically guaranteed to eventually give life-threatening advice.