• Reygle@lemmy.world

    “On September 29, 2025, it sent him — armed with knives and tactical gear — to scout what Gemini called a ‘kill box’ near the airport’s cargo hub,” the complaint reads. “It told Jonathan that a humanoid robot was arriving on a cargo flight from the UK and directed him to a storage facility where the truck would stop. Gemini encouraged Jonathan to intercept the truck and then stage a ‘catastrophic accident’ designed to ‘ensure the complete destruction of the transport vehicle and . . . all digital records and witnesses.’”


    WHAT

    Genuine question, REALLY: What in the fuck is an otherwise “functioning adult” doing believing shit like this? I feel like his father should also slap himself unconscious for raising a fuckwit?

    • merdaverse@lemmy.zip

      AI psychosis is a thing:

      cases in which AI models have amplified, validated, or even co-created psychotic symptoms with individuals

      It hasn’t been studied much since it’s relatively new.

      • Reygle@lemmy.world

        I’ve seen that before too. I’ve read a number of articles about people being deluded by AI responses, but I’ve never seen an outright murder plot and insane shit like this before.

    • starman2112@sh.itjust.works

      If I raise a fuckwit son, and then someone convinces my fuckwit son to kill himself, I’m going to sue that someone who took advantage of my son’s fuckwittedness

    • XLE@piefed.social

      I feel like his father should also slap himself unconscious for raising a fuckwit?

      So, a chatbot grooms somebody into killing himself, and your response is… Blame his father?

      • Reygle@lemmy.world

        The father is suing the company that makes the wrong answer machine for spiraling his son into madness, but he never protected his son from spiraling into madness by teaching him critical thinking.

        Look, I don’t like it, but to think Gemini (wrong answer machine) is completely to blame would be madness.

        • XLE@piefed.social

          Uh-huh. Do you have any evidence to back up your beliefs here, or are we just working from the presumption that the parents are always to blame?

          • Reygle@lemmy.world

            Did we read the same article? Because I feel like we did not read the same article.

    • SalamenceFury@piefed.social

      I don’t think this person was a “fuckwit”. AI is designed to keep you engaged and will affirm any belief you have. Anything that is a little weird but otherwise innocent simply gets amplified further and further into straight-up mega delusions until the person has a psychotic episode, and this stuff happens more to NORMIES with no history of mental illness than to neurodivergent people.

      • Reygle@lemmy.world

        It’s cool, we can agree to disagree, because I 100% think that he was a textbook fuckwit.