• 鳳凰院 凶真 (Hououin Kyouma)@sh.itjust.works · 27 points · edited · 11 hours ago

    I saw a video of a woman in China (Edit: actually she was Korean) talking to an AI recreation of her deceased child through VR.

    I felt so creeped out by it. Like wtf, if I die, I want my mom to remember me, not talk to a fucking AI IMPOSTER.

    Edit: Looked it up — it was actually a Korean woman, I mixed it up: https://www.reddit.com/r/MadeMeCry/comments/12zkqy8/mother_meets_deceased_daughter_through_vr/

    • Datz@szmer.info · 4 points · 9 hours ago

      There’s a whole company/LLM built around doing exactly that, whose CEO gave a TED talk about it.

      https://m.youtube.com/watch?v=-w4JrIxFZRA

      After that, I actually had a pretty wild idea about someone using it to replace dead or missing people in chats. Imagine the horror of finding out your friend died months ago, or got kidnapped. Horribly impractical, but it sounds like a good novel.

      • Avicenna@programming.dev · 5 points · 7 hours ago

        “Watch me talk about how I get rich off of exploiting people’s emotional fragility while trying to pass it off as providing closure and a community service.”

      • Windex007@lemmy.world · 17 points · edited · 10 hours ago

        If someone wants an AI companion, fine. If it’s a crazy good one, fine.

        But it’s strictly predatory to design one to make someone feel like they’re talking to a specific real person who has died, ESPECIALLY someone dealing with that kind of grief.

        You had to boot the mom out of the painting. There was no ambiguity on that one.