• db2@lemmy.world · 3 days ago

    Japan

    The same place that sold used panties in vending machines, and this is somehow shocking.

    • Essence_of_Meh@lemmy.world · 3 days ago

      There’s still a difference between a piece of clothing that may or may not have been worn by anonymous women and images made of real people without their knowledge or consent. Not trying to defend those vending machines, but the two aren’t the same in terms of results and potential effects on the victims.

      • ThePyroPython@lemmy.world · 3 days ago

        I think this is more a comment on the particular strain of misogyny that pervades Japanese society, and on not being surprised that a country which needs women-only metro carriages to prevent sexual harassment also has a problem with AI-generated deepfakes. Though the latter is a problem in many other countries with access to these image models too.

        • Essence_of_Meh@lemmy.world · 3 days ago

          That’s a fair point, but, as you already mentioned, deepfakes are unfortunately a pretty universal problem. The type of misogyny in Japan, compared to, say, the US or Europe, doesn’t seem to affect that much (if at all). Either way, it’s a terrible practice without sufficient ways to combat it at the moment, which makes me pretty worried about how things will evolve - in terms of the “content” itself, the effect on victims’ lives, and the laws that will come out of this (some probably made in the worst possible knee-jerk, reactionary ways).

      • jacksilver@lemmy.world · 2 days ago

        I tend to lean more towards the problem being distribution rather than creation, so I’m curious about your opinion on this: is there a difference between me imagining a sexual act with someone without their consent vs. writing/drawing/deepfaking it?

        • Essence_of_Meh@lemmy.world · 2 days ago

          In my opinion, absolutely. Whatever happens in your head stays in your head and doesn’t affect the other person unless you take active steps to make it happen. Images or videos, on the other hand, can not only be distributed far more easily, even accidentally, but also have a much higher chance of affecting people’s lives (how do you prove you didn’t take nude photos of yourself, for example, let alone get people to believe it?). They can lead to loss of reputation, harassment, bullying and serious mental health issues for the victims (trust issues, anxiety, depression, self-harm) - imagination can’t really do that on its own.

          Perhaps distribution is the real problem, but easy access to tools that can create convincing results quickly and effortlessly makes said distribution far more likely.

          • jacksilver@lemmy.world · 2 days ago

            Thanks for sharing your perspective. It sounds like it’s the potential for harm/damage rather than the act itself that makes it an issue for you?

            • Essence_of_Meh@lemmy.world · 2 days ago

              I still think the act itself is pretty gross, but yeah, the harm is the important part for me - and I don’t mean that just in the case of sexual images. It’s also a problem with content created to damage people’s reputations in other ways or to influence the sociopolitical situation (something that’s already happening around the world).

              The harmful potential of generative AI is on a completely different scale than the photoshopped images already mentioned by others in this thread. That doesn’t mean genAI can’t be used in fun and interesting ways, but stuff like what’s described in the linked article is a big no-no for me.

      • toastmeister@lemmy.ca · 3 days ago

        Photoshops were a thing before AI. Nobody cared because it’s just a weird thing creeps do in private.

        • Essence_of_Meh@lemmy.world · 3 days ago

          Photoshop requires at least some skill and doesn’t allow for mass production of fakes the way generative AI does. Same problem, different scale.

          • Goretantath@lemm.ee · 3 days ago

            Pasting a face over a body in Photoshop is just as easy as snapping a Polaroid of someone and pasting their face onto a porn mag. AI just makes it faster.