• mrmaplebar@fedia.io
    3 hours ago

    What’s their motivation? “Protecting the children” from pictures of boobs?

    Doubtful.

    Instead, I think the AI companies are looking for ways to more easily distinguish human-made “content” from bot-made content, in order to decrease the amount of generative slop that ends up being fed back into their training data.

    • brucethemoose@lemmy.world
      50 minutes ago

      It’s anticompetitiveness.

      They want to squash open models, and anyone too small to comply with this.

      I say this in every thread, but the real AI “battle” is open-weights ML vs OpenAI style tech bro AI. And OpenAI wants precisely no one to realize that.

      • mrmaplebar@fedia.io
        25 minutes ago

        “Open weights” isn’t worth a shit either.

        If you don’t have access to the training data or the computational power to bake it into a model, you are beholden to somebody’s (or rather, some corporation’s) binary blob. Effectively, we’re talking about the difference between freeware and FOSS.

        Not that it matters, because all of this generative AI stuff is for talentless tech bro chuds in the first place…

    • buddascrayon@lemmy.world
      1 hour ago

      Considering that AI-based age verification now usually involves taking a headshot and uploading it to the company’s servers, I’m guessing there’s an element of facial data collection involved as well.