We Asked A.I. to Create the Joker. It Generated a Copyrighted Image.
Artists and researchers are exposing copyrighted material hidden within A.I. tools, raising fresh legal questions.

  • abhibeckert@lemmy.world
    10 months ago

    But where is the infringement?

This NYT article includes the same copyrighted images, and the NYT surely hasn’t paid for a license. It’s obviously fair use in both cases, and the NYT’s claim that “it might not be fair use” is just ridiculous.

Worse, the NYT includes exact copies of the images, while the AI ones are merely very close to the originals. That’s the difference between uploading a video of yourself playing a Taylor Swift cover and uploading one of Taylor Swift’s own music videos to YouTube.

Even worse, the NYT intentionally distributed the copyrighted images, while Midjourney did so unintentionally and specifically states that doing so is a breach of its terms of service. Your account might be banned if you’re caught using these prompts.

    • dragontamer@lemmy.world
      10 months ago

      But where is the infringement?

Do the training weights contain the data? Are the servers copying said data on a mass scale, in a way that the original copyright holders don’t want or can’t control?

      • Auli@lemmy.ca
        10 months ago

Their response will be: “we don’t know, we can’t understand what it’s doing.”

        • dragontamer@lemmy.world
          10 months ago

Their response will be: “we don’t know, we can’t understand what it’s doing.”

What the fuck kind of response is this? It’s just a fucking neural network running on GPUs with convolutional kernels. For fuck’s sake, turn on your damn brain.

Generative AI is actually one of the easier subjects to comprehend here. It’s just calculus: using derivatives to backpropagate the error into weight updates that minimize it. Lather, rinse, repeat for a billion iterations on a mass of GPUs (i.e., 20 TFLOP compute systems) for several weeks.
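The loop being described is plain gradient descent. Here’s a minimal, self-contained sketch in Python: a single weight, squared error, and made-up data (none of this is Midjourney’s actual code, just the calculus the comment refers to):

```python
# Toy version of the train loop: compute the error's derivative with
# respect to the weight, then nudge the weight downhill. Repeat.

def train(xs, ys, lr=0.01, steps=1000):
    w = 0.0
    for _ in range(steps):
        # d/dw of mean (w*x - y)^2 is the mean of 2*(w*x - y)*x
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad  # step against the gradient to reduce the error
    return w

# data generated by y = 3x, so gradient descent should recover w ≈ 3
print(train([1, 2, 3, 4], [3, 6, 9, 12]))
```

Real training is this same update rule applied to billions of weights at once, with the derivatives propagated backwards through the layers instead of written out by hand.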

Come on, this stuff has been well understood in Comp. Sci. for a long time. Not only 20 years ago when I learned it, but even more so today: now that AI is all the hype, more and more people are picking up the basics.