• Ebby@lemmy.ssba.com

    That’s a good litmus test. If asking/paying artists to train your AI destroys your business model, maybe you’re the arsehole. ;)

    • BrianTheeBiscuiteer@lemmy.world

      Not only that, but their business model wouldn't hold up if they were required to provide their model weights for free on the grounds that the material that went into them was "free".

      • T156@lemmy.world

        There’s also an argument that if the business was that reliant on free things to start with, then it shouldn’t be a business.

        No one would bat an eye if the CEO of a real estate company sobbed that it's the end of the rental market because his company is no longer allowed to get houses for free.

          • finder@lemmy.world

            Extracting free resources of the land

            Not to be contrarian, but there is a cost to extracting those "free" resources: labor, equipment, transportation, lobbying (AKA bribes, for the non-Americans), processing raw material into something useful, research and development, et cetera.

  • efrique@lemm.ee

    I’m fine with this. “We can’t succeed without breaking the law” isn’t much of an argument.

    Do I think the current copyright laws around the world are fine? No, far from it.

    But why should they get an exception to the rules, one that will make them billions, while the rest of us can be prosecuted in severe and dramatic fashion for much less? Try letting the RIAA know you have a song downloaded on your PC that you didn't pay for - tell them it's for "research and training purposes", just like AI uses stuff it didn't pay for - and see what I mean by severe and dramatic.

    It should not be one rule for the rich guys to get even richer while the rest of us eat dirt.

    Figure out how to fix the laws so they're fair for everyone, including figuring out a way to compensate the people whose IP you've been stealing.

    Until then, deal with the same legal landscape as everyone else. Boo hoo.

    • Kühlschrank@lemmy.world

      I also think it's really rich that at the same time they're whining about copyright, they're trying to go private. I feel like the "Open" part of OpenAI is the only thing that could possibly begin to offset their rampant theft, and even then they're not nearly open enough.

      • Tetsuo@jlai.lu

        They haven't released anything of value as open source recently.

        Sam Altman said they were on the wrong side of history about this when DeepSeek released.

        They are not open anymore; I want that to be clear. They decided to stop releasing open source because 💵💵💵💵💵💵💵💵.

        So yeah, I can face huge fines for downloading copyrighted material where I live, yet they get to make money off that same material without even releasing anything open source? Fuck no.

  • CMDR_Horn@lemmy.world

    Good. I hope this is what happens.

    1. LLM algorithms can be maintained and sold to corpos to scrape their own data, so they can use them for in-house tools or re-sell them to their own clients.
    2. Open-source LLMs can be made available for end users to do the same with their own data, or to scrape what's available in the public domain for whatever they want, so long as they don't re-sell.
    3. Altman can go fuck himself
  • killeronthecorner@lemmy.world

    This is basically a veiled admission that OpenAI are falling behind in the very arms race they started. Good, fuck Altman. We need less ultra-corpo tech bro bullshit in prevailing technology.

  • psyspoop@lemm.ee

    But I can’t pirate copyrighted materials to “train” my own real intelligence.

      • This is fine🔥🐶☕🔥@lemmy.world

        And he also said “child pornography is not necessarily abuse.”

        In the US, it is illegal to possess or distribute child pornography, apparently because doing so will encourage people to sexually abuse children.

        This is absurd logic. Child pornography is not necessarily abuse. Even if it was, preventing the distribution or possession of the evidence won't make the abuse go away. We don't arrest everyone with videotapes of murders, or make it illegal for TV stations to show people being killed.

        Wired has an article on how these laws destroy honest people’s lives.

        https://web.archive.org/web/20130116210225/http://bits.are.notabug.com/

        Big yikes from me whenever I see him venerated.

      • ccunning@lemmy.world

        Yes, and he killed himself after the FBI threw the book at him for doing exactly what these AI assholes are now doing without repercussion.

        • FauxLiving@lemmy.world

          And for some reason suddenly everyone leaps back to the side of the FBI and copyright because it’s a meme to hate on LLMs.

          It’s almost like people don’t have real convictions.

          You can’t be Team Aaron when it’s popular and then Team Copyright Maximalist when the winds change and it’s time to hate on LLMs or diffusion models.

      • gandalf_der_12te@discuss.tchncs.de

        Yeah, you can train your own neural network on pirated content, all right, but you better not enjoy that content at the same time or have any feelings while watching it, because that’s not covered by “training”.

  • Ensign_Crab@lemmy.world

    If giant megacorporations can benefit by ignoring copyright, us mortals should be able to as well.

    Until then, you have the public domain to train on. If you don’t want AI to talk like the 1920s, you shouldn’t have extended copyright and robbed society of a robust public domain.

    • Eugene V. Debs' Ghost@lemmy.dbzer0.com

      Either we get full authority to do whatever we want with copyright, or the companies have to abide by the same rules the plebs and serfs do and only take from media a century old, or stuff that fell through the cracks like Night of the Living Dead.

      Copyright has always been a farce and a lie for the corporations, so it's nothing new that it's "Do as I say, not as I do."

  • sloppychops@lemmy.ca

    If everyone can "train" themselves on copyrighted works, then I say "fair game."

    Otherwise, get fucked.