Copyright class actions could financially ruin AI industry, trade groups say.
AI industry groups are urging an appeals court to block what they say is the largest copyright class action ever certified. They’ve warned that a single lawsuit raised by three authors over Anthropic’s AI training now threatens to “financially ruin” the entire AI industry if up to 7 million claimants end up joining the litigation and forcing a settlement.
Last week, Anthropic petitioned to appeal the class certification, urging the court to weigh questions that the district court judge, William Alsup, seemingly did not. Alsup allegedly failed to conduct a “rigorous analysis” of the potential class and instead based his judgment on his “50 years” of experience, Anthropic said.
“If we have to pay for the intellectual property that we steal and repackage, our whole business model will be destroyed!”
One thing this whole AI training debacle has done for me: made me completely guilt-free in pirating things. Copyright law has been bullshit since Disney stuck their finger in it and if megacorps can get away with massively violating it, I’m not going to give a shit about violating it myself.
For me it was Disney floating the idea of asking that the wrongful death suit be dismissed because of the liability waiver in a Disney+ free trial.
I have the $$$, but I don’t agree with the terms for any of the streaming services, so I’ll just sail the seven seas and toss a doubloon (coin) to independent creators (my witchers) when I can.
I’m pretty much there too. The whole industry consolidates on the new things and charges us more as they make them worse. There are arguments to be made over the benefits of AI, but we all know it won’t be immune to the enshittification that has already ruined everything before it.
If I downloaded ten movies to watch with my nephew in the cancer ward, they’d sue me into oblivion. Downloading tens of millions of books and claiming your business model depends on it doesn’t make it okay. And sharing movies with my sick nephew would cause less harm to society and the environment than AI does.
I started my own streaming service with pirated content. My business model depends on that data on my server.
Same thing, but for some reason it’s different. They hate when we use their laws against them. Let’s root for them to rule against this class action so we can all benefit from copyright being thrown out. Or, alternatively, it kills the AI companies. Either way is a win.
deleted by creator
Worse, a (very bad) precedent will be set for future copyright cases
They’re not stealing anything. Nor are they “repackaging” anything. LLMs don’t work like that.
I know a whole heck of a lot of people hate generative AI with a passion but let’s get real: The reason they hate generative AI isn’t because they trained the models using copyrighted works (which has already been ruled fair use; as long as the works were legitimately purchased). They hate generative AI because of AI slop and the potential for taking jobs away from people who are already having a hard time.
AI Slop sucks! Nobody likes it except the people making money from it. But this is not a new phenomenon! For fuck’s sake: Those of us who have been on the Internet for a while have been dealing with outsourced slop and hidden marketing campaigns/scams since forever.
The only difference is that now—thanks to convenient and cheap LLMs—scammers and shady marketers can generate bullshit at a fraction of the cost and really, really quickly. But at least their grammar is correct now (LOL @ old school Nigerian Prince scams).
It’s humans ruining things for other humans. AI is just a tool that makes it easier and cheaper. Since all the lawsuits and laws in the world cannot stop generative AI at this point, we might as well fix the underlying problems that enable this bullshit. Making big AI companies go away isn’t going to help with these problems.
In fact, it could make things worse! Because the development of AI certainly won’t stop. It will just move to countries with fewer scruples and weaker ethics.
The biggest problem is (mostly unregulated) capitalism. Fix that, and suddenly AI “taking away jobs” ceases to be a problem.
Hopefully, AI will force the world to move toward the Star Trek future. Because generating text and images is just the start.
When machines can do just about everything a human can (and scale up really fast)—even without AGI—there’s no future for capitalism. It just won’t work when there’s no scarcity other than land and energy.
I respectfully disagree. Meta was caught downloading books from Libgen, a piracy site, to “train” its models. What AI models do in effect is scan information (i.e., copy), and distill and retain what they view as its essence. They can copy your voice, they can copy your face, and they can copy your distinctive artistic style. The only way they can do that is if the “training” copies and retains a portion of the original works.
Consider Shepard Fairey’s use of the AP’s copyrighted Obama photograph in producing the iconic “Hope” poster, and the resultant lawsuit. While the suit was ultimately settled, and the issue of “fair use” was a close call given the variation of the artwork from the original source photograph, the suit easily could have gone against Fairey, so settling was smart.
Also consider the litigation surrounding the use of music sampling in original hip hop works, which has clearly been held to be copyright infringement.
Accordingly, I think it is very fair to say that (1) AI steals copyrighted works; and (2) repackages the essential portions of those works into new works. Might a rewrite of copyright law be in order to embrace this new technology? Sure, but if I’m an actor, voice actor, author, or other artist and I can no longer earn a living because someone else has taken my work, stripped it down to its essence, and resold it cheaply without compensating me, I’m going to be pretty pissed off.
Hopefully, AI will force the world to move toward the Star Trek future.
Lol. The liberal utopia of Star Trek is a fantasy. Far more likely is that AI will be exploited by oligarchs to enrich themselves and further impoverish the masses, as they are fervently working towards right now. See, AI isn’t creative, it gives the appearance of being creative by stealing work created by humans and repackaging it. When artists can no longer create art to survive, there will be less material for the AI models to steal, and we’ll be left with soulless AI slop as our de facto creative culture.
I respectfully disagree. Meta was caught downloading books from Libgen, a piracy site, to “train” its models.
That action itself can and should be punished. Yes. But that has nothing to do with AI.
What AI models do in effect is scan information (i.e., copy), and distill and retain what they view as its essence. They can copy your voice, they can copy your face, and they can copy your distinctive artistic style. The only way they can do that is if the “training” copies and retains a portion of the original works.
Is that what people think is happening? You don’t even have a layman’s understanding of this technology. At least watch a few videos on the topic.
I think that copying my voice makes this robot a T-1000, and T-1000s are meant to be dunked in lava to save Sarah Connor.
But that has nothing to do with AI.
Absurd. It’s their entire fucking business model.
Meaning it would be illegal even if they weren’t doing anything with AI…
deleted by creator
Meta literally torrented an insane amount of training materials illegally, from a company that was sued into the ground and forced to dissolve because of distributing stolen content
“When machines can do just about everything a human can (and scale up really fast)—even without AGI—there’s no future for capitalism.”
This might be one of the dumbest things I’ve ever read.
deleted by creator
It’s humans ruining things for other humans. AI is just a tool that makes it easier and cheaper
That’s the main point, though: the tire fire of humanity is bad enough without some sick fucks adding vast quantities of accelerant in order to maximize profits.
Since all the lawsuits and laws in the world cannot stop generative AI at this point
Clearly that’s not true. They’ll keep it up for as long as it’s at all positive to extract profits from it, but not past that. Handled right, this class action could make the entire concept poisonous from a profiteering perspective for years, maybe even decades.
we might as well fix the underlying problems that enable this bullshit.
Of COURSE! Why didn’t anyone think to flick off the switches marked “unscrupulous profiteering” and “regulatory capture”?!
We’ll have this done by tomorrow, Monday at the latest! 🙄
Making big AI companies go away isn’t going to help with these problems.
The cancer might be the underlying cause but the tumor still kills you if you don’t do anything about it.
the development of AI certainly won’t stop.
Again, it WILL if all profitability is removed.
It will just move to countries with fewer scruples and weaker ethics
Than silicon valley? Than the US government when ultra-rich white men want something?
No such country exists.
The biggest problem is (mostly unregulated) capitalism
Finally right about something.
Fix that, and suddenly AI “taking away jobs” ceases to be a problem.
“Discover the cure for cancer and suddenly the tumors in your brain, lungs, liver, and kidneys won’t be a problem anymore!” 🤦
Hopefully, AI will force the world to move toward the Star Trek future
Wtf have you been drinking??
deleted by creator
Of course many of them are stealing. That’s already been clearly established. As for the other groups, the ones that haven’t gotten caught stealing yet, perhaps it’s just that they haven’t gotten caught, and not that they haven’t been pirating things.
I like your rant, but I would like it better if the facts were facts.
When you “steal” something, the original owner doesn’t have that thing anymore.
When you “copy” something, the original owner still had it.
Stop calling it stealing, damnit! We fought these wars with the MPAA and RIAA in the 90s. By calling it “stealing,” you’re siding with the old villains.
AI isn’t stealing or copying anything. It’s generating stuff. If it was truly copying things that would mean someone wrote all that “AI slop” that’s poisoning everyone’s search results.
I disagree that it is fair use. But, I was actually expecting the judiciary to say that it was. So, despite the ruling, I AM still mad that they used copyrighted works (including mine), in violation of the terms. (And, honestly, my terms [usually AGPLv3] are fairly generous.)
I’m also concerned about labor issues, and environmental impact, and general quality, but the unauthorized use of copyrighted works is still in the mix. And, if they are willing to call all my private viewing of torrented TV “theft”, I’m willing to call their selling of an interface to an LLM / NN that was trained on and may have incorporated (or may emit!) my works (in whole or in part) “theft”.
Labor issues are mostly solved by giving workers control of the means of production, not capital. Same old story.
Environmental impact is better policed independent of what the electricity/water is used for. We aren’t making a lot of headway there, but we need to rein in emissions (etc.) whether they’re being used to train LLMs or research cancer.
Quality… is subjective and I don’t think we are near the ceiling of that. And, since I don’t use “AI” for the above reasons, it really isn’t much of a concern to me.
LOL @ old school Nigerian Prince scams
They were bad on purpose. People responding to such bad writing are easy marks.
Because generating text and images is just the start.
But usually this only improves after an AI winter, meaning the whole sector crashes until someone finds a better architecture/technology. Except there are now billions involved.
That’s unfair. They also have to sue people who infringe on “their” IP. You just don’t understand what it’s like to be a content creator.
Hmm. I’m finding it hard to come up with a more clever response to them than:
“Good.”
Not clever, but shared
They have the entire public domain at their disposal.
If giant megacorporations didn’t want their chatbots talking like the 1920s, they shouldn’t have spent the past century robbing society of a robust public domain.
You had me at “financially ruin AI industry”.
If the appeals court denies the petition, Anthropic argued, the emerging company may be doomed. As Anthropic argued, it now “faces hundreds of billions of dollars in potential damages liability at trial in four months” based on a class certification rushed at “warp speed” that involves “up to seven million potential claimants, whose works span a century of publishing history,” each possibly triggering a $150,000 fine.
Maybe they should have thought of that before they ripped off a century’s worth of published literature?
Are they expecting me to feel bad for them? Don’t do the crime if you can’t do the time.
If your business model depends on not paying millions of people for the product of their labors, destroys the environment, and the product hallucinates and makes people psychotic, then your business deserves to die a quick and painful death.
I’m no fan of the copyright fuckery so commonly employed by (amongst others) the RIAA and MPAA, but this is honestly the best use of copyright law I can think of in recent memory.
It’s the neat part with giant monsters… sometimes they tread on each other’s toes, and they stop eating us to tear each other apart, and we get to sit back and watch.
AI industry, fucking around: Woo! This is awesome! No consequences ever! Just endless profits!
AII, finding out: this fucking sucks! So unfair!
“copyright class action could ruin AI industry”
Oh nooooooo… How do I sign on to this lawsuit?
I would not hold my breath. There is a high likelihood that the courts will side with the AI companies, because the American courts are compromised.
Or, according to the person who read the article, because the case is not what the title suggests it is?

Alsup allegedly failed to conduct a “rigorous analysis” of the potential class and instead based his judgment on his “50 years” of experience, Anthropic said.
The judge didn’t apply critical thinking but instead just did whatever was the most likely decision based on all the information it was fed in its training? That must be very inconvenient and it would be a shame if we had companies advertising exactly that as a world changing technology.
“Please let us steal” - AI Industry
deleted by creator
Yeah, who the fuck gave all these rich assholes the right to make money on others’ work?
I’d like to know how these assholes get away with even training on GPL licensed code.
deleted by creator
Oh no…I’m super bummed. 🤭