Anthropic Agrees to Pay Authors at Least $1.5 Billion in AI Copyright Settlement


Anthropic has agreed to pay at least $1.5 billion to settle a copyright infringement case brought by a group of book authors, roughly $3,000 per work. In a court filing on Friday, the plaintiffs described the settlement terms as a "critical victory" that spared the class the considerable risks of going to trial.

It is the first class-action settlement of its kind in the United States, and its outcome will shape the ongoing legal debate over AI and intellectual property among regulators and the creative industries. Under the settlement agreement, the class covers about 500,000 works, but that number may grow as the list of pirated materials is finalized. For each additional work, the artificial intelligence company will pay an additional $3,000. The plaintiffs plan to submit the final list of works to the court by October.

"This breakthrough settlement surpasses any other known copyright recovery," said counsel for the plaintiffs at Susman Godfrey LLP.

Anthropic does not admit any wrongdoing or liability. "Today's settlement, if approved, will resolve the plaintiffs' remaining legacy claims. We remain committed to developing safe AI systems that help people and organizations extend their capabilities, advance scientific discovery, and solve complex problems," Anthropic Deputy General Counsel Aparna Sridhar said in a statement.

The case was originally filed in the US District Court for the Northern District of California amid a wave of copyright lawsuits against technology companies over the use of copyrighted works to train artificial intelligence programs. Authors Andrea Bartz, Kirk Wallace Johnson, and Charles Graeber alleged that Anthropic trained its AI models on their copyrighted works without permission.

This June, Senior District Judge William Alsup ruled that Anthropic's AI training was protected by the "fair use" doctrine, which allows unauthorized use of copyrighted work under certain conditions. It was a win for the technology company, but it came with a major caveat: in assembling training material for its AI tools, Anthropic had relied on a corpus of pirated books drawn from so-called "shadow libraries," including the notorious site LibGen, and Alsup determined that the authors could still bring a class action against Anthropic over the pirating of their work. (Anthropic maintained that it did not actually train its products on the pirated works, preferring instead to buy copies of the books.)

"Anthropic downloaded over seven million pirated copies of books, paid nothing, and kept these pirated copies in its library even after deciding it would not use them to train its AI (at all or ever again). Authors argue Anthropic should have paid for these pirated library copies," Alsup wrote in his ruling.
