Another copyright infringement action has been filed against Microsoft and OpenAI. Nonfiction authors Nicholas Basbanes and Nicholas Gage filed a lawsuit against the two companies, claiming the defendants stole their copyrighted works to use in developing the companies' AI systems.
A week ago, The New York Times filed a similar copyright infringement case against Microsoft and OpenAI, claiming the companies used the newspaper's material to train AI chatbots. The latest action was filed on Friday, January 5, in a Manhattan federal court, and comes after OpenAI acknowledged that plaintiffs and other copyright holders ought to be compensated for the use of their work. The NYT case seeks "billions of dollars" in damages, while the Basbanes and Gage complaint seeks up to $150,000 for each instance of copyright infringement.
Responding to the NYT lawsuit, OpenAI said, "We respect the rights of content creators and owners and are committed to working with them to ensure they benefit from AI technology and new revenue models."
In September, a group of established New York writers led by the Authors Guild, including George R.R. Martin, John Grisham, Jodi Picoult, George Saunders, and Jonathan Franzen, joined a proposed class-action lawsuit against OpenAI. Another author, Julian Sancton, is suing Microsoft and OpenAI for allegedly exploiting his nonfiction writing without permission to train AI models. A separate class-action complaint has been filed in California against the maker of the popular chatbot ChatGPT for allegedly scraping private user data from the internet; Clarkson Law Firm commenced those proceedings on June 28, 2023, in the United States District Court for the Northern District of California.
According to the lawsuit, OpenAI used data gathered from millions of blog posts, Wikipedia articles, family recipes, and social media comments to train ChatGPT without the users’ permission.