Here’s a safe prediction for 2025: there will be even more litigation over the unauthorized use of content to train artificial intelligence models.
In the music industry, we’ve seen major record labels and publishers launch legal action against generative AI startups Suno and Udio, as well as artificial intelligence giants Anthropic and OpenAI.
In the broader media and entertainment field, OpenAI has also been sued by The New York Times and writers including Sarah Silverman. Meanwhile, Getty Images is suing Stability AI for allegedly using its photos without permission, and Dow Jones and the New York Post are suing Perplexity for allegedly copying their work without permission.
The types of content involved in these lawsuits vary, but the accusations are essentially the same: artificial intelligence companies are using copyrighted material without permission to train their systems.
A key argument made by some artificial intelligence companies and their financial backers in response to copyright infringement claims is that using copyrighted content available online to train AI falls under “fair use” in copyright law.
In a scathing op-ed in Fortune this week, Getty Images chief executive Craig Peters dismissed that notion, arguing instead for a nuanced approach to assessing fair use that supports the potential of artificial intelligence to benefit society without harming creative industries, including music.
As Peters noted in his Fortune column, Getty employs more than 1,700 people and represents the work of more than 600,000 journalists and creators around the world.
The company’s third-quarter revenue (for the three months to the end of September) was US$240.5 million, up 4.9% year over year, and it expects fiscal 2024 revenue of between US$934 million and US$943 million.
“Copyright is at the heart of our business and the livelihoods of the people we employ and represent.”
Craig Peters, Getty, in a Fortune magazine op-ed
“Copyright is at the heart of our business and the livelihoods of the people we employ and represent,” Peters wrote.
Peters wrote that he “strongly disagree[s]” with comments made in the past by Microsoft AI chief executive Mustafa Suleyman and others suggesting that “there is no copyright protection for online content.”
Peters added: “This disagreement highlights why we filed lawsuits against Stability AI in the US and UK. We did not grant Stability AI a license to use the millions of images owned and/or represented by Getty Images to train its Stable Diffusion model, which was put into commercial use in August 2022.”
Peters noted: “As the litigation slowly progresses, AI companies make the argument that without the ability to freely scrape training content, there would be no AI, preventing us from harnessing the promise of AI to solve cancer, mitigate global climate change, and eliminate global hunger.”
He added: “Note that the companies investing in and building artificial intelligence spend billions of dollars on talent, GPUs, and the power required to train and run these models, yet somehow compensating content owners is a hurdle too difficult to overcome.”
Peters also argued that fair use “should be applied on a case-by-case basis” and that AI should not be viewed as a single case but as a “broad range of models, capabilities and technologies.”
He asked in the column: “Will curing cancer affect the value of Kevin Bacon’s performances? Obviously not. Will solving climate change affect the value of Billie Eilish’s music? Obviously not.
“Will solving global hunger affect the value of Stephen King’s writing? Again, clearly not. Not only do such uses not undermine the value of their work, but those creators might never challenge a use that serves those goals, even if it is commercial in nature. As CEO of Getty Images, I can say that we would never debate or challenge these applications, and we would wholeheartedly welcome any support we could provide for them.”
Peters further noted that “content generation models” that produce “music, photos and videos based on text or other input,” and that have been trained on content from artists absent their permission, do not have the “potential to improve outcomes for our society.”
He said this use of artificial intelligence was “purely theft from one group for the financial benefit of another.”
“Let’s stop the rhetoric that all unauthorized AI training is legal and that any requirement to respect creators’ rights comes at the expense of AI as a technology.”
Craig Peters, Getty
The Getty CEO also drew a parallel between the evolution of the AI content industry, with licensed players now emerging in the field, and the rise and fall of Napster and other illegal download services, which gave way to licensed streaming platforms such as Spotify.
“Just as the licensed models of Spotify and Apple Music evolved from the infringing original Napster, models for AI developed under license, with business models that reward creators for their contributions, have emerged,” Peters wrote in the Fortune column.
He added: “Like Apple Music and Spotify, they will cost a little more, but they can thrive and be widely adopted if we level the playing field by targeting companies that choose to ‘move fast and break things.’”
Peters’ article concludes by arguing that “there is a fair path that rewards creativity and delivers on the promise of artificial intelligence.”
He added: “Let’s stop the rhetoric that all unlicensed AI training is legal and that any requirement to respect creators’ rights comes at the expense of AI as a technology.”