Canada’s largest news organizations have united in a lawsuit against OpenAI, alleging the AI firm’s use of the companies’ journalism is “illegal.”
OpenAI, like many AI firms, has scoured the internet for content to use in training its AI models. The practice has already caused legal trouble for OpenAI, with The New York Times suing the company in late December over the use of its content.
According to Toronto.com, Canada’s largest media organizations are launching their own suit against OpenAI in a first-of-its-kind collaboration. The group includes the Toronto Star, The Canadian Press, The Globe and Mail, Metroland Media, Postmedia, and CBC.
“Journalism is in the public interest. OpenAI using other companies’ journalism for their own commercial gain is not. It’s illegal,” said a joint statement from the media organizations.
“The defendants have engaged in ongoing, deliberate, and unauthorized misappropriation of the plaintiffs’ valuable news media works. The plaintiffs bring this action to prevent and seek recompense for these unlawful activities,” said a statement of claim by the news organizations.
“To obtain the significant quantities of text data needed to develop their GPT models, OpenAI deliberately ‘scrapes’ (i.e., accesses and copies) content from the News Media Companies’ websites…. It then uses that proprietary content to develop its GPT models, without consent or authorization,” the suit continues.
AI models rely on massive amounts of data for training, but the practice occupies a legal gray area. Many critics have maintained that AI firms are rushing to scoop up as much data as they can and make their AI models as useful and indispensable as possible, so that by the time the law catches up and addresses the gray area, it will be too late to unwind what has been done. What’s more, if the AI models gain enough widespread use, the law could conclude that the ubiquitous nature of AI outweighs copyright concerns.
Only time will tell whether these lawsuits help establish legal precedent before that happens.