An AI company based in London has prevailed in a landmark high court case over the legality of AI models trained on vast quantities of copyrighted material without permission.
The AI company, whose directors include the Academy Award-winning filmmaker James Cameron, successfully defended allegations from Getty Images that it had violated the international photo agency's copyright.
Legal experts consider the decision a blow to copyright owners' exclusive right to profit from their creative work, with one senior lawyer cautioning that it shows "Britain's secondary IP regime is not sufficiently strong to safeguard its artists."
Court documents showed that Getty's images had indeed been used to train Stability's system, which allows users to generate visual content from written prompts. The AI firm was, however, found to have infringed Getty's trademarks in certain instances.
The judge, Mrs Justice Joanna Smith, said that deciding where to strike the balance between the interests of the creative industries and the artificial intelligence sector was "of very real societal concern."
Getty Images had originally sued Stability AI for infringement of its IP, alleging the AI firm was "completely unconcerned about what they input into the development material" and had collected and copied millions of its photographs.
Nevertheless, the agency had to withdraw its original copyright claim because there was no evidence that the training had taken place in the United Kingdom. Instead, it pressed its claim that the AI firm was still using reproductions of its images within its platform, which it called the "core" of its operations.
Highlighting the complexity of AI copyright disputes, the agency's central contention was that Stability's image-generation model, known as Stable Diffusion, was itself an infringing reproduction because its creation would have amounted to copyright infringement had it taken place in the UK.
The judge ruled: "An AI model such as Stable Diffusion which does not store or replicate any copyright works (and has not done so) is not a 'violating reproduction'." She declined to rule on the passing off allegation and found in Getty's favour on some of its trademark-infringement arguments relating to watermarks.
In a statement, Getty Images said: "We remain deeply concerned that even financially capable companies such as Getty Images face substantial difficulties in safeguarding their creative works given the absence of transparency requirements. Our company committed substantial sums to reach this point, with only one provider that we must continue to pursue in another venue."
"We encourage authorities, including the United Kingdom, to implement more robust disclosure regulations, which are essential to prevent costly legal battles and to allow creators to defend their rights."
Christian Dowell, for Stability AI, said: "Our company is satisfied with the court's ruling on the remaining allegations in this case. Getty's choice to voluntarily dismiss most of its IP claims at the close of proceedings left a limited number of claims before the court, and this final decision ultimately resolves the copyright concerns that were the core issue. Our company is grateful for the time and effort the court dedicated to resolving the important issues in this case."
The ruling comes amid an ongoing debate over how the government should legislate on copyright and AI, with creators and writers, including several prominent figures, advocating for stronger safeguards. At the same time, technology companies are calling for broad access to copyrighted content so they can develop the most advanced and efficient generative AI systems.
The government is currently consulting on copyright and AI and has said: "Lack of clarity over how our copyright system operates is holding back development for our artificial intelligence and creative industries. That cannot continue."
Industry specialists monitoring the issue suggest the government is examining whether to introduce a "content analysis exception" into British copyright legislation, which would allow copyrighted works to be used to train machine-learning systems in the UK unless the owner opts their content out of such use.