24 March 2023

Trying to Keep Up: Art, Museums, A.I., and NFTs

By Amalyah Keshet

“You can say anything is a work of art… but if you burn a Banksy and then want money for it [selling it as an NFT], that ranks pretty low on the art scale for me.”  

Art has always reveled in being controversial, and so far 2023 promises plenty of gripping technological and legal controversy. Two subjects that won’t go away – A.I.-generated art and, still, NFTs – are everywhere, and there are even two copyright cases in the news involving handbags. The disputes turn on image “scraping” and licensing, or the lack of it, and they may indicate the direction copyright discussions will take this year: towards an examination of originality – and humanity – as a foundational requirement of copyright protection.

Getty Images sues Stability AI

A few weeks ago Getty Images began legal proceedings against Stability AI in federal court in Delaware, claiming that Stability AI infringed intellectual property rights, including copyright, in content Getty Images owns or represents. The company claims that Stability AI unlawfully copied and processed millions of copyright-protected images, along with their associated captions and metadata, for commercial benefit and at the expense of the content creators. Getty’s official statement on the issue is worded with precision:

“Getty Images believes artificial intelligence has the potential to stimulate creative endeavors. Accordingly, Getty Images provided licenses to leading technology innovators for purposes related to training artificial intelligence systems in a manner that respects personal and intellectual property rights. Stability AI did not seek any such license from Getty Images and instead, we believe, chose to ignore viable licensing options and long-standing legal protections in pursuit of their stand-alone commercial interests.”

In other words, since the option to license was available, Stability AI would have to prove that “scraping” images from the Internet – including those bearing a Getty Images watermark – is permitted as data mining. “Stability AI may argue that the temporary copy exception in s28A of the CDPA permits the reproduction of Getty’s images to train the model.” However, such exceptions generally apply to non-commercial purposes, such as search. Getty Images claims Stability AI used the images to train its artificial intelligence image generator, Stable Diffusion, and to “build a competing business” through its revenue-generating interface, DreamStudio. Stability AI also provides open-source versions of Stable Diffusion to third-party developers, who can access, use, and build on them to develop their own image-generating models.

The question is whether these AI image outputs are protected under copyright law and, if so, who owns the copyright. Section 9(3) of the UK Copyright, Designs and Patents Act (CDPA) provides that in the case of a computer-generated artistic work, the author is taken to be the person by whom the arrangements necessary for the creation of the work are undertaken. With limited case law citing this provision, there is some ambiguity and academic debate over the ownership of computer-generated works under English law. Under U.S. law (the images were used in the U.S.), copyright protection has been denied to non-human creators: the U.S. Copyright Office has refused to register AI-generated works, requiring human authorship as a prerequisite for protection.

In this instance, however, the user generating the images was in Ireland and the online model generating them was hosted in the U.S. While the Irish Copyright and Related Rights Act includes a similar provision on the authorship of computer-generated works, Irish academics have noted that this provision may be inconsistent with the EU acquis. Adding further complexity, under the Stable Diffusion license Stability AI claims no rights in the model’s outputs as long as the user has complied with use restrictions designed to prevent the dissemination of harmful materials.

A class-action lawsuit was also brought earlier this year by three artists – Kelly McKernan, Sarah Andersen, and Karla Ortiz – against three AI image generators. The artists said that they had not consented to “having their copyrighted works of art included in a database used by the image generators, they had not been compensated, and their influence was not credited when AI images were made using their works.” All three generators rely on LAION-5B, a publicly available dataset compiled by a nonprofit that indexes more than five billion images from the Internet, inevitably including the copyright-protected work of an untold number of artists.
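
For readers curious what “indexing” means in practice: LAION-5B is distributed not as a collection of image files but as metadata tables pairing each image’s URL with its caption; training pipelines then fetch the pictures from those URLs. Below is a minimal, illustrative Python sketch of searching such a metadata shard for an artist’s name – the filename, column names, and artist placeholder are assumptions for the example, not the dataset’s authoritative layout.

    # Illustrative sketch: inspect a LAION-style metadata shard (URLs + captions).
    # The filename and column names ("URL", "TEXT") are assumptions for this example.
    import pandas as pd

    shard = pd.read_parquet("laion_shard.parquet")   # metadata only -- no image files
    print(len(shard), "image URL / caption pairs in this shard")

    # Captions mentioning a particular artist's name show how that artist's
    # work can end up in the training data without any licensing step.
    artist = "artist name here"   # placeholder, not a real query
    hits = shard[shard["TEXT"].str.contains(artist, case=False, na=False)]
    print(hits[["URL", "TEXT"]].head())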

The question is whether this indexing, prompting, and making available for use constitutes copyright infringement – does it facilitate the making of unauthorized derivative works? McKernan discovered that a website called Metaverse Post had suggested “Kelly McKernan” as a term to feed an A.I. generator in order to create “Lord of the Rings-style art.” Hundreds of other artists were listed according to what their works evoked: anime, modernism, Star Wars. On the Discord server through which the A.I. generator Midjourney operates, McKernan discovered that users had included her name more than twelve thousand times in public prompts. “We’re not litigating image by image, we’re litigating the whole technique behind the system,” said the artists’ lawyers.
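
At generation time, an artist’s name functions simply as text that conditions the model – no licensing check or attribution step is involved. Below is a minimal sketch using the open-source diffusers library; the checkpoint name and prompt are illustrative assumptions, and this is not a description of Midjourney’s actual pipeline.

    # Illustrative sketch of text-to-image prompting with an open-source
    # Stable Diffusion checkpoint; the checkpoint and prompt are examples only.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")

    # The artist's name is just another string in the prompt; the model applies
    # whatever statistical associations it absorbed from its training images.
    prompt = "a misty forest battle scene, in the style of <artist name>"
    image = pipe(prompt).images[0]
    image.save("output.png")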

For similar reasons, the Authors Guild, the leading writer-advocacy organization in the U.S., has “issued an update to its model trade book contract and literary translation model contract with a new clause that prohibits publishers from using or sublicensing books under contract to train ‘artificial intelligence’ technologies.” 

The Handbag Cases

Moving on to the world of handbags, the Joan Mitchell Foundation recently sued Louis Vuitton for unauthorized use of Mitchell’s artwork as the background for ads featuring LV bags. Apparently, permission was sought and denied, but the company went ahead anyway – indicating that this could be a textbook infringement claim.

In contrast, a lawsuit brought by Hermès could be precedent-setting for future digital assets in the luxury fashion context. Last year Hermès sued artist Mason Rothschild, alleging he infringed the trademarks of its famous Birkin bag by creating and selling virtual interpretations called “MetaBirkins” NFTs. Rothschild argued that “MetaBirkins” could be viewed as the title of an art project, not a trademark, and therefore was not infringing. “Rothschild’s response asks what ownership means in the metaverse and whether NFTs can qualify as artistic expression, two questions that could impact brands eager to protect their intellectual property in web3.” A thorough look at the case can be found here.

In fact, art historian Blake Gopnik, amongst other prominent experts, offered a defense of Rothschild’s claim, comparing the MetaBirkins to works by Andy Warhol and Damien Hirst. 

No less a legal authority than Harvard Law School Professor Rebecca Tushnet, representing Rothschild, argued that MetaBirkins is the title of a digital art project commenting on the relationship between consumerism and the value of art, meaning the NFTs should be protected by the First Amendment of the U.S. Constitution. According to Tushnet, the MetaBirkins NFTs are also protected under case law: Rogers v. Grimaldi, a 1989 U.S. ruling that established that use of a trademark is shielded from infringement claims if the use is artistic expression that doesn’t explicitly mislead consumers. Digital bags shaped like Birkin bags and called MetaBirkins, however, would seem misleading and clearly aimed at consumers; the jury in New York agreed, awarding Hermès $133,000 in damages for trademark infringement, dilution, and cybersquatting.

To complicate matters further, NFTs cannot be deleted from the blockchain. Even with Hermès’s victory, “there are limited options for the brand to get rid of Rothschild’s MetaBirkins. The closest is ‘burning,’ which transfers the NFT — its record of sale, not the image itself — to an inaccessible wallet. This would (unintentionally) create a black market NFT, generating another slew of questions — the most important one being, when will this end?”
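
To make clear why “burning” removes nothing, here is a deliberately simplified sketch of an NFT ledger – not actual Ethereum contract code – showing that a burn merely reassigns ownership to an address nobody controls, while the sale history and the off-chain image remain untouched; the token name and addresses are placeholders.

    # Simplified model of an NFT ledger, for illustration only (not real contract code).
    # "Burning" = transferring the token to an address whose key no one holds.
    BURN_ADDRESS = "0x000000000000000000000000000000000000dEaD"

    ledger = []  # append-only transfer log, standing in for the blockchain's record

    def transfer(token_id, sender, recipient):
        ledger.append({"token": token_id, "from": sender, "to": recipient})

    transfer("metabirkin-1", None, "creator")            # mint
    transfer("metabirkin-1", "creator", "collector")     # sale
    transfer("metabirkin-1", "collector", BURN_ADDRESS)  # the "burn"

    # The record of sale survives, and the image (typically stored off-chain at a
    # URL the token points to) is untouched; only control of the token is lost.
    for entry in ledger:
        print(entry)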

Beyond handbags and fashion trademark issues, this case deserves the attention of museums and other institutions contemplating building – and maintaining over time – a collection of blockchain-based art.

More: 

Getty Images has also commenced legal proceedings in the High Court in London, issuing a “letter before action,” the formal notification of impending litigation in the UK.

A good explanation of how AI image software works can be found in this article on Getty Images and Stable Diffusion:  https://ipkitten.blogspot.com/2023/02/guest-post-litigation-commenced-against.html  

Of possible interest to those based in London:  Open Science & AI: A UK Policy Discussion, London, 25 April 2023   
