Recently, Meta appeared in a US federal court to defend one of the most closely watched copyright challenges of the AI era. The lawsuit is being brought by a group of prominent US authors including Ta-Nehisi Coates and Richard Kadrey. The plaintiffs allege that Meta unlawfully used millions of copyrighted books, downloaded from the shadow library LibGen, to train its generative AI model, Llama.
While the case is being fought in the US, the legal and commercial ramifications will be felt worldwide, including here in Australia. The court's ultimate ruling will influence how courts, regulators and creators around the world approach the question: can artificial intelligence be trained on copyrighted works without permission?
For Australian businesses developing or deploying AI tools, and for creators whose works may be swept into these datasets, the issues raised in this case warrant close attention, along with consideration of how they might be approached locally.
The case against Meta
At the heart of the case is Meta's alleged use of LibGen, a well-known online source of books, academic works and comics that are often uploaded without authorisation from the author or copyright owner. According to court filings, Meta's internal communications show awareness of the legal risks.
It is reported that Meta initially explored licensing agreements with publishers. The plaintiffs allege that those discussions were ultimately abandoned because LibGen offered access to millions of books without the associated costs. They argue this has deprived authors and copyright owners of compensation and control.
Meta, however, argues that the use of this material falls under the US doctrine of "fair use", a flexible defence that permits unlicensed use of copyrighted content if it is considered transformative, such as being repurposed for a new and different purpose like AI training. Meta contends that there is no market for licensing books for this purpose, and therefore no economic harm to authors.
Fair use vs fair dealing: why this matters for Australia
While Meta is relying on the US fair use doctrine, a broad and open-ended test, Australia has a stricter equivalent.
Australian copyright law instead operates under the narrower concept of "fair dealing", contained in the Copyright Act 1968 (Cth). This allows limited exceptions for use of copyright materials without a licence. These uses are limited to criticism, review, access by a person with a disability, judicial proceedings or professional advice, news reporting, parody, satire, and research or study. Importantly, fair dealing does not include a general exemption for transformative uses, meaning that training AI on copyrighted material without permission would be far more difficult to defend in Australia.
Accordingly, while Meta's actions may or may not be legally permissible in the United States, similar conduct here would likely constitute infringement unless a specific exemption applies or a licence is obtained.
Why Australian developers should pay attention
Many AI models used or adapted in Australia, particularly those that are open source, may have been trained on datasets that include infringing content. Even if the training occurred offshore, deploying those models locally could result in legal exposure under Australian law. The absence of a fair use defence here means that local users may be held accountable for copyright breaches embedded in third-party systems.
Legal due diligence is therefore becoming an important part of AI product development and deployment. Businesses should understand how models were trained, what datasets were used, and whether any risk of copyright infringement arises. This is particularly critical for organisations working in regulated industries.
Looking ahead
Regardless of the outcome, this case represents a watershed moment for copyright law globally. It shines a spotlight on the legal uncertainty surrounding AI development and the pressing need for legislative clarity.
The Australian Government has started to consider these issues, including as part of the Voluntary AI Safety Standard released in the second half of 2024, which gives practical guidance to Australian organisations on how to safely and responsibly use and innovate with artificial intelligence. Of particular relevance to this scenario is Guardrail 3 ("Protect AI systems, and implement data governance measures to manage data quality and provenance"). It requires organisations to understand and document data sources, put in place processes to manage data, and document the data used to train and test AI models or systems, presumably with consideration given to the rights and obligations granted to copyright owners under the Copyright Act.
Looking forward, Australian policymakers may eventually need to consider whether the current fair dealing regime remains fit for purpose in an AI-driven future, or whether reforms like fair use should be explored.
For further information please contact:
James Skelton, Partner
Phone: +61 2 9233 5544
Email: jas@swaab.com.au
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.