Last week saw pornographic deepfake images of mega-star Taylor Swift circulate on various social media platforms, including X and Meta. Not only were the images made without the singer's consent, but they also reportedly depicted her being subjected to non-consensual sexual acts.

According to NBC News, the deepfakes of Swift on X amassed over 27 million views and more than 260,000 likes in 19 hours before the account that initially posted the images was suspended. X has since blocked searches for 'Taylor Swift' on the site. Joe Benarroch, head of business operations at X, described the measure as "temporary", adding that it was done with "an abundance of caution as we prioritise safety on this issue."

What is the law on deepfakes in the UK?

According to Michael Drury, Of Counsel at BCL Solicitors, "There is no direct law prohibiting the sharing of 'deep fakes' unless those images are pornographic. In that case, the recently created offences under the Online Safety Act 2023 will mean that a crime has been committed as long as the person whose image is shared (real or fake) has not consented and the person sharing does not believe they have consented.

"There is no direct civil wrong allowing the person said to be shown in the image to sue. For those in the same position as Taylor Swift, the obvious solution is to rely upon the copyright of one's image (if copyrighted), a breach of privacy or data protection laws; harassment (as a civil wrong), perhaps defamation, or criminal law more generally."

This article was first published by Glamour on 31 January 2024.
