
Bruce Willis AI and the problem with deepfakes

A deepfake of Bruce Willis is advertising Russian mobile phones. Many great artistic and metaphysical questions are raised by this performance. However, this article is going to look at the intellectual property law implications, from a UK perspective.

31 October 2022


I think there are two particularly interesting issues: the rights of the performer, and the copyright implications.

Performers' rights

Performers' rights are dealt with quite differently in different countries (and in the US, there are different rules in different states).

In the UK, the common law tort of “passing off” encompasses “false endorsement”. There are two elements to this: first, the claimant must have a significant reputation or goodwill. Second, the defendant's actions must give rise to a false message, understood by a not insignificant section of the claimant's market, that the defendant's goods have been endorsed, recommended or approved of by the claimant.

There are two well known false endorsement cases in English law. In the first, a photograph of Eddie Irvine, a well known racing driver, was edited to make it look as though he was holding a “Talk Radio” portable radio; this was held to be false endorsement. In the other case, Rihanna's photograph appeared on some Topshop t-shirts. In the particular circumstances of the case, people were likely to buy the t-shirts believing that Rihanna had approved or authorised them – so again, this was false endorsement.

Using a deepfake of a well known actor in an advert would usually be passing off. Here, I understand that Bruce Willis gave permission to MegaFon, the Russian phone company. Of course, it may not be false endorsement if the original actor isn’t well known.

Things also get more complex if it is harder to identify a person. A deepfake of someone’s voice could be false endorsement, if that voice was recognisable; but voices can be less recognisable than faces.

There are relevant rights beyond false endorsement. Performers of sound recordings currently have a “moral right” to be identified and can object to “derogatory treatment” of their performances. An international treaty, the “Beijing Treaty”, gives audiovisual performers a right to be identified and a right to object to detrimental modification of their performances. This has not yet been implemented in the UK; the UK Intellectual Property Office issued a call for views and is considering the responses.

In practice, however, most contracts require performers to waive their moral rights, so implementation may not make a significant difference.

Performers' rights, and copyright law more broadly, may need to evolve to meet the challenges that AI will raise. Equity, the actors' union, is particularly concerned about the use of human performances to train AI and to generate AI performances, and is campaigning for stronger rights for performers. The campaign is called “Stop AI stealing the show”.

Copyright issues

When we say AI, we usually mean machine learning, and that requires data to train on. That data is often a copyright work – a dataset of historical weather data can be used to train an AI, but so can a novel, a film or a sound recording. News reports say that the AI used to generate the deepfake of Bruce Willis was trained using Die Hard and The Fifth Element. Those films are copyright works.

When an AI reads or learns from something, it does so by copying it into memory. So an AI watching Die Hard is copying Die Hard. Copying is, of course, one of the acts restricted by copyright.
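To make the copying point concrete, here is a minimal, illustrative sketch in Python. The file layout, paths and function name are hypothetical, and this is not Deepcake's actual pipeline; it simply shows that loading film frames as training data necessarily reproduces those frames in memory.

# Minimal, illustrative sketch (assumed file layout; not Deepcake's actual pipeline).
# Training a face-swap model on film footage starts by copying frames into memory.
from pathlib import Path
import numpy as np
from PIL import Image

def load_training_frames(frame_dir: str) -> np.ndarray:
    """Read extracted frames into RAM as pixel arrays - each read is a copy of the work."""
    frames = []
    for path in sorted(Path(frame_dir).glob("*.png")):
        with Image.open(path) as img:
            frames.append(np.asarray(img.convert("RGB")))  # pixels copied into memory
    return np.stack(frames)

# Hypothetical usage: the resulting array is the dataset the model is trained on.
# dataset = load_training_frames("willis_frames/")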

However, in some jurisdictions AIs are being given statutory permission to copy. There are text and data mining exceptions to copyright in Japan, Singapore and the EU, and such copying may be fair use in the US. Each country takes a slightly different approach.

The UK government has recently published its response to a consultation about AI which covered, in part, data mining. In the response, the government announced that it will introduce a text and data mining exception to copyright and database right. “Data mining” is given a very broad definition – “using computational techniques to analyse large amounts of information to identify patterns, trends and other useful information”. Deepcake, the Russian company which created the deepfake of Bruce Willis, reportedly used 34,000 images of Bruce Willis to create an image of him. That is a large amount of information, and they used computational techniques to identify useful information, i.e. what Bruce Willis looks like.

The exception will apply “for any purpose”, and the government has said it will not be possible to contract out of it (which is possible in some jurisdictions, such as the EU). Rightsholders will still be able to charge for access to their works via platforms, but it is not yet clear how this will work in practice. If I have a Disney+ subscription, will that allow me to train an AI to create animation in a Disney style? If I have a Spotify subscription, does that allow me to data mine the most popular albums and train an AI to create similar music? It will be interesting to see the draft legislation.


This article was originally published in Artificial Lawyer on 21 October 2022 and was written by Giles Parsons.


Contact


Giles Parsons

Partner

giles.parsons@brownejacobson.com

+44 (0)20 7337 1505
