OpenAI left Scarlett red in the face? Pushing for personality protections
Writing in World Intellectual Property Review, Associate Andrew Wilson-Bushell and Trainee Solicitor Catherine Clover argue that, while parliamentary scrutiny of AI in music is welcome, legislation must strike the right tone.
The UK All-Party Parliamentary Group on Music (APPG on Music) published its report, “Artificial Intelligence and the Music Industry – Master or Servant?”, on 1 May 2024, calling for the government to produce legislation regulating AI in order to protect the UK music industry. The report echoes concerns expressed by experts and artists across the music and creative industries, and beyond, and is eerily pertinent to the reports in late May that Scarlett Johansson hit back against OpenAI’s integration of “Sky”, an AI voice that Johansson claims mimics her own.
The APPG on Music’s report made eight key recommendations to the government relating to AI and included findings from a poll by UK Music. In the UK (unlike the US, where Johansson’s complaints are focussed) there is no codified “personality right” or “image right” per se. A person’s likeness or voice is currently protected by an interweaving patchwork of rights: copyright in relevant materials, together with passing off, protections against false endorsement, and data protection rights.
The report calls for the introduction of a specific UK personality right that would protect creators and artists from deepfakes, misappropriation and false endorsement. This was proposed prior to Johansson’s complaint, in light of the circulation of AI-generated explicit photographs of stars such as Taylor Swift and Dua Lipa. The sharing of intimate deepfakes was explicitly made illegal under the Online Safety Act 2023, and the government has since proposed further criminalising the creation of intimate deepfakes. Yet, for many, this does not go far enough. According to the poll within the APPG on Music’s report, 83% of UK adults agree that a music artist’s creative “personality” should be protected by law from being copied by AI.
Many artists have voiced concerns about protecting their voices and likenesses from AI, including FKA Twigs, who announced that she had used AI to create a deepfake of herself, trained on her voice and personality. Whilst this might sound counterintuitive, FKA Twigs recognised the power of AI as a useful tool if harnessed in the right way. By creating her own deepfake, she considers that she has gained control over what she consents to, and that it will free up time to spend on her art while her AI takes over the promotional side. Speaking to the US Senate Judiciary Subcommittee, FKA Twigs is reported as saying: "What is not acceptable is when my art and my identity can simply be taken by a third party and exploited falsely for their own gain without my consent due to the absence of appropriate legislative control".
Reports indicate that Johansson has claimed that OpenAI approached her about licensing her voice for Sky, but that she turned them down. OpenAI’s subsequent release of Sky (despite the reported breakdown in licensing discussions) in a form similar to Johansson’s voice, together with Sam Altman posting on Twitter/X simply stating “her” (a possible reference to the film Her, in which Johansson voices an AI assistant), further fuels the feeling within the creative industries that technology companies need reining in.
The unlicensed use of an artist’s image is not a new issue in the UK. Notably, when Rihanna battled Topshop over the use of her image on a T-shirt, she won a claim for passing off, showing that Topshop’s misrepresentation had caused damage to her goodwill. The judgment made clear, however, that such protection is not guaranteed: it is up to the artist to prove the facts in each case. The use of an image of a person on a garment is not, of itself, passing off, particularly where the item is overtly held out as not being an officially endorsed product (for example, by being marketed as “unofficial”), which makes it hard to establish any misrepresentation.
Not every use of a person’s face would necessarily be problematic, although, in the case of music artists, their identity is often so intrinsically linked to their brand that it arguably warrants protection. Any legislative step to introduce a “right to personality” under English law would be a sea change for intellectual property rights in this country, with repercussions across the music industry. For example, contracts underpinning the usage of music (including between artists and their labels) would need to be scrutinised to see where these new rights fall.
In addition to the call for a specific personality right, the APPG on Music called for greater transparency around AI, urging an international taskforce on AI and the production of a UK AI Act. Just as copyright works commonly carry a copyright notice, perhaps in future AI-generated content will also be labelled. This would be broadly in line with the recently approved EU AI Act, which will, when the relevant provisions come into force, impose transparency obligations on generative AI platforms.
The EU AI Act addresses, amongst other things, the use of copyrighted materials in training AI, as well as requiring deepfakes to be labelled. Those who breach the EU AI Act could face fines of up to EUR 35 million or 7% of their global annual turnover, whichever is higher. In theory, the UK government could now build on the work done in the EU to create a UK AI Act that provides increased protection for artists. Greater harmonisation with the EU would benefit the music industry as a whole, giving tech platforms similar regimes to comply with and helping artists better understand their rights on a global level.
Yet time will tell how the government responds. The creative industries have seen some success in driving change, for example by successfully opposing an expansion of the currently limited exception for text and data mining under UK copyright law. However, the government has generally taken a “pro-innovation” approach to AI and technology regulation, and the allure of big tech moving to England’s pleasant pastures may yet prove too great.
Ultimately, a collaborative approach between industry professionals and the government to safeguard the future of artists and their work, and an openness to licensing models under existing laws from both sides of the debate, will hopefully win the day.
Andrew and Catherine's article was published in World Intellectual Property Review, 30 May 2024.