
Government to criminalise the creation of sexually explicit deepfakes

Posted on 17 January 2025

On 7 January 2025, the Government announced that it will be introducing new legislation to make it a criminal offence to create a sexually explicit deepfake image of another without consent. Sexually explicit deepfakes are non-consensual, AI-generated or altered images, videos or audio, which purport to show someone in an intimate state or engaging in conduct that could be deemed to be sexual. Advances in technology mean that deepfakes are alarmingly lifelike. Real or not, the impact on victim-survivors can be devastating.  

Under the proposals to be set out in the forthcoming Crime and Policing Bill, it will also be a criminal offence to: 

  • Take or record an intimate photograph or film without consent or reasonable belief that consent has been given. 
  • Take or record an intimate photograph or film without consent and with intent to cause alarm, distress or humiliation. 
  • Take or record an intimate photograph or film without consent or reasonable belief that consent has been given, and for the purpose of the sexual gratification of oneself or another. 
  • Install, adapt, prepare or maintain equipment with the intent of enabling the commission of any of the three offences of taking an intimate image without consent. 

Perpetrators could face up to two years in prison. The offences will only apply to images of adults as an equivalent offence already exists in respect of children. 

A stronger legal landscape

The proposed new laws seek to widen protections for victim-survivors of intimate image abuse and will build upon existing provisions introduced by the Online Safety Act 2023, which made it a criminal offence to share intimate images (including deepfakes) of another without consent.  

These reforms are important and very much needed. As yet there is no clear timeline for implementation, but it is hoped the Government will act swiftly. Technology and generative AI have rapidly outpaced current legislation, and urgent action is needed to address this abuse, which overwhelmingly and disproportionately impacts women and girls. Research shows that 98% of deepfakes are sexually explicit, and 99% of those depict women and girls.

Prior to the Government's announcement, a Private Member's Bill introduced in the House of Lords by Baroness Charlotte Owen on 6 September 2024, the Non-Consensual Sexually Explicit Images and Videos (Offences) Bill [HL], sought to strengthen legal protections around image-based sexual abuse. Notably, the Bill also sought to make it a criminal offence to solicit the taking and/or creating of sexually explicit images without consent. This provision looked to address a potential loophole whereby a person could request that intimate images be taken or created by another person in a jurisdiction outside the UK where such conduct is not a crime. 

It remains to be seen whether and how the Government intends to address this issue in its proposed legislation. The Government must also ensure that it future-proofs any legislation by pre-empting technological advances and potential loopholes which might be exploited. 

The SPITE Project 

The Queen Mary Legal Advice Centre (QMLAC) set up and runs the SPITE project (Sharing and Publishing Images to Embarrass), which provides free legal advice to victims of image-based sexual abuse. Mishcon de Reya is proud to have worked with the QMLAC on this important project since its inception in 2015. 

Mishcon de Reya's specialist team of lawyers advises clients who have been victims of digital/online abuse and harassment, as well as publishers and platforms on their complex obligations in moderating online content. 
