On Friday 13 December 2024, the House of Lords is due to debate the Non-Consensual Sexually Explicit Images and Videos (Offences) Bill, sponsored by Baroness Owen of Alderley Edge, at its second reading.
The Bill proposes to amend the Sexual Offences Act 2003 (the "Act") and introduce new provisions making it an offence to take, create, or solicit the taking or creating of sexually explicit images of another person without consent. Among other things, this would criminalise the creation of so-called deepfakes and the solicitation of another person to create them.
Speaking on Times Radio this morning, Emma Woollcott, Head of Reputation Protection and Crisis Management at Mishcon de Reya, said:
“98% of deepfakes are pornographic, and most distressingly, 99% of those are of women and girls. And it's not just famous women. The campaign group, My Image, My Choice found that the most targeted group of people are ordinary women and girls.
“It's happening and it's happening at scale, and the impact is violating and deeply disturbing. The images aren't real, but they really look like they are, and the people that have been targeted have said that they felt utterly dehumanised, that they’ve been haunted and made physically sick by what they've seen.
“There is a lag between what the law currently protects and what technology enables people to do, and the Bill is very carefully drafted to try to be futureproof. The definitions are broad enough to catch behaviours that we foresee developing.
“The law has to be agile where loopholes exist, and this Bill seeks to close that gap in the current law which criminalises sharing intimate images without consent but doesn’t criminalise the creation and solicitation of deepfakes.”
Mishcon de Reya is a leading law firm with extensive experience advising victim-survivors of intimate image abuse and people subjected to online abuse, privacy intrusions, harassment, and disinformation campaigns. Alongside its client work, the firm has for nearly ten years worked with the Queen Mary University Legal Advice Centre on its pro bono SPITE project, supporting victims of image-based sexual abuse, and has supported its SPITE for Schools education work.
Emma Woollcott commented:
"We welcome this Bill which, if passed into law, would provide greater protections to victim-survivors of sexually explicit deepfakes and other sexual-based image abuse. Technology, including generative-AI, has rapidly outpaced legislation meaning that urgent action is needed to address this growing form of abuse which disproportionately impacts women and girls.
"In several respects the Bill seeks to futureproof the legislation by pre-empting technological advances and potential loopholes which might be exploited, to help combat the alarming rates of misogyny and violence against women and girls in the United Kingdom."