Today, 8 February 2022, is Safer Internet Day 2022 – which this year has the theme of ‘All fun and games? Exploring respect and relationships online’. With the draft Online Safety Bill (the Bill) progressing through Parliament and attracting close scrutiny from key stakeholders, the public and political parties, Safer Internet Day takes on new importance this year – especially given its theme of online relationships.
The Bill seeks to promote online safety in the UK through online content regulation and to impose duties on internet service providers, with the objective of better protecting the vulnerable. If enacted, the Bill has the potential to dramatically change the face of the internet, the experience of users, the duties of service providers and the rights of those who have fallen victim to internet-based offences.
With this potential in mind, in 2021 the Queen Mary Legal Advice Centre (QMLAC) and Mishcon de Reya LLP established a Policy Clinic to consider the complexity and breadth of the Bill specifically through the lens of victims of image-based sexual abuse (colloquially known as "revenge porn"), with the aim of analysing how the Bill might assist these victims and providing recommendations for its improvement.
Analysing the Bill specifically through the lens of these victims stems from the years of work both QMLAC and Mishcon de Reya LLP have undertaken to provide legal advice and education to the community as part of the SPITE (Sharing and Publishing Images To Embarrass) project. Both organisations have collaborated on the project since the introduction of the Criminal Justice and Courts Act in 2015. The SPITE project provides free legal advice to victims and undertakes community-based public legal education in secondary schools on image-based sexual abuse. The project and the Policy Clinic's joint expertise is founded on front-line project experience and a wealth of legal knowledge in areas such as criminal law, reputation protection, data protection and civil law.
Today QMLAC and Mishcon de Reya LLP publish a brief note on the findings of the Policy Clinic – summarising weaknesses of the Bill, how the Bill as it stands affects SPITE victims, and the amendments the Policy Clinic proposes to strengthen protections and address the Bill's limitations. Nonetheless, we should state that even in its current form the Bill is a welcome step forward – we are simply of the view that it can be even better.
The Mishcon Policy Clinic team are: Harry Eccles-Williams, Joshua Edwards, Liz Barrett, Sophie Hollander and Saba Tavakoli; the QMLAC Policy Clinic team are: Frances Ridout, Maria Padczuk, Isabel Hope Tucker, Heloise Anne-Sophie Lauret, Xhulia Tepshi, Nikela Allidri and Gerasimos Ioannidis.
To read more about the SPITE project, or if you have been the victim of image-based sexual abuse, follow this link to QMLAC's website.
Summary of findings
The objective of the Online Safety Bill (the Bill) is to promote online safety in the UK. The Bill imposes duties of care on regulated services and grants OFCOM new responsibilities to protect people online, whilst balancing the right to freedom of expression against the right to privacy.
We note, and largely agree with, the broader concerns with the Bill raised by interest groups, politicians and the public, including that: it does not go far enough to protect the individual user; it places too much reliance on organisations to monitor and remove offending content; it is too broad and leaves much open to interpretation; and it is not clear how effective the enforcement regime will be.
The purpose of our review, however, was to look at the Bill from the perspective of SPITE victims, and to propose modest amendments that would specifically benefit them. We are of the view that the amendments below would do this, whilst not having a material impact on freedom of speech or an unmanageable impact on business.
Definitions
- The terms "illegal content" in section 412, "content harmful to children" in section 45 and "content harmful to adults" in section 46, do not explicitly include image-based sexual abuse or violence against women and girls ("VAWG"). The Online Harms Consultation Response indicates that these terms may include violent and/or pornographic content but there is a lack of certainty around whether this is the case.
- Image-based sexual abuse should be explicitly defined and included within the scope of harmful content under the Bill.
- The Bill should explicitly recognise VAWG in all its forms (this has also been recommended by a Committee report published on 24 January 2022).
- Commercial pornography websites should be specifically named within the Bill and subject to a higher level of scrutiny by OFCOM, which should be empowered to issue take-down notices.
- The Bill should make ‘cyber flashing’ a specific criminal offence, as recommended by the Law Commission.
- The tier system for regulated services (which is currently based on the number of users and functionalities of the service provider) should be revised to ensure that harmful and illegal content is caught on more sites. There should be a rebuttable presumption that websites that host pornographic material are 'Category 1' services.
Guidance
- The Bill should provide further guidance under section 106 on which entities are eligible to make a super complaint to OFCOM. A suitable entity needs to be appointed to represent the interests of SPITE victims.
- There should be a duty on OFCOM to assess the risks of harm to particular groups of users, such as SPITE victims, and to assess how such groups may be disproportionately exposed to online harms.
- The guidance on risk assessments under section 62 of the Bill should set out how harm arises from image-based sexual abuse content and how OFCOM plans to regulate service providers that provide a platform for such content.
Online Anonymity
- The Bill should address issues of online anonymity, which is a major concern for SPITE victims. Online anonymity often makes it very difficult to prove who is behind image-based sexual abuse (even where the victim knows that only one person could be responsible).
- There are a number of sensible recommendations in this regard. The recommendations in the 10 December 2021 Joint Committee report are a good starting point. These include the proposal by Siobhan Baillie MP that social media platforms: "First, give all social media users a choice: the right to verify their identity. Secondly, give social media users the option to follow or be followed only by verified accounts. Thirdly, make it clear which accounts are verified."
Warning notices and collecting information
- The Bill should require OFCOM to collect information linked to image-based sexual abuse (e.g. by consulting QMLAC and drawing on relevant data derived from its clients).
- The skilled persons’ report under section 74 of the Bill should be prepared in consultation with a professional with relevant experience in developing algorithms to identify and remove image-based sexual abuse content.
- The provisions relating to OFCOM's power to issue a technology warning notice for content (sections 63 to 69 of the Bill) should be amended to include specific reference to image-based sexual abuse content. OFCOM should require regulated services to use accredited technology to identify and remove this content.
Media literacy
- The duty to promote media literacy under section 103 of the Bill could be expanded to outline the specific goals it aims to achieve, e.g. the removal of terrorist content, child sexual exploitation and abuse content, and image-based sexual abuse content.
- OFCOM should be required to work with schools to promote media literacy from an early age, with specific reference, where appropriate, to image-based sexual abuse content (e.g. by way of initiatives such as SPITE for Schools, led by Mishcon de Reya and QMLAC).
- The Bill should be expanded to list the type, frequency and outreach objectives (e.g. number and demographic of people targeted) of media literacy campaigns that OFCOM must implement. Guidance about the evaluation of educational initiatives should also be included.