With the recent publication of the Online Safety Bill, many businesses whose services include an online element will be concerned to know whether they may be in scope of the proposed online safety regime. If they do fall under the Bill's provisions, they will also wish to know the likely steps that they will be required to take under the proposed regime, and the potential liability they may face in the event of a breach.
The answers to these questions are not immediately apparent, due to the way in which the Bill has been drafted. Leaving to one side the extreme complexity of the framework set out in the Bill, and the likelihood of aspects being clarified during its passage through Parliament, much of the detail will not become clear until after it receives Royal Assent – it is at that point, for example, that Ofcom will be tasked with setting out further detail in codes of practice as to how the regime will operate and the recommended steps that in-scope companies will need to take.
That said, the Government has estimated in its Impact Assessment of the Bill that 25,100 platforms will be in scope of the new regime, which is perhaps a conservative estimate: the Impact Assessment also notes that approximately 180,000 platforms could potentially be in scope. Many of those will therefore, at the very least, need to assess whether the Bill applies to them.
Platforms/Services in scope – overview
The proposed online safety framework will apply to:
- User-to-user services: i.e., an internet service which allows users to generate, upload or share content (user-generated content or UGC) which may be encountered by others on that service. This is a broad definition and will include online marketplaces, dating apps, games with chat functions, forums and social media platforms, including where users can interact via direct messaging.
- Search services: e.g., search engines such as Google, but also any search service which allows users to search multiple websites or databases.
- Any service that publishes pornographic content (i.e., not just hosting user-generated content) which can be accessed by users in the UK (this article will not consider pornography publishers).
Whilst the definition is broad, certain user-to-user and search services are specifically exempted from the online safety regime, including:
- Business-to-customer and business-to-business interactions.
- Certain functionality considered low risk, such as user comments on digital content published by a platform or service (for example, product reviews and 'below the line' comments on articles and blogs), though it should be noted that such comments are often highly abusive.
- Internal business services, such as intranets, customer relationship tools, enterprise cloud storage, productivity tools and enterprise conferencing software.
- Network infrastructure, such as ISPs, VPNs, and business-to-business services.
- Tools used by educational or childcare providers.
- Emails, voice-only calls, and SMS/MMS.
Extra-territorial application
The Bill provides that its provisions also apply to providers of regulated services that are based outside the UK but have links with the UK. A user-to-user or search service will have 'links' with the UK where it has a significant number of users in the UK, or if the UK is a target market. It will also be in scope if it can be used in the UK by individuals and there are reasonable grounds to believe that there is a material risk of significant harm to individuals in the UK.
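Purely as an illustration of its disjunctive structure, the 'links with the UK' test can be sketched as simple decision logic. This is a hypothetical sketch, not the statutory wording: the function and parameter names are invented, and the Bill does not quantify what amounts to a 'significant number' of UK users or a 'material risk' of significant harm.

```python
def has_uk_links(significant_uk_users: bool,
                 uk_is_target_market: bool,
                 usable_in_uk: bool,
                 material_risk_of_significant_harm: bool) -> bool:
    """Hypothetical sketch of the Bill's 'links with the UK' test.

    A service is in scope if ANY of the following limbs is satisfied:
      1. it has a significant number of users in the UK; or
      2. the UK is a target market; or
      3. it can be used in the UK by individuals AND there are reasonable
         grounds to believe there is a material risk of significant harm
         to individuals in the UK.
    """
    return (significant_uk_users
            or uk_is_target_market
            or (usable_in_uk and material_risk_of_significant_harm))
```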
A user-to-user or search service's duties only extend to the design, operation and use of the service in the UK, or to how the service affects users and others in the UK.
The provisions on extra-territorial jurisdiction are again extremely broad and could lead to some international platforms looking to block UK users, in a similar way to that seen following the introduction of the GDPR. Furthermore – as has been the case under the GDPR – those potentially in scope through these extra-territorial provisions may well vigorously resist attempts to assert jurisdiction.
Categories of regulation: thresholds
The Bill sets out multi-layered requirements for the companies in its scope, by defining different categories of regulated services:
- User-to-user services that meet the Category 1 thresholds
- Search services that meet the Category 2A thresholds
- User-to-user services that meet the Category 2B thresholds
- All other in scope businesses
- Pornography publishers that do not host user generated content or enable P2P interaction
The thresholds for Categories 1, 2A and 2B have not yet been determined and will be set out in secondary legislation. They will depend on a platform's number of users, its functionalities, and the resulting risk of harm on the platform. It may take some time for the thresholds to be published in final form – Ofcom must carry out research within six months of Royal Assent to inform the making of the relevant regulations, and could have up to 18 months after Royal Assent to do so in relation to Category 2A and 2B services.
The Government has suggested that Category 1 platforms will be those with the highest risk and the highest reach (Facebook, Twitter, etc.), whilst Category 2A will comprise high-risk, high-reach search services. Category 2B covers services which are still high-risk and high-reach, but which may not necessarily meet the Category 1 thresholds. Based on the current approach, the Government estimates that between 30 and 40 platforms will fall within Categories 1, 2A or 2B.
Differentiated core duties on in-scope companies
| Duty | All UGC services | Category 1 (user-to-user services meeting certain thresholds) | Category 2A (search services meeting certain thresholds) | Category 2B (user-to-user services meeting certain thresholds) |
| --- | --- | --- | --- | --- |
| Risk assessment duty: assess the level of risk on the platform, and keep records of risk assessments. | Yes | Yes | Yes | Yes |
| Illegal content duty: put in place systems and processes to minimise and remove priority illegal content, and to remove non-priority illegal content when identified via user reporting. This could include the use of, for example, proactive technology, alongside other measures relating to the design and operation of the service. | Yes | Yes | Yes | Yes |
| Child safety duty: if the platform is likely to be accessed by children, put in place systems and processes to protect children from harmful content. Regulations will in due course designate 'primary priority content' and 'priority content'. A platform will need to assess whether it is possible for children to access the service and/or whether it is likely to attract a significant number of users who are children (a 'children's access assessment'). A platform can only conclude that it is not possible for a child to access a service where systems and processes, such as effective age verification, are in place which ensure that children are not normally able to access the service (see the sketch after this table). | Yes | Yes | Yes | Yes |
| Legal but harmful duty re adults: address legal but harmful content accessed by adults by enforcing the platform's own terms of service. This has been described as the weakest of the duties, in that relevant providers are merely required to set out certain information in their terms of service relating to how such content will be dealt with. Priority content designated as harmful to adults will be set out in secondary legislation. | No | Yes | No | No |
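Purely by way of illustration, the children's access assessment described in the child safety row above can be sketched as simple decision logic. This is a hypothetical sketch, not the statutory test: the function and parameter names are invented, and key concepts such as 'effective age verification' and a 'significant number' of child users remain to be defined.

```python
def likely_to_be_accessed_by_children(effective_age_verification: bool,
                                      attracts_significant_child_users: bool) -> bool:
    """Hypothetical sketch of the 'children's access assessment'.

    A platform may only conclude that children cannot access the service
    where systems and processes (such as effective age verification)
    ensure that children are not normally able to access it. If access is
    possible, the question becomes whether the service is likely to
    attract a significant number of child users.
    """
    possible_for_children_to_access = not effective_age_verification
    return possible_for_children_to_access and attracts_significant_child_users
```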
Additional differentiated requirements on in-scope companies
| Duty | All UGC services | Category 1 | Category 2A | Category 2B |
| --- | --- | --- | --- | --- |
| User reporting: provide mechanisms to allow users to report harmful content or activity and to appeal takedown of their content. | Yes | Yes | Yes | Yes |
| CSA content: if the platform is a UK platform, or a non-UK platform that does not already report, report identified online child sexual abuse (CSA) content. | Yes | Yes | Yes | Yes |
| Transparency: publish annual transparency reports about the steps being taken to tackle online harms. | No | Yes | Yes | Yes |
| Fraudulent advertising: minimise the publication/hosting of fraudulent advertising. | No | Yes | Yes | No |
| User empowerment: offer optional user identity verification and user empowerment tools to give users more control. | No | Yes (services will have a discretion as to which form of identity verification they offer; we discuss this in more detail here) | No | No |
| Freedom of expression and privacy: produce freedom of expression and privacy impact assessments. | No (but providers will be required to have regard to the importance of protecting users' legal rights to freedom of expression and privacy) | Yes | No | No |
| Protected content: protect journalistic content and content of democratic importance. | No | Yes | No | No |
Potential liability for regulated services
The Bill proposes a broad range of powers for Ofcom, including the imposition of financial penalties of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher. In particularly serious cases, Ofcom can also apply to the court for an order imposing business disruption measures, such as 'service restriction orders' (e.g. requiring providers of ancillary services such as payment or advertising services to take certain steps) and 'access restriction orders' (blocking users' access to certain websites/apps via ISPs and app stores).
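As a worked example of how the penalty cap operates (the revenue figure below is invented for illustration only):

```python
def maximum_penalty(qualifying_worldwide_revenue: float) -> float:
    """Return the higher of GBP 18 million and 10% of qualifying
    worldwide revenue, i.e. the cap on Ofcom's financial penalties."""
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue)

# A provider with (an invented) GBP 500m qualifying worldwide revenue:
print(maximum_penalty(500_000_000))  # 50000000.0 -> cap is GBP 50m, not GBP 18m
```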
Ofcom will also have certain information-related powers, enabling it to require in-scope businesses to provide information so that it can discharge its online safety functions. As part of this, criminal proceedings will be possible against a named senior manager of a regulated service that fails to comply with an information notice from Ofcom.
Next steps
The second reading of the Bill will take place on 19 April 2022 and will provide the first opportunity for MPs to debate the main principles of the Bill. All online service providers should be aware of the very significant implications of the proposals. They should also monitor developments closely and start mapping out the extent to which changes may be needed to their systems and processes. It is important to recognise that the Bill is not yet law, and that, as with any Bill, there is the opportunity, through democratic engagement, to influence the outcome; undertaking some preparatory work now is nonetheless prudent.