Netflix drama Adolescence spotlights online extremism: Can the Online Safety Act deliver?

Posted on 8 April 2025

The Netflix drama Adolescence, about a 13-year-old boy radicalised by online misogyny who murders his female classmate, has fuelled an urgent debate about one of the darkest corners of the internet. Politicians, as well as parents, are asking how and why increasing numbers of young men are being influenced by the 'manosphere' and by toxic notions of masculinity that encourage men to dominate women, hide their vulnerability and commit violence. The issue is, of course, complex. The Prime Minister, Sir Keir Starmer, believes "culture" is part of the "emerging and growing" problem; former England manager Gareth Southgate has called for better role models; and Adolescence writer Jack Thorne wants to limit children's access to damaging material, supporting a ban on smartphone sales to under-16s and a digital age of consent. In this article, we focus on the legal liability of internet intermediaries, such as social media platforms, for hosting and targeting illegal as well as harmful content. As the Online Safety Act (the "Act") comes progressively into force, we ask whether – in respect of extreme misogyny – it is fit for purpose.

Although Adolescence is fiction, it echoes the growing number of knife attacks by young men on young women. There are many bleak statistics. Data published by the National Police Chiefs’ Council in July 2024 found that crime related to violence against women and girls increased by 37% between 2018 and 2023. According to polling by King's College London in February 2024, one in five men surveyed aged between 18 and 29 approved of Andrew Tate, the social media influencer and self-proclaimed misogynist facing charges of rape and sex trafficking. In 2022, the Center for Countering Digital Hate analysed the world’s leading incel forum, a community of involuntary celibates who espouse extreme hatred of women. It found that forum members posted about rape every 29 minutes, and that 89% of people who expressed a stance on the issue demonstrated support for rape. Meanwhile, women and girls experience unique and serious risks online. They are more at risk of image-based sexual abuse, disproportionately targeted by misogynistic pile-ons and harassment, and are the main survivors and victims of online domestic abuse.  

The Act was intended to make the UK the "safest place in the world to be online". It imposes new duties on, in particular, social media companies and search engines: 

  • All regulated services must tackle illegal content, including extreme pornography and intimate image abuse, to prevent users encountering it, minimise the length of time it is up, and remove it when flagged. The illegal harms duties came into force on 17 March 2025. 
  • All those providing services likely to be accessed by children must take steps to protect children from harmful content and behaviour. Harmful content includes abusive or hateful content, and content that depicts or encourages serious violence or injury.  
  • Ofcom, the new regulator tasked with enforcing the Act, must produce clear guidance that summarises how in-scope businesses can implement holistic and effective protections for women and girls. The draft guidance (the "Guidance"), currently out for consultation, specifically addresses online misogyny. It recommends nine actions that reflect the safety-by-design approach required by the Act, including conducting risk assessments that focus on harms to women and girls, reducing the circulation of content that promotes gender-based harms, and taking appropriate action in response to such harms. 
  • The Act requires providers, as part of their risk assessments, to consider specifically how algorithms could impact users’ exposure to illegal and harmful content; take steps to mitigate and manage any identified risks; and publish annual transparency reports about the use and effect of algorithms. This is particularly relevant to online misogyny, where the Guidance recognises that "recommender systems reward influencers creating misogynistic content with greater reach, particularly to boys and young men. This happens because algorithms are optimised for high engagement, which over time can incentivise the production and exposure to polarising and harmful content." Amongst other good practice steps, Ofcom recommends that service providers deprioritise harmful content in recommender algorithms to reduce its visibility and reach, and de-monetise user-generated content that promotes online gender-based harms even if it is not clearly illegal, to prevent it from earning advertising income. 
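The deprioritisation step that Ofcom's Guidance recommends can be pictured as a small ranking function. The sketch below is purely illustrative – it is not any platform's real code, and the field names, weights and the `flagged_harmful` signal are hypothetical assumptions – but it shows the basic idea: an engagement-optimised feed is re-ordered so that content flagged as harmful loses reach rather than being amplified.

```python
# Illustrative sketch only: how a recommender might deprioritise flagged
# content, per Ofcom's good-practice recommendation. All names and the
# penalty weight are hypothetical, not any platform's actual system.

def rank_feed(posts, harmful_penalty=0.1):
    """Order posts by engagement score, down-weighting flagged content.

    Each post is a dict with an 'engagement' score and a 'flagged_harmful'
    bool (e.g. set by a classifier or moderator review).
    """
    def effective_score(post):
        score = post["engagement"]
        if post["flagged_harmful"]:
            # Deprioritise rather than remove: lawful but harmful content
            # stays up but is no longer rewarded with greater reach.
            score *= harmful_penalty
        return score

    return sorted(posts, key=effective_score, reverse=True)

feed = [
    {"id": "a", "engagement": 0.9, "flagged_harmful": True},
    {"id": "b", "engagement": 0.5, "flagged_harmful": False},
    {"id": "c", "engagement": 0.3, "flagged_harmful": False},
]
# The flagged post drops from first to last despite its raw engagement.
print([p["id"] for p in rank_feed(feed)])  # → ['b', 'c', 'a']
```

The design point mirrors the Guidance's diagnosis: because ranking is optimised for raw engagement, polarising content rises by default, so the mitigation has to be built into the scoring step itself – the safety-by-design approach the Act requires.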

On the face of it, the Act and Ofcom are alive not just to the scale and scope of gender-based harms, but also to the power of algorithms to amplify content and feed it to those who are most vulnerable. The Act very clearly aims to regulate systems rather than content, to change the way in which platforms, particularly the most powerful, operate. In short, it makes them more responsible for users' safety. However, despite these clear aims, the Act has noticeable shortcomings. The scale of what it is trying to combat necessitates a broad approach which, by its nature, struggles to tackle the nuance and inherent pervasiveness of the issues. For example, there is a concern that lawful but harmful content – including misogynistic content – seen by adults does not trigger the same protections. Only "Category 1" businesses – those that are highest-risk and highest-reach – will have to take steps in respect of content harmful to adults, and then only to offer a "triple shield" of protections, including user empowerment tools. Essentially, they will be held to the terms of service they set themselves.

The extent to which the Act ensures meaningful change will come down to platforms' engagement and, ultimately, enforcement. Ofcom has formidable powers to impose fines and compel information, and also to introduce business disruption measures that restrict services from operating in the UK. The Act also has extra-territorial scope, in theory applying even to providers outside the jurisdiction that have links to the UK. Ofcom will, most likely, need to exercise its powers boldly and strategically, to send a message that it will not tolerate inertia, and major service providers must step up. To drive systemic change, Ofcom will need to look beyond 'tick-box' compliance and require active engagement from platforms on this issue.

It is clear that, for too many young men, radicalisation begins online. In addition to enlisting parents and teachers, among others, in tackling extremism, it is vital to target the platforms and systems that enable its spread.  

For more information about the Online Safety Act, please see our dedicated hub.
