Expert Comment: Online Safety Bill – a missed opportunity?


Monday 6th Feb 2023, 5.18pm

Early aspirations were for a potentially world-leading law which could keep children safe and tackle the worst online harms. There was a wide-ranging consultation and the resulting Online Harms White Paper, released in 2019, proposed a bold new framework which would see providers of online services given an over-arching duty of care for their customers.

But along the way the Bill has gained and lost new sections, like an old synthetic Christmas tree, with half the branches missing and decorated by fighting children.

Controversy has raged about the inclusion (and exclusion) of 'legal but harmful' content for adults, whilst committee scrutiny has introduced valuable new measures, such as duties regarding age-gated access to online pornography and new protections against the publication of non-consensual intimate imagery. Despite a general sense of positive forward motion, it is hard not to feel that important opportunities have been lost.

When the first green paper was published there was no Truth Social, no BeReal, and TikTok had yet to make its mark. It was before Frances Haugen’s whistle-blowing testimony, before Molly Russell’s tragic death, before the pandemic left us dependent on online spaces for human connection, education and work.

In these five years, we have learnt a great deal more about the best and worst features of social media and online platforms. These changes are not all reflected in the legislation. The role of platform business models is a troubling omission. The most significant gap, though, is the failure to focus on regulating online services as systems, rather than just as content hosts.

This would mean holding platforms responsible for the design, moderation and governance decisions they take, rather than simply the content they host. That content is created, shared and responded to by individuals. But the platforms decide whether and how it is amplified – who it is pushed to and what measures they might take to mitigate risk.

At present, the UK Bill tries to juggle both content and systems regulation, and as a result, does neither very well. The focus on content takes precedence throughout. This is a major missed opportunity.

It is worth remembering that users of online services face a variety of risks, many of which are not reducible to problematic content. There are harmful behaviours, such as bullying, grooming or fraud. There are also risks to wellbeing from predatory corporate behaviour or unethical data exploitation.

Another key benefit of a systems-focused approach is that it would require companies to address risks before, rather than after, problematic content is posted. Unfortunately, the lack of a focus on systems design is not the only obvious gap in the Online Safety Bill.  

The Bill addresses risks and harms, not opportunities and benefits. Online, as elsewhere, risks often go hand-in-hand with opportunities, and difficult trade-offs are involved.

Of most concern is the possibility that over-zealous application will reduce freedom of expression and access to information, despite explicit commitments to the contrary. This applies particularly to children and teenagers.

There is also the move from platform operator responsibilities to regulator responsibilities: it is vital that Ofcom’s independence and expertise are protected from political interference. Current proposals allow both Parliament and the Secretary of State to intervene at key moments, which risks damaging Ofcom’s ability to assess compliance in the round.

It is not too late for change, and for the UK to be a leader on a systems-based approach. A tiered approach, based on the size and reach of services, may be a more proportionate way to achieve this. Ultimately, what is needed is a greater focus on how services work, rather than a narrow focus on types of content. This is likely to adapt better to new risks and to drive effective results in the long term.



Hear from Professor Nash in her latest video marking Safer Internet Day (7 February 2023), sharing her insights on how we can make children’s voices heard more loudly in the policy space.