The UK’s Online Safety Act (“OSA”) 2023 has faced criticism from some quarters, with detractors labeling it an instrument of censorship that stifles online expression. This narrative completely misrepresents the Act’s purpose and effect. Far from censoring legal content, the OSA targets only illegal material, aligned with existing offline restrictions, while Section 22 introduces a groundbreaking statutory protection for freedom of expression in UK domestic law.
The Online Safety Act Does Not Censor Legal Content
The Act’s core aim is to enhance online safety by requiring user-to-user services (e.g. social media platforms) to address illegal content such as child sexual abuse material, terrorist propaganda and fraud. These categories mirror offline laws such as the Protection of Children Act 1978 and the Terrorism Act 2000, and do not extend to suppressing lawful speech. The Act imposes no obligation to remove legal content for adults, ensuring lawful speech remains untouched. It also modernizes criminal law with new offenses such as threatening communications and intimate image abuse, reflecting the principle that what is illegal offline is illegal online. Claims that the OSA enables censorship of legal content often stem from misinterpretations or early implementation hiccups, not the Act’s legal framework. For instance, some platforms initially over-blocked content when child safety duties took effect in July. This may be due to one or more of three factors:
- Misunderstanding the Law: Platforms misread the OSA’s requirements, applying overly broad filters to avoid penalties while overlooking the duty to protect freedom of expression.
- Technical Limitations: Some may not have invested in the tools needed to target the narrowly defined “primary priority content” (i.e. pornography and material encouraging suicide, self-harm or dangerously restrictive eating), leading to much wider restrictions.
- Deliberate Overreach: Certain platforms, possibly to provoke backlash, blocked contentious but legal content, such as Gaza-related coverage, fueling opposition to the law.
These issues reflect how those digital services chose to respond to the law, not the OSA’s actual design. Ofcom, the regulator, has clarified that over-blocking lawful content contravenes the Act’s principles, particularly Section 22’s free speech protections.
Section 22: A New First Amendment for the UK
Section 22 marks a historic moment: the first time Parliament has legislated an explicit duty, enforceable by Ofcom, for online services to protect users’ lawful speech and privacy. It mandates that regulated platforms prioritize users’ freedom of expression and privacy when designing safety measures. For larger “Category 1” services, this includes publishing impact assessments on how content moderation affects free speech, with specific safeguards for journalistic and news content. This is the first time UK domestic legislation has explicitly protected online expression beyond the qualified rights under Article 10 of the European Convention on Human Rights (ECHR), as incorporated via the Human Rights Act 1998. Ofcom’s codes of practice embed Section 22’s safeguards, ensuring that platforms which follow the recommended measures comply with their free speech duties. Over-blocking lawful content risks enforcement action from Ofcom, reinforcing the Act’s commitment to expression.
Unlike the ECHR, which often requires costly and lengthy court action to enforce free speech rights, Section 22 embeds these protections directly into the regulatory framework for online platforms. Ofcom can proactively warn platforms that over-block legal content – and has already done so – or penalize them, ensuring compliance without requiring individuals to go to court. This makes the protection more immediate and practical, especially for rapidly countering overzealous moderation that restricts access to lawful material such as political discourse or conflict reporting.
Addressing Over-Blocking and Strengthening Rights
Early over-blocking by some platforms, particularly affecting children’s access, and in some cases adults’ too where age verification options were not made available, sparked justified concerns. However, Section 22 empowers Ofcom to address this directly. Platforms that excessively restrict content risk non-compliance, because they fail to balance safety duties with free speech obligations. This regulatory oversight will, as the regime matures and becomes better understood, ensure lawful expression isn’t unduly curtailed.
While the UK lacks a single written constitution, Section 22 effectively strengthens free speech within the UK’s legal framework. It is a tailored, enforceable safeguard for the digital age, making platforms accountable for preserving expression while still tackling illegal content. Far from enabling censorship, the OSA, through Section 22, sets a new standard for protecting online discourse.
Conclusion
The OSA is not a censorship tool; it is a balanced framework that targets illegal content while embedding robust free speech protections. Section 22 is a landmark provision, offering a practical, regulator-enforced safeguard that surpasses the ECHR’s reliance on court action. By addressing over-blocking and prioritizing lawful expression, the OSA strengthens the UK’s commitment to free speech online, proving critics wrong and paving the way for a safer and freer internet.