The UK’s Online Safety Act: A Critical Examination in the Wake of Civil Unrest
The recent riots sparked by the tragic deaths of three young girls in Southport have ignited a crucial debate about the spread of mis- and disinformation online and the adequacy of the UK’s Online Safety Act (OSA) to combat it. Politicians, including London Mayor Sadiq Khan and Prime Minister Keir Starmer, have publicly questioned the Act’s effectiveness, calling for revisions to address the perceived "law-free zone" of social media. While the government has signaled its willingness to consider amendments, a thorough understanding of the OSA’s current provisions and limitations is essential before embarking on any reform effort.
At its core, the OSA aims to hold online service providers accountable for user safety on their platforms. It mandates that providers implement systems and processes to mitigate the risk of illegal content, requiring swift removal of such material. Additionally, the Act stipulates measures to shield children from certain types of legal but harmful content. Crucially, the OSA primarily relies on existing legislation to define illegal content, rather than creating new categories. This reliance on pre-existing laws creates a significant gap in addressing mis- and disinformation, as few legal remedies currently exist to combat these forms of harmful content.
The distinction between misinformation and disinformation lies in intent. Misinformation is false or misleading content spread unintentionally, while disinformation is deliberately disseminated with the aim to deceive and often to cause harm. The OSA does address false communications through Section 179, which creates a new false communications offence, superseding provisions of the Malicious Communications Act 1988 and the Communications Act 2003. However, this offence requires proof of both knowledge of falsity and intent to cause harm, a high legal bar that makes successful prosecution difficult. Furthermore, a significant loophole exempts "recognised news publishers" from the offence, effectively shielding online news outlets from accountability for spreading false information.
Another relevant piece of legislation is Section 13 of the National Security Act 2023, which criminalizes foreign interference that prejudices the UK’s safety or interests, including engagement in state-backed disinformation campaigns. However, this provision’s focus on state-sponsored disinformation, together with Ofcom’s emphasis on harm to individuals rather than broader societal impacts, limits its usefulness against the kind of homegrown disinformation that contributed to the recent unrest. The EU’s Digital Services Act 2022 offers a contrasting approach, explicitly acknowledging the societal and democratic harms of systemic disinformation and manipulative activity.
To effectively address the spread of mis- and disinformation that fuels social unrest, legislative action beyond the OSA is necessary. New offences criminalizing such content would need to be enacted and added to the OSA’s list of priority offences. Any such effort, however, is likely to face fierce opposition from powerful segments of the mainstream press, who resist measures they perceive as threats to their journalistic practices. The backlash against the 2019 Online Harms White Paper’s proposal to tackle disinformation demonstrates how strong that resistance can be.
The News Media Association (NMA), representing numerous news outlets, has previously launched aggressive campaigns against initiatives aimed at combating online disinformation. Their arguments often center on the claim that such measures stifle free speech and unfairly target legitimate journalism. This resistance highlights the significant challenge in striking a balance between protecting the public from harmful disinformation and safeguarding freedom of the press. Furthermore, Ofcom’s demonstrated difficulty in enforcing its existing Broadcasting Code raises concerns about the feasibility of effectively implementing the OSA, particularly against the immense resources of global tech companies.
The current situation underscores the urgent need for a comprehensive and nuanced approach to regulating online content. Amendments to the OSA must grapple with the complex interplay between free speech, journalistic practice, and the societal harms of mis- and disinformation. Addressing the problem requires a legal framework that holds all actors accountable, online platforms and news publishers alike, while protecting fundamental democratic values. The coming debate over OSA reform will be a critical test of the UK’s commitment to tackling online disinformation and safeguarding its democratic processes, and striking the right balance between free expression and harm mitigation will be paramount in shaping a safer, better-informed digital landscape.