Online Safety Bill Second Reading briefing
ORG’s analysis of the Online Safety Bill and its adverse impact on human rights and the free Internet
About Open Rights Group
Open Rights Group (ORG) is the leading UK-based digital campaigning organisation. We work to protect fundamental rights to privacy and free speech online, including data protection, the impacts of data use on vulnerable groups, and online surveillance. With over 20,000 active supporters, we are a grassroots organisation with local groups across the UK. We have worked on this Bill throughout the ‘online harms’ processes and consultations, and on both Digital Economy Acts (2010 and 2017), accurately highlighting which parts of both DEAs would prove extraordinarily difficult to implement practically or fairly.
Parliament must set robust safeguards for freedom of expression
This is a Bill that seeks to make the online world ‘safer’, but relies largely on content removal and censorship to deliver it. It creates new risks to personal security and safety by downgrading privacy. Furthermore, it relies on algorithms and Artificial Intelligence to combat problems caused by algorithms and Artificial Intelligence. In attempting to ‘fight fire with fire’, we can expect results that are at best patchy and at worst corrosive of free expression and privacy.
This Bill vastly over-reaches its original remit. It requires Internet services to police their platforms for some 28 criminal offences, spanning firearms, assisting illegal immigration, financial services and harassment. It asks private companies to enforce the law online on behalf of the State, without the checks and balances that bind the State, and without the means to punish criminals. Moreover, it extends its scope outwards to thousands or tens of thousands of small services that will have to pay a licence fee to Ofcom.
This Bill has been drafted without genuine reference to the wider legal and technological context. It fails to recognise the need to protect the integrity of the Internet ecosystem, or the positive externalities of a global, cross-border network environment. Much of what this Bill seeks to implement lurks in its sub-text, and raises tough questions for lawmakers. We address some of them here.
Automated censorship
This Bill gives online platforms a positive obligation to proactively monitor users’ posts. It is a radical departure from the framework on which our current law is based. It means that every image, video or piece of text uploaded by anyone, at any time, would be checked against databases for compliance with government-mandated criteria, and non-compliant posts would be restricted (censored). The proactive monitoring is intended to ‘prevent’ individuals from encountering illegal content, and content that is harmful to children. It is carried out by AI-driven content moderation systems that use algorithmic processing to identify and restrict content, and that link to the recommender systems – such as Facebook’s News Feed – that algorithmically curate content for users.
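To make the mechanism concrete, the sketch below shows the simplest form of proactive matching: each upload is hashed and checked against a database of disallowed material before anyone can see it. All names and values here are hypothetical; real deployments typically use perceptual hashes (such as PhotoDNA) rather than exact cryptographic hashes, so that modified copies of an image still match.

```python
import hashlib

# Hypothetical database of hashes of disallowed content, supplied to
# the platform under government-mandated criteria.
BLOCKED_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def moderate_upload(data: bytes) -> bool:
    """Return True if the upload may be published, False if restricted.

    The check runs on every upload, before publication: this is what a
    'proactive monitoring' duty means in practice.
    """
    digest = hashlib.sha256(data).hexdigest()
    return digest not in BLOCKED_HASHES

print(moderate_upload(b"holiday photo"))  # True: not in the blocklist
print(moderate_upload(b"foo"))            # False: hash matches the database
```

The database itself is opaque to the user: a post that matches is restricted with no visibility of which criterion it failed.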
The Bill’s over-broad language, combined with heavy sanctions, incentivises platforms to be over-enthusiastic in taking down or restricting lawful speech. They could be fined for failing to restrict content, but face no repercussions if they take content down in error. For this reason, it poses a threat to freedom of expression.
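This asymmetry can be expressed as a toy expected-cost model (the figures are purely illustrative, not taken from the Bill): a platform that is fined for leaving illegal content up, but bears no cost for wrongful removal, minimises its expected cost by removing anything its classifier flags with even slight suspicion.

```python
def should_remove(p_illegal: float,
                  fine_if_missed: float = 1_000_000.0,   # hypothetical fine
                  cost_if_overblocked: float = 0.0) -> bool:
    """A cost-minimising platform removes content whenever the expected
    fine for leaving it up exceeds the expected cost of wrongful removal."""
    return p_illegal * fine_if_missed > (1 - p_illegal) * cost_if_overblocked

# With zero cost attached to wrongful removal, a 1% suspicion of
# illegality is enough to censor a lawful post.
print(should_remove(0.01))  # True
```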
The Bill’s provisions also apply to Google and other search engines. Search results show content from all over the web. Under this Bill, search engines will be required to modify their algorithms and demote content according to the government’s criteria. Hence, it will affect all websites, even those that are not in its scope. This raises the concern that websites could effectively be hidden from view, and that where this happens in error, their owners would have little or no redress.
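A sketch of the demotion mechanism (the domains, scores and penalty are illustrative): a site that lands on a flagged list, rightly or wrongly, drops below every unflagged result, with no signal to its owner that this has happened.

```python
def rank(results: list[dict], flagged_domains: set[str],
         penalty: float = 1000.0) -> list[dict]:
    """Toy search ranker: flagged domains take a fixed score penalty."""
    def score(r: dict) -> float:
        s = r["relevance"]
        return s - penalty if r["domain"] in flagged_domains else s
    return sorted(results, key=score, reverse=True)

results = [
    {"domain": "example.org", "relevance": 9.5},
    {"domain": "example.net", "relevance": 7.2},
]
# If example.org is flagged in error, it falls to the bottom of every query.
print(rank(results, flagged_domains={"example.org"}))
```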
Monitoring private communications
The Bill defines content as anything that is “communicated publicly or privately”. It brings into scope private messaging services such as WhatsApp and Facebook Messenger, and opens the way for content moderation on private services. This can be required by Ofcom for the purpose of removing child sexual abuse material. It raises serious privacy concerns because it permits the content posted by users on these services to be searched without a warrant or a court order. Warrantless searching of private communication is a major breach of privacy rights. It opens a Pandora’s box for authoritarian-style censorship of ‘encrypted’ private messages and suppression of speech, not just in the UK but globally, should such capabilities be implemented. Encrypted messaging is currently relied on by Ukrainian and Russian victims of Putin’s regime for their personal security; the UK should not be seeking to weaken such technologies.
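One widely discussed way of scanning ‘encrypted’ messages without formally breaking the encryption is client-side scanning: inspecting the message on the user’s device before it is encrypted. The sketch below (all names are hypothetical, and stand-in callbacks replace real cryptography) shows why this amounts to a warrantless search of private correspondence.

```python
import hashlib

# Hypothetical, opaque hash list pushed to every user's device.
SCAN_DATABASE = {
    "fcde2b2edba56bf408601fb721fe9b5c338d10ee429ea04fae5511b68fbf8fb9",
}

def send_message(plaintext: bytes, encrypt, transmit, report) -> None:
    """Client-side scanning: the plaintext is inspected BEFORE encryption,
    so 'end-to-end encrypted' no longer means 'no third party can look'."""
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in SCAN_DATABASE:
        report(digest)  # a match is reported without a warrant or court order
    transmit(encrypt(plaintext))

# Usage with stand-in callbacks in place of real cryptography:
send_message(b"hello", encrypt=lambda m: m[::-1],
             transmit=print, report=print)
```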
Age assurance by artificial intelligence
Consistent with the Bill’s requirement to ‘prevent’ children from encountering harmful content, it also mandates age assurance systems. These systems estimate a person’s age; they do not verify it exactly. Age assurance presents online platforms with a dilemma: either exclude all children (under-18s), or manage their platform to criteria that are deemed suitable for children.
Age assurance systems rely on algorithmic processing of data, and they employ biometric checks such as facial recognition. This is a risky route that raises privacy concerns. The governance of artificial intelligence systems is the subject of an ongoing international discussion, and the government’s plans to weaken the GDPR raise additional concerns in this context. Age assurance checks require a governance structure that includes dedicated privacy rules, which arguably sits outside the scope of this law.
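A sketch of why estimation is problematic (the threshold logic is illustrative; no real vendor’s model is assumed): because the system only estimates, a service that must not admit under-18s has to build in an error margin, so people near the threshold are mis-sorted by design.

```python
def may_enter(estimated_age: float, threshold: int = 18,
              error_margin: float = 2.0) -> bool:
    """A face-analysis model returns an estimate, not a verified age.
    To avoid admitting under-18s, the service demands
    estimate >= threshold + margin, wrongly excluding many young adults
    (while some children who 'look older' are wrongly admitted)."""
    return estimated_age >= threshold + error_margin

print(may_enter(19.0))  # False: a 19-year-old adult is locked out
print(may_enter(22.0))  # True
```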
Filtering non-verified accounts
The Bill mandates that users should be able to filter out all accounts that are not verified. Whilst the intention is to filter undesirable comments or trolls, it is a double-edged sword. This provision would also discourage the anonymous accounts that enable members of vulnerable groups in society to speak up without fear; such vulnerable users would be at a disadvantage to ‘verified’ users, unable to communicate with an important section of users, despite posing no risk to anyone. Solutions should not disadvantage vulnerable and innocent people.
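A sketch of the filter’s effect (all account names are hypothetical): once a reader opts in, every unverified account disappears from their view, the troll and the abuse survivor posting pseudonymously alike.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    author_verified: bool
    text: str

def visible_posts(posts: list[Post], filter_unverified: bool) -> list[Post]:
    """Hide every unverified account, regardless of whether it poses a risk."""
    if not filter_unverified:
        return posts
    return [p for p in posts if p.author_verified]

feed = [
    Post("troll123", False, "abuse"),
    Post("survivor_anon", False, "speaking up without fear"),
    Post("verified_name", True, "hello"),
]
# The survivor's lawful speech is filtered out along with the troll's.
print([p.author for p in visible_posts(feed, filter_unverified=True)])
```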
Unprecedented Ministerial powers to define speech
This Bill grants unprecedented powers to the Secretary of State, who will set the strategic direction for the measures in this Bill and can insist that Ofcom’s Codes of Practice ‘reflect government policy’. The two Ministries sponsoring the Bill are DCMS and the Home Office. Government Ministers will be able to define ‘harmful content’, but only after the Bill has become law. The intended meaning of ‘harm’ is very broadly described as ‘physical or psychological harm’. ‘Harmful content’ is that which ‘presents a material risk of significant harm to an appreciable number of [adults or children] in the UK’. Parliament cannot know how the Secretary of State may intend to interpret this in Secondary Legislation after the Bill is on the Statute Book. It should seek to remove these powers.
Robust safeguards for free speech on algorithmic systems
Freedom of expression is a right of all British citizens. It applies on the Internet just as it does offline. The State has a duty to safeguard it, and may restrict it only under specific conditions. Any restriction should be specific and the least intrusive available. There is a concern that this Bill reduces interference with freedom of expression to a mere contractual breach of the platform provider’s terms of service, ignoring the fact that the interference is being ordered by the State. The Bill calls for a complaints procedure, but says nothing specific about how it is to be implemented. However, the Bill does create a fast-track appeals process for the UK press and registered broadcasters.
This Bill must set a positive standard for restrictions imposed on any user by automated systems. It must include redress procedures that are fit for purpose, along with robust guarantees that lawful posts are swiftly reinstated and that other restrictions, such as shadow bans, are swiftly lifted. It should include the possibility of a court hearing. This procedure should be statutory and on the face of the Bill.