Free expression online
A dangerous precedent for global censorship
ORG responds to Ofcom’s Online Safety Act plans
Last week, Open Rights Group responded to Ofcom’s Illegal Harms consultation, the first of a series of consultations Ofcom will be holding on the development of its guidance for the Online Safety Act.
Previously, ORG responded to both the Online Harms White Paper and the Online Safety Act while it made its way through the Houses of Parliament as a Bill. We have submitted numerous policy briefings to parliamentarians about our concerns over the Act’s impacts on online privacy, security and free speech. And in June of last year, we coordinated a letter, signed by over 80 civil society organisations, academics and cyber experts from 23 countries, urging the UK government to protect encrypted messaging.
But there are still numerous pressing concerns in both the Act itself and Ofcom’s proposed guidance.
Resource constraints for civil society organisations
To start, the extensive length of the guidance (over 1,700 pages) and the short consultation timeline create capacity and financial constraints for Open Rights Group and other small civil society organisations. We hope to see Ofcom do more to meaningfully engage with civil society in the future and to recognise the unequal playing field for smaller, nonprofit organisations compared to large corporations and lobbying groups. Not only would this give Ofcom a better understanding of the range of viewpoints on its guidance, but nonprofits are also typically more likely to represent the views of the communities who will be most affected by changing regulations.
Freedom of expression
The Online Safety Act casts a wide net around content that must be removed and is likely to result in increased amounts of lawful content being taken down from the Internet. In July 2023, a legal opinion found that there were “real and significant issues” regarding the lawfulness of a clause in the then Online Safety Bill that appeared to require social media platforms to proactively screen their users’ content and prevent them from seeing anything deemed illegal. The opinion found that there is “likely to be significant interference with freedom of expression that is unforeseeable and which is thus not prescribed by law”.
Despite running to over 1,700 pages, the guidance pays lip service to freedom of expression but does very little to ensure that companies will be incentivised to take free speech obligations seriously. Companies are asked to balance the accuracy of content removals against the swiftness of takedowns, without meaningful guidance on how this balance should be struck and with incentives and penalties leaning heavily towards speed. Already vulnerable or marginalised groups, such as activists, racialised or queer communities, and people posting in non-Western languages, experience the highest rates of wrongful content takedowns and are likely to be hit hardest by the increased volume of content removed under this Act.
ORG urges Ofcom to make clear throughout its guidance that companies must ensure human rights and due process considerations are accounted for at all stages of the moderation process. Companies must also be transparent about how they are incorporating free expression and non-discrimination concerns into these considerations. Additionally, ORG recommends that Ofcom consider new ways to incentivise and support companies in improving the accuracy and transparency of their moderation policies and decisions.
Privacy
The consultation also raises serious privacy issues. Ofcom labels end-to-end encryption a high-risk factor for harm, while acknowledging its importance for users only in an aside. We believe that Ofcom should publish regulations making clear that there is no available technology that allows scanning of user data to coexist with strong encryption and privacy. ORG encourages Ofcom to guide end-to-end encrypted messaging services towards other methods of improving user safety, such as signposting users towards help services or device-level safety options.
Several key messaging services, including WhatsApp, Signal and Element, have said they would withdraw from the UK if they are asked to undermine end-to-end encryption. If end-to-end encryption is weakened or these services are lost, online communication will become insecure for everyone in the UK, as cyber-security experts worldwide have warned. Crucially, in the guidance Ofcom itself highlights the important role that encryption plays for members of the LGBTQ+ community who wish to safely discuss or explore their sexuality or gender. Beyond the LGBTQ+ community, many people in the UK and around the world rely on safe and secure messaging every day, including young people, activists, doctors, lawyers, journalists, victims of domestic abuse, and women seeking abortions in countries with restricted healthcare rights.
This month, in the case of Podchasov v. Russia, the European Court of Human Rights (ECtHR) made clear that governments cannot simply require encryption to be removed or weakened in order to target criminals, because doing so compromises everyone’s privacy. The Court ruled that such a measure is not proportionate.
Finally, ORG also warned against the proposed requirements for age verification systems. Age verification systems are currently inaccurate and invasive, carrying significant privacy risks for users, a view that several governments have endorsed in recent years. We recommend that Ofcom’s guidance include provisions specifying that any age assurance or age verification system must be effective at correctly identifying the age or age range of users and must strongly safeguard individuals’ data, privacy and security. We also encourage Ofcom to work with the Information Commissioner’s Office to set out strong, clear guidelines for data protection requirements in these systems. ORG is concerned that, with the proposed changes to the UK’s data protection regime in the Data Protection and Digital Information Bill set to weaken people’s access to and control over their data, people’s biometric data will be particularly at risk in the coming years.
Conclusion
As Ofcom continues to develop its guidance on the Online Safety Act, it is essential to consider the broader implications for freedom of expression, privacy, and democratic principles. Failure to do so could not only undermine fundamental rights and freedoms within the UK, but also set a dangerous precedent for online censorship globally if repressive regimes take the Act and Ofcom’s guidance as a licence to further censor and penalise legitimate speech. Our full consultation response can be found via the link below, and we welcome thoughts or engagement at info@openrightsgroup.org.
Ofcom consultation response
Open Rights Group Response to Ofcom consultation: “Protecting people from illegal harms online”
Read now