Data and Democracy

King’s Speech Response

ORG’s analysis and response to the Labour Government’s legislative agenda.

There are several important pieces of digital legislation mentioned in the King’s Speech. Most importantly, changes to data protection are mentioned, along with the first steps to regulate AI technologies.

Thankfully, there is some caution in the framing of both the changes to data protection and Artificial Intelligence. Unlike the commentary from the Tony Blair Institute and others, AI is not pitched as part of a potential gold rush.

However, as we explain, parts of the resurrected data protection changes look problematic, especially regarding research, and there is a deep need for public conversation and consultation on AI regulation to shape Labour’s thinking. Corporate lobbyists will always ensure they are heard, but public perspectives are harder to surface. This is especially true given that the use of AI in policing, migration, welfare and employment can have discriminatory impacts on already marginalised groups.

Labour has announced plans to bring back provisions of the Data Protection and Digital Information Bill regarding Smart Data and Digital Identity. While the intention seems to be to leave out the pointless and controversial changes to UK data protection law, the execution may undermine this progress.

Firstly, the King’s Speech briefing announces changes that will enable researchers “to ask for broad consent for areas of scientific research, and allow legitimate researchers doing scientific research in commercial settings to make equal use of our data regime”. These seem to be taken straight out of the previous version of the DPDI Bill, a rather surprising move in light of the scientific community’s widespread criticism of that Bill.

Secondly, the new Bill would also include a reform of the governance structure of the ICO, “accompanied by targeted reforms to some data laws that will maintain high standards of protection but where there is currently a lack of clarity impeding the safe development and deployment of some new technologies”. These last bits are the most worrisome: while the briefing only mentions the adoption of a “more modern regulatory structure, with a CEO, board and chair”, the DPDI Bill bundled these changes with several clauses that would have undermined the independence of the ICO, watered down its statutory objectives, and reduced its accountability.

Further, there are more urgent changes that the ICO needs, such as the transfer of its appointment to Parliament, the implementation of collective redress mechanisms, and a reform of Section 166 of the UK DPA to allow substantive scrutiny of ICO enforcement decisions by the Information Tribunal.

Likewise, the argument that UK data protection law needs to be clarified in order to allow the deployment of new technology echoes very dubious arguments made by corporate lobbyists over the past four years, and shows the legacy of the post-Brexit deregulatory agenda: technological innovation and adoption cannot be successfully promoted by lowering regulatory standards, disempowering individuals and excluding our agency and dignity from the equation. Doing so would also undermine the Government’s own plans to implement a digital identity system and promote the sharing of “smart data”: the public will not trust government data sharing initiatives if they are accompanied by fewer rights, less accountability and less redress.

Briefly mentioned after the announcement of the Employment Rights Bill, the Government commits “to place requirements on those working to develop the most powerful artificial intelligence models”. This seems to be linked to Labour’s Manifesto commitments, which included the maintenance of strong legal safeguards and the importance of an Industrial Strategy to support and drive the responsible development of AI and new technologies.

Taken at face value, this may signal a welcome step-change compared to the previous approach to innovation and AI: legally enforceable rules, independent oversight, and clear and effective avenues for redress for individuals when things go wrong are necessary conditions to ensure the safe deployment of new technology systems. On the other hand, the proposal seems to lack ambition: by focusing exclusively on “the most powerful models”, the Government may be leaving a huge grey area where harmful AI applications could be left unchecked. Furthermore, AI risks do not only originate from the development of these systems but also from their deployment: even the simplest and dumbest AI system can become a significant threat if used, for instance, to accuse people of benefit fraud or calculate their A-level results.

Finally, this commitment comes alongside the rather vague statement in the Digital Information and Smart Data Bill where the Government promises “targeted reforms to some data laws that will maintain high standards of protection but where there is currently a lack of clarity impeding the safe development and deployment of some new technologies”. This is significant—and worrisome—in the AI field, as this rhetoric was used to justify proposals in the DPDI Bill that would have undermined the few legal safeguards that already exist for AI, such as the removal of the right not to be subject to automated decision making. Whether intentionally or not, the Government may end up taking one step forward and two steps back in the field of AI regulation.

The Government has announced the creation of a digital private rented sector database to bring together key information for landlords, tenants, and councils. It is hoped this will enable tenants to access information to inform choices when entering new tenancies, and landlords will be able to quickly understand their obligations and demonstrate compliance, providing certainty for tenants and landlords alike. Councils will be able to use the database to target enforcement where it is needed most.

However, this initiative raises concerns about privacy and security risks, especially for vulnerable tenants such as victims of stalking, domestic abuse, or refugees at risk of persecution.

If law enforcement agencies have access to this database, it might encourage unscrupulous landlords to instead enter into illegal black market arrangements to rent out property to tenants.

To minimise these potential harms, the register could, for instance, contain details only about the property and the registered landlord.

The Government has proposed creating a duty on local authorities to have and maintain ‘Children Not in School’ registers, and to provide support to home-educating parents. These measures aim to ensure fewer children slip under the radar when they are not in school and more children reach their full potential through suitable education.

While the intent is to protect children, it is essential to respect the rights of families who choose elective home education as a lifestyle choice. Such a register must consider the impact on travelling communities, religious groups, and migrants who have moved between countries. Parents of home-schooled children might feel victimised for their lifestyle choices or persecuted for non-compliance with registration requirements.

The bill also aims at keeping children safe, happy, and rooted in their communities and schools by strengthening multi-agency child protection and safeguarding arrangements. Politics UK reported that AI would form part of this plan to enhance coordination among schools, GPs, councils, and Ofsted. However, the government must clarify that it will not introduce algorithmic-based surveillance of children for child protection purposes.

Jen Persson of Defend Digital Me has previously highlighted concerns around these proposals in her article titled ‘School absence: Bums-in-seats-thinking is the wrong problem, and AI the wrong solution’.

The Government is taking steps to identify new and emerging business models in the supply chain, ensuring the responsibilities of those involved in the supply of products, such as online marketplaces, are clear. This will enable the Government to better protect consumers, so they can have confidence in the products they buy and from whom they buy them. Without these powers, it remains far too easy for unscrupulous overseas suppliers to place unsafe goods on the UK market through online marketplaces.

The ORG Digital Rights 24 manifesto has called for marketplaces to share some responsibility for scam adverts and for sellers who flout regulations. Regulation requiring big tech platforms to know who their vendors are, and to comply with rules for consumer-friendly, safe, and sustainable technology, should be examined.

The Crime and Policing Bill will introduce strict sanctions on senior executives of online companies who fail to operate within the law, particularly concerning knife crime. Legislation would need to be proportionate and focus on what marketplaces can reasonably be expected to do, while ensuring that vendors supplying unlawful weaponry are also tackled, rather than relying solely on action against intermediaries who can be sidestepped. Drugs, for instance, can still be purchased online and delivered without the need to use Amazon or eBay. It is also vital that any strategy looks at youth services, social services and other community support to tackle knife crime directly.