Digital Privacy

Briefing: The Data (Use and Access) Bill (Second Reading)

The new Data (Use and Access) Bill drops several concerning aspects of the previous Data Protection and Digital Information Bill. Open Rights Group welcomes the removal of provisions that would have: watered down the definition of personal data; expanded the scope of democratic engagement; lowered the threshold to refuse a data rights request to “vexatious or excessive”; removed various accountability requirements; allowed the Secretary of State to dictate the Strategic Priorities of the new Information Commission; required individuals to contact an organisation before lodging a formal regulatory complaint; and abolished the Biometrics and Surveillance Camera Commissioner.

READ OUR FULL BRIEFING

Read the Open Rights Group briefing outlining all of our concerns with the Data (Use and Access) Bill

Download now

Executive Summary

Unfortunately, the Data (Use and Access) Bill still includes several provisions that would weaken important data protection rights, and threaten public trust in the use and deployment of new technologies such as Artificial Intelligence.

Further, the Bill would open the door to a judicial challenge of the UK adequacy decision before the Court of Justice of the European Union, a move that would cost the UK between £1 billion and £1.6 billion, and require small businesses in the UK to spend at least £10,000 in compliance costs. Likewise, a judicial invalidation of the UK adequacy decision would impact the functioning of the EU-UK Trade and Cooperation Agreement and the Windsor Framework, undermining the government’s stated ambition to further economic and institutional cooperation with the EU.

Maintaining robust data protection standards is a necessary condition for enabling innovation and economic growth. Empowering individuals with strong data protection rights protects the public from extractive and exploitative business practices, thus ensuring that data uses lead to mutually beneficial outcomes and sustainable growth. High data protection standards are also an important enabler of digital identity services, smart data schemes, and the use of data to improve public services. The public need confidence that when they use a digital verification service or an online banking service, or when they visit a General Practitioner, the data they provide will be used for the purposes they intended. The public also need confidence that the deployment of new technologies will not constrain their rights or their avenues for redress, and that strong regulatory supervision is in place to proactively mitigate and prevent risks.

However:

  1. The Bill would remove important protections for automated decision-making and AI.
    Article 22 of the UK GDPR enshrines the right not to be subject to a decision based solely on automated processing that has legal or otherwise significant effects on the individuals concerned. This right has proven highly effective in protecting individuals from harmful decisions and discrimination. However, Clause 80 of the Data Bill would deprive individuals of this important right in most circumstances, and exacerbate power imbalances by requiring individuals to scrutinise, contest and assert their rights against decisions taken by systems outside of their control.
  2. The Bill would reduce transparency, particularly in the field of Artificial Intelligence.
    Clauses 77 and 78 would reduce the scope of transparency obligations and rights. In particular, Clause 78 would effectively favour the irresponsible development of AI products by allowing organisations which deploy those systems to comply with Subject Access Requests only to the extent that a “reasonable search” allows: in other words, an organisation could ignore SARs insofar as its AI system was designed in a way that makes it difficult to search data and comply with such requests. Further, if an organisation’s capacity to handle requests becomes a consideration for the extent to which a SAR must be complied with, this would introduce a perverse incentive: an organisation with poor data management practices would find it difficult and resource intensive to comply with transparency obligations but, since their capacity to comply defines the extent of their obligation, they would get away with it.
  3. The Bill provides arbitrary and unaccountable powers to the Secretary of State.
    The Data Bill introduces several clauses that would allow the Secretary of State to override primary legislation and modify key aspects of UK data protection law, including data sharing, via Statutory Instrument, without meaningful parliamentary scrutiny. These powers are being introduced in the absence of a meaningful justification and, in the words of the House of Lords, they “make it harder for Parliament to scrutinise the policy aims of the bill and can raise concerns about legal certainty”. Further, these powers were identified by EU stakeholders as a main source of concern, and constitute a major threat to the continuation of the UK adequacy decision and the smooth functioning of the EU-UK Trade and Cooperation Agreement.
  4. The Bill lowers accountability over how data is shared and accessed for law enforcement and other public security purposes.
    The Data Bill would remove the requirement to consider the legitimate expectations of the individuals whose data is being processed, or the impact on their rights, for a wide range of purposes such as national security, crime detection, safeguarding, or responding to a request made by a public authority. Further, the Data Bill would remove the requirement for law enforcement authorities to record the reason they are accessing data from a police database.
  5. The Bill does not address the Information Commissioner’s Office’s failure to protect victims of VAWG and other vulnerable groups.
    MPs have a unique opportunity to strengthen the UK’s data regulator. In doing so, they will be able to protect their constituents who suffer harm as a result of organisations misusing and abusing their personal data. This includes victims of VAWG, who have a heightened need for privacy to protect themselves from abusers and stalkers. Furthermore, with the removal of the Chair of the Board of the CMA, the Labour government has undermined one of the most fundamental safeguards of independent regulatory authorities in the UK: regulatory independence is a requirement for retaining adequacy status with the EU. This further undermines the credibility of a regulator that, as shown by ORG research, already has a rather unsatisfactory track record of regulatory enforcement. Likewise, a recent FOI disclosure highlighted how the ICO acted upon only one complaint in 2024, out of the 25,582 complaints it received from the public. Instead of addressing these issues, the DUA Bill still carries over several problematic clauses from the DPDI Bill. These include the introduction of new primary and secondary objectives, the requirement to consult the Secretary of State before laying down a code of practice, and the appointment of the non-executive members of the new Information Commission.

Further, ORG is concerned by the government’s unwillingness to address the issues raised by civil society, independent experts, and members of the House of Lords. By selectively listening to the self-interested views of industry groups, the government is pushing forward legislation that works against its stated intent of unlocking the use of data to promote growth, improve public services and make lives easier.

Likewise, issues that were identified by EU institutions and the House of Lords Inquiry into UK adequacy as a threat to the UK adequacy decision remain unaddressed: by ignoring the threat of a judicial invalidation of the UK adequacy decision, the government risks undermining its own efforts to further institutional and economic cooperation with the European Union.

We urge the Government and the House of Commons to allow meaningful scrutiny of this Bill in order to address the shortcomings it has inherited from the previous, ill-conceived Data Protection and Digital Information Bill. In particular, we recommend that:

  • The rights under Article 22 of the UK GDPR should be expanded to partly automated decision-making: Drop Clause 80 (Automated decision-making), and consider extending the scope of Article 22 to partly automated decisions.
  • Transparency obligations and rights should not be compromised: Drop Clauses 77 (Information to be provided for data subjects) and 78 (Searches in response to data subjects’ requests).
  • Maximise legal certainty and ensure that any delegated legislative power is subject to appropriate safeguards and judicial scrutiny: Drop, or change the nature of, Clauses 70 (lawfulness of processing), 71 (the purpose limitation), 74 (processing of special categories of personal data), 80 (automated decision-making), 85 (Safeguards for processing for research purposes etc) and Schedule 7 (Transfers of personal data to third countries etc: general processing), and ensure that the use of delegated legislative powers is left open to judicial challenge.
  • Accountability for access to data for law enforcement purposes should not be lowered, and data sharing should be underpinned by a robust test to ensure individuals’ rights and expectations are not disproportionately impacted: Drop Schedule 4 (Lawfulness of Processing: recognised legitimate interests), Schedule 5 (Purpose limitation: processing to be treated as compatible with the original purpose) and Clause 81 (logging of law enforcement processing).
  • The ill-conceived changes proposed by the previous government should be dropped, and this opportunity seized to address some of the core structural deficiencies that have emerged in the way the ICO operates and is held accountable: Drop Clauses 90 (Duties of the Commissioner in carrying out functions), 91 (Codes of practice for the processing of personal data) and Schedule 14 (The Information Commission). Further, Parliament should introduce changes in legislation to transfer responsibility for the appointment and budget of the Information Commission away from the government to the Science, Innovation and Technology Select Committee, in line with previous recommendations on this topic from the UK Parliament and Gordon Brown’s report.

Petition: Keep human review of AI decisions

Amend the Data Bill to protect your right to request a human review of automated decisions that impact your life

Take action

Write to your MP: Block Big Tech influence

Object to aligning our data protection rules with the interests of powerful corporations and snooping officials

Take action
Hands Off Our Data