Digital Privacy
ICO Alternative Annual Report 2023-24
In this report the Open Rights Group offers our perspective on the Information Commissioner’s Office’s (ICO) 2023-24 Annual report, and recently published data on the enforcement action it has taken (or not) over the most recent financial year. Our analysis scrutinises the ICO’s controversial policy experiment to limit fining public sector organisations to only the most severe data breaches – and explores the structural and cultural factors that have shaped the office’s overly cautious approach to enforcement.
While the new Data Use and Access (DUA) Bill has removed some poorly thought-out proposals to make the ICO beholden to ministers, it makes no changes to the ICO’s relationship with Parliament or the courts. There is therefore a danger that the ICO will continue to feel little institutional pressure to improve: the message of this report needs to be heard.
Alternative ICO report
Read ORG’s full evaluation of the ICO’s activities investigating breaches of data protection law and our recommendations to strengthen the data watchdog.
Data protection needs to be understood as a critical component in delivering a fair society. It protects against abusive and discriminatory decisions being made with data. It is used to ensure transparency in disputes with employers, customer services and even the police. As technologies like Artificial Intelligence (AI) progress, data protection rights help ensure that technology is not abused and remains accountable. However, the reality of these rights depends greatly on our Information Commissioner, and their willingness to take dissuasive action against unlawful practices. This is especially true as the people most likely to be impacted are also often the least able to take enforcement action themselves.
The ICO has been entrusted with an extensive range of enforcement powers by Parliament, and by extension the UK public. Data protection law was designed to enable the regulator to use the full range of these powers, ranging from reprimands for lower-risk incidents through to substantial fines and criminal prosecutions for individuals for the most severe breaches.
But its enforcement track record shows the ICO’s use of these powers is skewed: it issued only four data protection law-related fines to private sector organisations in the last financial year. This record stands in contrast with the ICO’s international counterparts, where data protection authorities across Europe have issued fines against a number of social media, AI and adtech companies – all high-risk areas where the ICO has seemingly failed to act over the past year. The office’s two-year “public sector approach trial” has meant fines were reserved for one extremely severe case, where a Ministry of Defence (MoD) data leak risked the lives of 245 Afghans. 90% of the office’s remaining enforcement actions resulted in public reprimands, but the prevalence of repeat offenders (including the MoD) suggests these interventions have not been sufficiently dissuasive. Other cases, including Home Office schemes tracking migrants through physical GPS tags, and the destruction of police records needed for prosecution and defence, indicate that the ICO is struggling to prevent real harms through its approach to the state sector. Even regarding simple problems, like the late processing of subject access requests (SARs), the ICO has been reluctant to take action against state bodies including local councils and the police, despite issues persisting over several years. In these cases, problems with the delivery of local services and access to justice are the likely result of the ICO’s reluctance to act.
Several interconnected factors explain the ICO’s reticence to adequately enforce data protection law. Public statements from the Commissioner – and political pressures exemplified by the previous government’s proposal to give Ministers powers to influence ICO priorities in the Data Protection and Digital Information (DPDI) Bill – suggest their priorities have been swayed by resource pragmatism and political saliency.
Internally, the fallout from various unfavourable legal rulings, an over-focus on “assurance” initiatives, and the challenge of getting to grips with emerging technologies have all created a culture of enforcement caution. Externally, the ambiguity of the regulatory Growth Duty, and demands that the office engage with data protection reforms and digital regulations beyond its direct remit, have also distracted the ICO from enforcing the law. Exacerbating this all is a lack of independent oversight and constructive challenge: the government has seemed inappropriately keen to shape the office’s strategy, whilst parliamentary attention has been ad hoc and piecemeal.
Summary of recommendations
To address these challenges, and ensure the ICO’s enforcement approach adequately upholds the public’s data rights, we make eight overarching recommendations:
Recommendation 1
The ICO’s forthcoming Regulatory Action Policy should prioritise transparency and clarity and be subject to regular external review. Options and actions for doing so include:
- A biyearly independent audit of the Regulatory Action Policy, evaluating both how the ICO is implementing its policy, and its impacts on regulated entities’ data practices.
- Turning the Regulatory Action Policy into a live document with a clear hierarchy of enforcement policies. This should clearly articulate how enforcement-related policies interact with each other and be easy to navigate (and by extension scrutinise) in one document. The document must be updated before any substantial change in enforcement approach takes effect (rather than changes being announced ad hoc by the Commissioner at semi-public events).
- Explaining how technology’s potential for systemic impacts on equalities and human rights is factored into the enforcement strategy.
- Including a statutory requirement within the Data Use and Access (DUA) Bill for the ICO to publish their assessment logic and evidence base for all enforcement actions. This must also include cases they have decided not to investigate following UK GDPR complaints past a certain reasonable threshold.
Recommendation 2
Independent research and legislative reform should be used to benchmark the ICO’s private sector enforcement approach against other data protection authorities. Options and actions for implementing this recommendation include:
- Amending the DUA Bill to mandate the ICO to publish a list of priority sectors for enforcement, where widespread data practices set problematic norms and cause harm (for example social media platforms’ illegal use of children’s data, and the opaque adtech market). This should include information about the potential risks to equal and fair outcomes through an equalities assessment.
- UK Research and Innovation funding ongoing independent research benchmarking ICO performance against international comparators. This is compatible with the research council’s mission to enrich lives and drive economic growth, given the important role data protection compliance plays in both. This research could be extended to other regulators with cross-economy remits.
Recommendation 3
The ICO should use the full range of its enforcement powers in the public sector – until and unless it can prove alternative approaches result in a substantial improvement in data protection compliance. Options and actions for implementing this recommendation include:
- Publishing all evidence resulting from the two-year “public sector approach trial” where public sector organisations were only fined as a last resort. If the evidence paints the pilot in a positive light, the ICO should launch an external consultation and enable an independent audit of relevant data to validate its findings.
- Parliament exploring approaches for mitigating the potential impact of public sector fines on public services and data protection breach victims. This could, for example, include ensuring a proportion of income from fines is invested in improving public sector data protection practices, or through establishing compensation or financial support funds for people impacted by breaches.
- The DUA Bill banning the ICO from issuing more than one reprimand to an organisation. Any subsequent breaches should result in an escalation of action – not additional “final reprimands” that both undermine the premise of the initial reprimand and have little impact on behaviour.
- The DUA Bill requiring the ICO to publish a league table of public sector bodies’ SAR performance. Organisations that consistently fail to meet SAR compliance standards could then be prioritised for enforcement.
Recommendation 4
The ICO should publish “lessons learnt” and develop international agreements that reduce the risk of enforcement action challenge. Options and actions for implementing this recommendation include:
- Securing commitments from international regulatory agencies (where formal cooperation agreements exist) to compel organisations subject to enforcement actions in those regions to demonstrate how they comply with UK data protection law. This should include the European Data Protection Board and international DPAs, and other UK sectoral regulators such as the CMA where relevant.
- Conducting an internal review of decision-making underpinning enforcement actions overturned by the Information Tribunal, to identify the root causes of failure to meet legal standards. This evidence should be periodically reported to the Science, Innovation and Technology Select Committee, or Parliament.
Recommendation 5
The data protection risks of AI should be managed through better use of ICO transparency and data restriction powers, and legislative reforms to promote risk transparency. Options and actions for implementing this recommendation include:
- Establishing a mandatory UK-wide public sector AI registry through the DUA Bill. This would ensure transparency to citizens using these systems, and enable external scrutiny of the ICO’s decisions not to investigate these applications. This could follow the precedent set by the Scottish government AI Register.
- Issuing temporary data processing prevention orders against providers of high-risk emerging technologies with systemic privacy impacts, until these applications can demonstrate compliance with data protection law. This could include frontier AI models demonstrably trained on UK citizen data or automated public sector decision-making, and follows the precedent set by other European DPAs.
- Compelling frontier AI model developers to provide the ICO with detailed information about the provenance of model training data. This legal requirement could be enshrined in the DUA Bill, or in the forthcoming AI Bill.
- Publishing an Action Plan for the ICO to deliver on its international treaty commitments on AI safety. This could be incorporated in the updated ICO Strategic Approach on Regulating AI.
Recommendation 6
The ICO should clarify how it interprets the Growth Duty in its enforcement approach. Options and actions for implementing this recommendation include:
- Including explicit detail on how it will prevent unfair competition and consumer harm from data protection non-compliance in the ICO’s updated Regulatory Action Policy. This is a Growth Duty obligation. In doing so the ICO should formally consult with the CMA and refer to competition law enforcement decisions where the competition implications of data assets were considered.
- Ensuring the list of priority sectors for investigation (outlined in recommendation 2) explicitly factors in areas where data protection practices may create unfair competition.
Recommendation 7
The government should commit to providing additional funding to the ICO for functions that solely focus on engaging with non-data protection issues (for example online safety).
This would ensure these functions do not come at the expense of delivering the ICO’s core regulatory remit, and could be part of ICO reforms considered in the DUA Bill.
Recommendation 8
Oversight of the ICO should be strengthened through reform of Commissioner appointment procedures, Select Committees, and legal institutions. Options and actions for implementing this recommendation include:
- The Science, Innovation and Technology Select Committee establishing a Sub-committee on data protection effectiveness and reforms. This would provide independent scrutiny of the proposed DUA Bill (following the precedent of the sub-committee on the online safety regime), and the ICO.
- Transferring to the Science, Innovation and Technology Select Committee responsibility for the budget and the appointment process of the Information Commissioner’s Office. Currently, the Information Commissioner remains a Ministerial appointment, and select committee opinions on appointments as part of pre-appointment scrutiny are non-binding. Making the Information Commissioner a parliamentary appointment would put the office at greater arm’s length from government, and is likely to foster more active parliamentary oversight.
- Giving the Science, Innovation and Technology Select Committee a veto on ICO appointments, as a less ambitious alternative. This would begin the process of ensuring the ICO’s independence from government and give a parliamentary committee more political responsibility for ensuring appointments are successful.
- Establishing a Data Rights Ombudsman with powers to adjudicate data subjects’ appeals against how the ICO has responded to their complaints. A new independent body is necessary to deal with the volume of potential appeals, which the Information Tribunal does not currently have the capacity to handle. This body could also provide valuable insights (through caseload data) on whether and how the ICO is effectively responding to public complaints.
- Providing funding and legal powers for the Equality and Human Rights Commission (EHRC) to periodically and publicly review the state of data protection-related rights in the UK. This would ensure comprehensive scrutiny of data protection from the perspective of fundamental rights – a precondition to promote inclusive growth and ensure that the public can reap the benefits of innovation rather than be damaged by its externalities.