ICO’s failure to enforce is putting the public at risk

  • A new analysis of the ICO’s Annual Report shows the regulator is overly cautious when enforcing data protection law.
  • Open Rights Group’s Alternative ICO Annual Report shows that the regulator’s controversial policy of avoiding fines for public sector organisations is failing to deter those organisations from breaching data protection law.
  • The report is published in the wake of the regulator’s decision to close multiple complaints about Meta’s AI data grab.
  • The report highlights problems with the independence of the ICO, which could be weakened further by the Data Use and Access (DUA) Bill.

Alternative ICO Annual Report

Evaluation of the ICO’s investigations into breaches of data protection law and our recommendations to strengthen the data watchdog


Digital rights campaigners have published the Alternative ICO Annual Report, an analysis of the actions taken by the ICO against companies and organisations that have breached data protection law. The report finds that the ICO’s overly cautious approach to enforcement is putting government and corporate preferences before the needs of the public.

The report also finds that very little enforcement work was done in the private sector on ‘complex’ data protection issues such as profiling or the use of artificial intelligence.

It also examines the regulator’s controversial policy of not fining public sector organisations except for the most serious data breaches. As a result, organisations have received only reprimands for data protection breaches that had harmful consequences for the individuals affected.

In one case, Thames Valley Police “inappropriately disclosed contextual information that led to suspected criminals learning the address of a witness”. Thames Valley Police received a reprimand, but the affected person had to move house. University Hospitals of Derby and Burton NHS Foundation Trust (UHDB) failed to process outpatients’ data in a timely way. Again, it received only a reprimand, yet for some patients this meant medical treatment was delayed by up to two years. Repeat offender West Midlands Police made a “catalogue of errors” during 2020-2022, mixing up the personal data of a crime victim and a suspect. This included attending the wrong address when trying to locate a person about serious safeguarding concerns and visiting the school of the wrong child. A reprimand was clearly not a strong enough deterrent in this context, because West Midlands Police did not take steps to rectify the error with the urgency required.

Some authorities have failed for multiple years in their subject access duties to provide individuals with their personal information, yet have received only ‘reprimands’ from the ICO, including a ‘final reprimand’ in 2023-24 (see page 14 of the report).

ORG is calling for the ICO to use its full powers against public sector organisations, including enforcement notices and, where necessary, fines.

Executive Director of the Open Rights Group, Jim Killock, said:

“In an increasingly digital world, data protection is vital for our personal security. The ICO’s reluctance to take enforcement action, alongside its policy of not challenging public sector organisations where needed, is not working.

“As we see the development of AI technology and its increased use by public sector organisations, we need strong data protection laws and a strong regulator who will act as the first line of defence for the British public.”

Closing down complaints about Meta

The Information Commissioner’s Office (ICO) has also closed down complaints made by Open Rights Group staff about Meta’s plans to process users’ personal information for AI development. ORG staff had asked the data regulator to open a formal investigation into the company’s plans to scrape their posts, images, videos and comments.

The ICO told ORG that it was closing the complaint after it had “engaged” with the social media giant, but refused to share details of this engagement. It also would not meaningfully justify its decision to close ORG’s complaint and several other complaints lodged by members of the public. The ICO has said that it does not have to disclose information shared by organisations under Section 132 of the Data Protection Act.


However, the regulator has publicly stated that it is satisfied by changes Meta has made to the opt-out process, including making it simpler for users to object to the processing of their personal data and giving them a longer window to do so. While Meta has slightly simplified the process for opting out, the mechanism remains unnecessarily difficult, confusing and hidden in one hyperlinked word within a 166-word notification.

More than 600,000 people have posted fake ‘Goodbye Meta’ posts on their Instagram and Facebook accounts in the mistaken belief that this would prevent the social media company from using their images for AI development. The proliferation of these fake posts highlights the lack of clear information being provided to the public by Meta, and why people should have to opt in before their personal data can be used. Yet the ICO has been silent on why it believes the requirement to seek opt-in consent does not apply to Meta.

Data Use and Access (DUA) Bill


The DUA Bill, published this month, will undermine the ICO’s independence and fails to shield the regulator from undue interference by government or corporations. The Government will remain fully in charge of selecting members of the new Information Commission at its own discretion and of determining their salaries. This allows the Government to interfere with the independence of the ICO, which already faces criticism for its failure to stand up for the public against both government pressure and corporate pressure from companies like Meta. There are no measures for Parliament to approve appointments, or for individuals to challenge the ICO in court, which makes it very hard to contest substantive ICO decisions, even at Judicial Review. Likewise, there are no measures to shield the ICO from conflicts of interest and regulatory capture, such as addressing the problem of revolving doors.

AI enforcement gap in the UK

The ICO is leaving the one-sided commercial exploitation of our data to train AI unchecked and unchallenged


ORG’s complaint about Meta

Our complaint to the ICO about Meta’s plans to take users’ information to “develop and improve AI”
