Parliament must hold the ICO to account
The Digital, Culture, Media and Sport Committee of the House of Commons will soon hold a hearing with the Information Commissioner. In our view, the regulator is failing to properly enforce data protection rules. Something is very wrong, and it needs addressing.
In 2018, when the GDPR came into force, the ICO acquired new powers and responsibilities. These included new teeth, such as the ability to fine companies up to 4% of their annual global turnover. GDPR also gave individuals new rights over their data – rights which the ICO is responsible for upholding. Yet it is now clear that the ICO is failing to use its powers and responsibilities to deliver GDPR’s regulatory expectations.
For us at ORG, the greatest of these failures is its work on Adtech, but there are many others. In recent weeks, we have highlighted how its strategy is systemically flawed.
GDPR was always going to require a big change in mindset from the ICO, which had only recently acquired enforcement powers and the ability to fine more than token sums. It arrived at its enhanced GDPR obligations with a weak culture of enforcement, as well as a legacy of only targeting high-profile problems, such as operations blacklisting trade union members or automated spam “robo-calls”.
GDPR, however, envisages regulators which proactively work to prevent and punish a wider range of data abuses. This proactive enforcement was meant to create both carrots and sticks for companies to abide by the law.
The DCMS Committee should ask the following six questions.
1. Why do the ICO’s fines nearly wholly target data leaks, spam, and robocalls?
Despite GDPR’s enhanced powers, the ICO’s fining practices have continued to target companies for very simple abuses, ones which are clear and easy to understand. Fines have been issued for spam emails and calls sent without consent, and for security failures that led to vast amounts of customer data being leaked, but only once for a more complex issue.
What is not being tackled through GDPR’s powers, on the other hand, are abuses of data which are systemic and egregious but also more legally complicated. Adtech is a clear example of this.
A few weeks ago, Experian and two other data brokers were found to be acquiring personal data records obtained through data subjects’ consent, and then using the information for new marketing and profiling purposes to which those individuals had not consented. This is an abuse of trust, privacy and consumer rights, and it made them a lot of money. Two of those data brokers agreed to stop. Experian, on the other hand, refused. The ICO issued an Enforcement Notice, ordering it to stop. Experian can appeal the Enforcement Notice if it wishes.
The ICO’s decision to hold back from issuing fines sends a clear message. No matter how bad your reasoning is, no matter how rich your company is, and no matter how much money you make from data misuse, the ICO will almost never be willing to issue a financial penalty, so please feel free to carry on ignoring GDPR.
2. Why is the ICO silent when the Government fails to respect citizens’ data rights?
The ICO made some general observations about the COVID-19 Tracing App for England and Wales when it was asked to do so by Parliament, but failed to make any public statement when the app was launched without completing basic data checks, including the data protection impact assessment (DPIA) required by law. When the DPIA appeared, it was shoddy. The ICO said nothing.
When the call-centre Test and Trace scheme was launched involving multiple contractors, software platforms, and temporary staff, no over-arching DPIA had been carried out to assess the risks. The programme was therefore launched, and has been operating, unlawfully. Since then, multiple data failures have emerged. The ICO has said nothing. As far as we know, the scheme still has not completed its DPIA.
Test and Trace was relaunched on a statutory footing, asking venues to record customer personal data. The Government has taken no direct responsibility for this data collection to ensure that it is legally compliant. The ICO has said nothing.
During the pandemic, other European data protection watchdogs have done their job, calling out their governments for failing to safeguard public privacy and, in Norway’s case, going so far as to suspend the use of the app. Ours, however, has stated that it views its role over the Government’s pandemic data practices as that of a “critical friend”.
The result is that the Government feels no pressure to improve its privacy and data protection practices, or to take steps to safeguard user privacy, with a “friendly” watchdog prepared to look the other way.
Moreover, ORG, Big Brother Watch, Foxglove, and other NGOs have had to step in to challenge privacy abuses which should properly be resolved by the regulator.
3. Why did the ICO do nothing to tackle automated A-level results?
When the scandal over automated A-level results broke, we learned that the ICO knew about the scheme and had been speaking with Ofqual. It would have been obvious to the ICO that the students whose education was placed in jeopardy by the marking system were owed the right to human review of those results under Article 22 of GDPR, as these were automated decisions with a legal or similarly significant effect.
The ICO eventually issued a very weak statement, washing its hands of the problem and saying that students should complain to Ofqual:
“Ofqual has stated that automated decision making does not take place when the standardisation model is applied, and that teachers and exam board officers are involved in decisions on calculated grades. Anyone with any concerns about how their data has been handled should raise those concerns with the exam boards first, then report to us if they are not satisfied.”
This was a calamitous strategic decision. The ICO had a chance to educate the nation’s young people about key digital and legal data rights, including the right to human review and the need for data processing to be fair. A generation would have understood how important data protection rights are for their futures.
By contrast, in Norway, the data protection authority launched an investigation into the International Baccalaureate scheme for operating a similar automated results system, pointing out that fairness is a basic data protection principle. Why then did the ICO fail to act?
4. Why has the ICO failed to draw the line on profiling by political parties?
Several years into an investigation into profiling by political parties, the ICO released some recommendations on ways to improve data protection practices.
However, the critical question that remains is the use of electoral register data for personal profiling. All major parties have been found to be appending general profiling information about income and broad demographic characteristics to voter records. The Conservative Party, in particular, was found to be profiling voters by religious and racial characteristics.
Much of this profiling is likely to go beyond what data protection law allows, but the ICO still has not issued clear advice about what is acceptable. Meanwhile, parties will be preparing for the 2021 elections and are purchasing profiling information now. The lack of clear guidance and enforcement is failing both parties and voters.
5. Why is the ICO delaying enforcement in the Adtech industry, while other DPAs are litigating?
In Belgium, the data protection authority has opened formal proceedings against the Adtech industry body IAB, following the results of its investigations. The UK’s ICO, however, has closed our formal Complaint and gives no timescale for action. Widespread, unsafe sharing of millions of people’s personal data continues.
6. If the ICO is seen as ineffective or unaccountable, how will this impact an EU adequacy decision?
The EU will require the UK to have an effective regulator in order to grant a data protection adequacy decision. This means that the ICO needs to show that strong, effective, and proportionate action is taken when information rights are abused.
The ICO’s current track record, outlined above, in which abuses pass without comment or adequate enforcement, calls this into question.
Another issue for the adequacy evaluation is recourse and accountability. The ICO’s complaints procedure appears to be broken, as it does not seem to believe that complaints need a resolution at all.
The EU should also look at how accountable the regulator is to the courts. The importance of a right to a remedy when things go wrong was strongly affirmed in the Schrems II judgment. Currently, the Information Tribunal has restricted itself to examining the ICO’s work in narrow procedural terms. This seems to have led the ICO to believe that it can do as it wishes with Complaints, as the Tribunal will give it absolute discretion in how they are dealt with. No doubt that is a factor in the avoidance we have seen on many of the issues above.
The DCMS Committee can help push the ICO in the right direction
The DCMS Committee has a great opportunity to start to examine how effective the ICO is. Many people are upset about the ICO’s work and are questioning its ability to enforce the law. If GDPR is going to deliver greater public trust and respect for individuals’ data rights, then it needs to be enforced. When Parliament agrees laws, they should result in behaviour change. Parliament should take a hard look at the work of the ICO and ask if it is living up to expectations.
Support our fight
Help us protect your data from the Adtech industry by taking the UK’s privacy regulator to court.
Back our action