Fighting and winning for privacy: where was the ICO?
The Government admitted that their Test and Trace programme was operating unlawfully, but we should never have had to threaten legal action. We ended up asking ourselves: why is ORG having to do the regulator’s job?
Bad signs from the start
Right from the start of the pandemic, the Government’s attitude to privacy has been worrying. Matt Hancock, within days of the first emergency measures, claimed that data protection laws no longer applied. While this could be argued to be a misconstruction rather than a policy position, it was a dangerous starting point for the minister in charge of the nation’s health data.
A data store was set up involving the surveillance company Palantir and its Foundry analytics platform. Privacy did not look like a Government priority. Yet it matters: the public needs to trust these systems.
The App: paving the way with good intentions
ORG took immediate action when we heard that the Government was choosing a “centralised” contact tracing technology, using Bluetooth to map contacts. While the goal was worthy, amassing a database of proximity events would have carried significant, and arguably unnecessary, risks. At the very least, we said, a Safeguards Bill would be needed, as proposed by Lilian Edwards and later the Human Rights Committee in Parliament. Vulnerable groups would be particularly likely to avoid participation in a system that was hard to trust.
It soon became apparent that other Governments were moving away from “centralised” solutions for a variety of ethical and technical reasons. Groups like DP-3T had acted fast to show that privacy-preserving alternatives were practical. Their work was picked up by Apple and Google, who decided to support this decentralised model. They are no doubt wary of the consequences of a variety of states and other actors using Bluetooth to map people’s contact events.
The German government was among those that changed course, along with nearly every other European country, except France. Norway was forced by its data protection regulator to abandon its centralised app, as too invasive and of too little use. France, too, is under pressure from its data protection regulator to make changes.
But the UK Government was never under serious pressure from our data protection regulator. When the Isle of Wight trial was rolled out, the App did not have a finished, published Data Protection Impact Assessment. What’s more, the document hadn’t been shared with the Information Commissioner, a fact that Elizabeth Denham made clear in her submission to the Human Rights Committee.
It is unlawful to launch services and apps that involve large-scale processing without completing a DPIA. We were therefore very worried, given the kinds of risk that could flow from collecting and storing data about who meets whom, and from broadcasting Bluetooth ‘pings’ between millions of devices that could be listened to and analysed by third parties. If they hadn’t looked, then they couldn’t know what risks they were going to run.
Furthermore, the Government had a duty to share their DPIA with the ICO: when a programme involves a large-scale intrusion and the risks cannot be fully mitigated, the DPIA must be shared with the regulator. The risks of using Bluetooth technology to collate data in this way simply cannot be fully mitigated, yet the Government did not feel that they needed to talk to the ICO.
Answers needed: action taken
We asked for donations from you, our supporters, and instructed our lawyers at AWO to write to NHSX, asking where the Data Protection Impact Assessment was and demanding that it be shared with the public and the ICO. All the while, the Government repeatedly assured both the public and us that they were involving the ICO in their plans, which should therefore be trusted. When they did publish the DPIA, experts judged it to be very poor work: key risks were underplayed or ignored.
The Government’s App was clearly in trouble. It hit technical snags that we had highlighted, that the Government had denied existed, and that they had in fact known about all along. Eventually the App was delayed indefinitely.
Test and trace: who cares about ‘data protection’?
As delays mounted, attention shifted to the wider Test and Trace programme, which would have to compensate for the lack of an app by filling the void with more manual contact tracing.
The programme was launched earlier than expected, and in an obvious rush. Speculation mounted about the reasons why – the Government needed some good news in late May.
Newspapers reported the alarming news that data stored in the system would be kept for 20 years. From our perspective, though, what was even more startling was the admission to Politico that yet again no Data Protection Impact Assessment for the programme had been conducted.
We demand answers
This was a complex programme that would collect the records of tens of thousands of people, that involved a number of complex IT systems, that brought in new contractors like SERCO and Sitel, and that hired thousands of new, temporary employees to phone, email and trace people. As we later learned, it would also involve ad hoc collection of records by pubs, restaurants and bars. Yet no assessment of the data protection, security and privacy risks seemed to have been carried out.
So we again instructed our lawyers at AWO to write to the Government, asking where this assessment was and whether they were going to conduct one. In their response, the Government evaded our questions, pointing to assessments of parts of the system rather than of the whole programme. They did, however, concede that the retention period should be reduced from 20 years to eight.
Patient data on social media, bar staff stalking customers
Problems with privacy soon began to emerge. The Sunday Times reported that workers were sharing patient data in semi-private social media and WhatsApp groups. The Government simply blamed the workers concerned, but the real reason appears to be that the Test and Trace system is so badly managed that employees were resorting to social media to discuss problems with colleagues.
The solution – private chat software like Slack or Element – appears to have been absent. This kind of risk is exactly what a DPIA is designed to identify and mitigate.
Worse has emerged from the informal collection of data by bars and restaurants. This has ranged from businesses collecting data for marketing purposes to allegations that contact details have been used for stalking and harassment. The Government has a duty to mitigate these risks, as this data collection is part of the overall programme.
We threaten Judicial Review: the Government concedes
Eventually, we lost patience. As reported in Wired, we threatened the Government with a judicial review to force them to take the first basic steps to ensure your data would be safe – and asked for your help to make this happen. We raised the funds we needed in three days.
After two weeks, the Government conceded that they were in the wrong (PDF). They wrote back to us agreeing that it was a legal requirement for them to have conducted a DPIA before commencing the programme, that they had not, and that they now would.
The story was covered on BBC Today, Wired, the Guardian and Sunday Morning Live.
Questions in Parliament
Matt Hancock was asked in Parliament by Caroline Lucas MP why he hadn’t conducted a DPIA. He said that he “wouldn’t be held back by bureaucracy”, and that he had done three DPIAs (for parts of the system) – therefore “that’ll do the trick”. That is not the view of his lawyers.
Where was the ICO?
What we have learned in this saga is that the Government did not feel they had any reason to abide by their legal duties to ensure data protection, or to subject other decisions affecting privacy to public scrutiny. During this pandemic, other data protection regulators have forced their Governments to change their practices. But where is our regulator? What have they done?
They signalled a lax attitude right at the start, creating a general enforcement ‘pause’. They applied for the Information Tribunal to be suspended. And when the Government was plainly ignoring them, they spoke to Parliamentary committees and, at most, signalled their disquiet.
The regulator has powers to demand information, make assessments and enforce changes that they say must take place. Other regulators have taken strong action over tracing apps, even acting to suspend the use of one in Norway. These powers have not been exercised in the UK during the pandemic, even when basic errors appeared that put public trust in the schemes in danger. Instead, ORG and our supporters have had to instruct lawyers to force concessions from the Government.
The ICO must hold the Government to account, and explain how it will do so. This matters: if privacy protections fail, then we risk a breakdown of public trust, which becomes a public health problem.