Effective use of IT by Government
ORG’s response concentrates on these four questions:
- How well is IT used in the design, delivery and improvement of public services?
- What role should IT play in a ‘post-bureaucratic age’?
- What skills does Government have and what are those it must develop in order to acquire IT capability?
- How appropriate is the Government’s existing approach to information security, information assurance and privacy?
Our evidence concentrates on our experiences in Open Data, with the Information Commissioner and through our Database State seminars.
How well is IT used in the design, delivery and improvement of public services?
ORG would like to make some general points relating to government and IT. First, treating IT as a standalone area for policy is a mistake: a large body of modern work on systems practice shows that such an approach leads to problems.
ORG’s seminar project – the ‘Database State Seminars’[i] – involved practitioners from across government IT projects. Although the starting point for our involvement was that several projects were potentially in breach of fundamental privacy rights, and others carried substantial privacy risks relating to scale, access or data management, a constant theme among the participants was that IT was not being used in a coherent way to deliver the services and results that were expected.
In our Health seminar, for instance, participants observed:
A … concern is Government at all levels’ tendency to make arbitrary decisions. Researchers and advisors are simply unable to provide constructive solutions if problems are deemed politically infeasible. If their managers, particularly those popularly elected, are decided on a course of action, there is a preference to tinker at the edges …
This conservatism is evident in official responses to the broadly constructive comments made as part of Government-commissioned studies of health systems. Evidence-based concerns identified by information security researchers are typically ignored if they do not fit the political view of a project’s development.
Another explanation for this failing is that competing interests – healthcare professionals, suppliers, civil servants and patient groups – force projects in too many directions. Without appropriate coordination, these divergent interests encourage money-wasting stasis rather than clear decision-making.[ii]
Similarly, the Children and personal data seminar participants concluded:
The basic concern, then, is that these systems do not actually lead to better outcomes. In practice they encourage carers to defer responsibility to the technology and to colleagues. In the US, studies tentatively suggest the systems’ main use is to allow carers to shift the blame for their failings. …
What is clear is that [systems such as ContactPoint] reduce the amount of face-to-face time that carers actually spend with vulnerable children. The systems mandate an over-reliance on data entry and pooling, which means time with a machine rather than with care recipients. The emphasis should instead be on quality, timely interventions. Ensuring the right information is available to the right people is one factor that enables such interventions, but these computer systems are only one tool of the trade and not a total solution.
In most cases, gathering the data is useless, because most children will not require an intervention, whereas those most at risk may not exhibit any clear indicators showing which low-level problems will escalate into the most serious matters. It is in these cases that carers should be speaking to the children, their families and teachers to get a sense of what they can do to help.
The data that is stored is largely subjective, relying on individual conceptions of appropriate or inappropriate behaviour.
The impression we are left with is that in both health and child care, databases have been built to answer a problem rather than as part of a coherent system. We conclude that:
1 IT is not a standalone policy area
To take a real-life example: you don’t want a policy on shovels when your actual problem is gardening – you need a policy on gardening. The question, therefore, is not how to use IT, but “How do we manage this problem, and where does IT fit?”
Recommendation 1: the review should consider ways to learn from modern systems practice and allow government systems to focus on the needs of whole systems
2 Systems need ‘Evolution not revolution’
We recommend that government adopt the principle of “Evolution not revolution”. UK policy, in our view, suffers from a political imperative that “Something must be done”, and reacts with “big announcements”, rather than with an approach best encapsulated by a response to a problem such as:
“It’s already on the version 3.14 roadmap for 2012, testing in late 2011”
Recommendation 2: adopt incremental improvement as the strategy for improving services
3 Systems can be designed for end users rather than departmental needs
Many government systems seem to be designed around the needs of government departments or officials rather than of real users, whether staff or citizens. These symptoms can be seen in the health and children’s database projects outlined in the Database State Seminar project.
One example ORG has been involved with is the NHS Summary Care Records. There have been many problems with this project, which seem to stem from:
- A likelihood that they are not needed in real-life emergency medical situations
- A lack of understanding of how they might be used and therefore how to implement the system
Our briefing pack highlights two contrasting projects that demonstrate these issues:
An example of good practice in this area was the former Public Health Laboratory Service, which allowed public health doctors to log individual cases of infectious disease to monitor epidemic spread, while identifiable case data was not shared with information systems outside the PHLS. This approach assured confidentiality: there were no recorded cases of data leaks.[iii]
Summary Care Records in England and Scotland give another contrast between bad and relatively good practice, according to our speaker at the seminars:
The English Summary Care Record (SCR) has never had a clearly defined use case. It was designed with no idea who the users would be or what tasks they would use it for. It was built on what might uncharitably be called a Field of Dreams brief: “If you build it, they will come.” Connecting for Health maintain that SCR will grow over time and meet a number of, as yet unspecified, needs. Nobody knows what it will grow into, nor whether this is the right kernel from which to grow.
The designers don’t understand how the record will be used. As a result the system in general, and the security measures in particular, have some obvious design flaws. For example, the system mandates that individuals will have their own access controls and passwords, based on their roles. These security measures are not fit for purpose: sign-on takes up to 90 seconds each time, and there is no guarantee that the role-based access will allow doctors and nurses to see the patient information that they need. Faced with a choice between loss of data and loss of life, in 2007 the A&E department of South Warwickshire General Hospitals Trust declared that they would knowingly break the rules and leave one senior-level smartcard in the computer for the whole shift.
In contrast, the Scottish Emergency Care Record (ECR) started with a specific use case in mind: emergency medicine. There is a small number of users, and specific settings in which the data is used: telephone triage nurses use it to provide medical advice, and to determine whether a patient should be sent to hospital. The nurse must ask the patient permission to view the record, and notification of each access is automatically sent to the patient via their GP. Security measures operate on the principle that the patient should be informed every time their record has been viewed, and by whom.
Recommendation 3: the review should consider ways to allow government systems to focus on actual use cases and the needs of end users and citizens, rather than being driven by Whitehall or narrow political decisions
Recommendation 4: the review should reinforce to government that IT and databases do not solve social issues by themselves
What role should IT play in a ‘post-bureaucratic age’?
Creating a Big Society through open systems
While we recognise that no single approach will work in all circumstances, there are social and economic advantages to favouring an open approach. Furthermore, government spending creates a great deal of economic leverage, and should be used for maximum public benefit.
We understand the ‘post-bureaucratic age’ to be an essential component of the thinking behind the ‘Big Society’: that information and citizen action are essential to the society we develop, as opposed to relying on top-down government projects.
The government must specifically enable:
- Engagement and creation of new value through making data available
- Use and promotion of open standards
Systems should be designed to benefit the whole of our Big Society, which means thinking about public interfaces: not just user interfaces, but data sets, APIs and the ability for people to write their own tools to do interesting things with the data and to interact with government.
For instance, government should not just publish PDFs of consultations or a bare index page. Officials should think about how an index could be based on an open standard (RSS/RDF) and how society could build on that powerfully (see the sketch after the list below).
- Government should use public protocols and exchange mechanisms where possible, both to fight vendor lock-in and to enable efficient interfacing with the outside world.
- Where it can’t use existing ones, it should ensure anything it creates is public, or at least controlled by Government, so that it can be kept open for Big Society use and, again, avoid lock-in.
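To make the consultation-index point concrete, here is a minimal sketch of a consultation list published as an RSS 2.0 feed that anyone can poll and build on. The entries, URLs and field values are purely illustrative assumptions on our part, not real government data.

```python
# Minimal sketch: publishing a consultation index as an RSS 2.0 feed
# rather than an HTML-only page. All entries and URLs below are
# hypothetical, for illustration only.
import xml.etree.ElementTree as ET

consultations = [
    {"title": "Consultation on open standards in procurement",
     "link": "https://example.gov.uk/consultations/open-standards",
     "pubDate": "Mon, 14 Feb 2011 09:00:00 GMT"},
]

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Open consultations"
ET.SubElement(channel, "link").text = "https://example.gov.uk/consultations"
ET.SubElement(channel, "description").text = "Machine-readable index of current consultations"

for c in consultations:
    item = ET.SubElement(channel, "item")
    for field in ("title", "link", "pubDate"):
        ET.SubElement(item, field).text = c[field]

# A citizen, a charity or another department can now poll this one URL
# and build their own tools on top of it.
print(ET.tostring(rss, encoding="unicode"))
```

Because the index uses an open standard, the same feed can be consumed by news readers, aggregators and bespoke civil-society tools without any coordination with the publishing department.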
What skills does Government have and what are those it must develop in order to acquire IT capability?
The current level and diversity of innovation in the IT sector make it impossible to predict which particular technical knowledge government will need to develop in-house in order to acquire IT capability. The in-house skills to develop will probably be high-level ones: systems integration, strategic technology assessment, and the like.
Each department will need to assess which particular skills it should strengthen. The potential list is open-ended: web services, geo-data, linked data, identity, usability, encryption and security, networks, etc.
However, given the size of government and the diversity of evolving challenges, it is impossible to develop every skill and capability. The government should develop a capacity for open innovation strategies that capture independent ideas and go beyond simple outsourcing to large providers.
The best example of open innovation is Procter & Gamble’s Connect and Develop programme, but even within government there are already initiatives such as ideas competitions and crowdsourcing. However, these are currently experiments amounting to a negligible fraction of the large ICT contracts. We welcome the recent announcement of a new “Crown Commercial Representative” to help innovative small and medium-sized enterprises (SMEs) pitch their services to government.[iv]
However, a wider transformation will be required to ensure changes go beyond tokenistic initiatives. If the government officials responsible for assessing these SMEs’ innovations are not capable of making informed technical decisions, they will avoid the risk by default.
Transparency and an open decision-making process for arriving at particular technical solutions are paramount, and these should be decoupled from the tendering and commercial contracting of the services that deliver those solutions.
This would allow the government to tap a large pool of technical expertise that would be very unlikely to be attracted to a civil service or ancillary state service job.
Skills in departments and among policy makers to understand technical challenges
ORG frequently encounters levels of technical ignorance which suggest that government departments may be easily misled by salespeople, or may simply not understand the policy challenges they face. To give some examples from our work:
Electronic and Internet voting
During the Labour government, a policy of electoral modernisation was pursued, with an underlying assumption that more accessible means of voting would increase electoral turnout. Methods ranging from postal voting through to internet voting were examined, and electronic voting trials were conducted. ORG wrote policy briefings and observed the counts.[v]
- The underlying challenge of delivering secure, anonymous, yet transparent and verifiable voting was not understood or examined. Academic study has long viewed this problem as, for practical purposes, currently insoluble; policy, however, was conducted on the basis that there was no fundamental problem with the technology.
- The trials were badly run. This was not purely a question of the providers: government failed to manage the process properly, through its own lack of knowledge and rigour. The pilot providers were deeply unhappy with the pressure they were put under as a result of late contract signings while the delivery date – an election – was immovable. Results were not comparable or useful.
- A fundamental but mundane problem around the accuracy of voter registration became obvious; it could have been identified at the start of the process if electoral modernisation had been thought of as a question about a whole system rather than about ways to add technology.
- The Electoral Commission played a very useful role in identifying tasks such as standard-setting and criteria, and in insisting that cost be an element in current questions such as the use of electronic counting.
- At all levels, a lack of technical expertise seems to add to the problems of understanding and evaluating technologies, and of resisting the simplistic sales pitches of vendors.
Information Commissioner
The Information Commissioner operates a data protection regime based on principles enshrined in law. It might be supposed, therefore, that the role is technology-neutral and that advice need not consider the technical aspects of implementation. In practice, however, the ICO needs a great deal of technical expertise, for instance to:
- Undertake investigations of datasets when breaches occur
- Understand the implications of the use of specific technologies such as cookies, or of new technologies such as RFID or biometrics
- Understand the potential of privacy-enhancing technologies and encryption methods
The ICO has recognised that technology has become central to its work, but we have yet to see how it will build in this expertise, or what effect it will have on its advice and guidance. It is remarkable that it has taken over a decade for the ICO to recognise the problem, and we believe the same is true of many government departments.
How appropriate is the Government’s existing approach to information security, information assurance and privacy?
Security and privacy should be considered at the start of the process
In IT, the security and privacy of individual users are closely linked: privacy relies on security, and both needs have to be considered early.
A number of government IT projects seem to have either underspent on security and privacy, or ignored them until very late, when meeting requirements becomes very difficult. In relation to road pricing, one commercial developer at our seminar project observed:
Time-Distance-Position road pricing is potentially – but does not have to be – highly invasive. From what we can see so far, the DoT is reluctant to take on board the privacy issue as something that needs to be considered from the outset.
Anything that involves technology and money, especially in the public space, deserves to be thoroughly scrutinised through a security and privacy study, as we have tried to get DoT to do with respect to TDP. A lot of people try to tack on security and privacy at the end of system design, but the earlier you think about them the better.[vi]
The impact of government IT systems on people’s privacy can be very large and disproportionate. For instance, ORG examined the impacts of Oyster cards in London, where travel data is retained for two months, well beyond genuine business need. This clearly has a major impact on users’ privacy and is in effect an extension of police surveillance powers.
Similarly, NHS care records, DNA databases and ID cards each had considerable impacts on individuals’ privacy. While the coalition has correctly identified the poor results in several of these cases, the purpose of this review is to ask how these problems may be avoided.
Furthermore, work by No2ID via FOI requests in the wake of the Database State report has revealed that many departments have little idea what databases they maintain and, therefore, what data they hold. The scale of the privacy impacts and risks is consequently impossible to understand.
ORG’s seminar project – the ‘Database State Seminars’[vii] – involved practitioners from across government IT projects. To enhance privacy, the seminar attendees recommended these principles:
- Minimise data: collect only what is needed, and keep it no longer than necessary. Central systems should be simple and minimal and hold sensitive data only when both proportionate and necessary;
- Share data only where proportionate and necessary: Government should only compel the provision or sharing of sensitive personal data for clearly defined purposes that are proportionate and necessary in a democratic society;
- Give subjects ownership of their data: By default sensitive personal information should be kept on local systems and shared only with the subject’s fully informed consent;
- Build for anonymity: Citizens should have the right to access most public services anonymously.
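The first two principles lend themselves to simple engineering disciplines. The sketch below shows one way a system could enforce them mechanically: records are stripped to the fields the stated purpose requires before storage, and expire after a fixed retention period. The field names and the 28-day period are illustrative assumptions, not a reference to any real scheme.

```python
# Minimal sketch of 'minimise data' in code: keep only the fields the
# stated purpose needs, and age records out after a fixed period.
# Field names and the 28-day period are illustrative assumptions.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=28)
FIELDS_NEEDED = {"journey_start", "journey_end", "fare"}  # per the stated purpose

def minimise(record: dict) -> dict:
    """Drop every field the stated purpose does not require."""
    return {k: v for k, v in record.items() if k in FIELDS_NEEDED}

def expired(collected_at: datetime) -> bool:
    """True once a record has outlived its business need."""
    return datetime.now(timezone.utc) - collected_at > RETENTION

record = {"name": "A. Citizen", "journey_start": "Bank",
          "journey_end": "Holborn", "fare": 1.80}
stored = minimise(record)                 # identity never enters the store
collected = datetime.now(timezone.utc)    # timestamp kept for retention checks
assert "name" not in stored and not expired(collected)
```

The point is not the particular code but that minimisation and retention can be default behaviours of a system, rather than policies enforced (or not) by later manual housekeeping.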
Recommendation 5: mandatory privacy impact assessments for any IT scheme affecting large numbers of users
Recommendation 6: clear retrospective mapping of the government data held on individuals, and transparency over who is holding what data
Recommendation 7: accept the principles of minimising data, limiting data sharing, giving users ownership of their data and building systems that allow anonymous access to government services
Appendix: Database State Seminars Report
The Database State report — the first ever comprehensive study of the area — showed that public sector databases, such as the DNA Database and ContactPoint, are out of budgetary and regulatory control.
Billions of our most personal details are stored in and have leaked from databases that are not fit for purpose. These data caches typically harm those they are intended to benefit, including society’s most vulnerable groups.
The possible solutions to, and underlying drivers of, these problems were debated in a series of seminars convened in November 2009. The resulting briefings summarise the discussions to encourage further engagement with Parliamentarians and key decision-makers.
These key principles were proposed and, broadly, accepted as necessary for future public sector IT projects:
- Minimise data: collect only what is needed, and keep it no longer than necessary. Central systems should be simple and minimal and hold sensitive data only when both proportionate and necessary;
- Share data only where proportionate and necessary: Government should only compel the provision or sharing of sensitive personal data for clearly defined purposes that are proportionate and necessary in a democratic society;
- Give subjects ownership of their data: By default sensitive personal information should be kept on local systems and shared only with the subject’s fully informed consent;
- Build for anonymity: Citizens should have the right to access most public services anonymously.
The Database State seminars: health and personal data
Speaker notes 1
In the last 30 years, we have moved from a doctor-patient relationship based on paternalism to one based on partnership, which requires confidentiality, consent and respect for the patient’s autonomy as its underpinning principles. We require doctors to get consent for a course of treatment or an operation – but it is often seen as less convenient to get consent for the use of patients’ data for research purposes. This is part of a huge paradigm shift from seeing people as individuals to seeing them as bundles of data.
People’s data should not be treated with less respect than their persons. Any clinician understands the convenience of being able to search a database rather than piles of paper records, but we must not trash the principles of confidentiality underpinning the doctor-patient bond by abandoning the principle of consent for sharing. Searching for a memorable phrase, I have described personal health data as ‘the virtual naked patient’.
In China, young criminals were executed to provide organs for transplant to foreigners on the basis of tissue-typing data transferred from medical records to the police. Governments can abuse information doctors have gathered to target patients. Maintaining a separation between government and its agencies on the one hand, and the information gathered by health professionals on the other, is increasingly important. Even in this country, targeting is not unknown: 37 per cent of black men are on the DNA database, but only 10 per cent of white men.
An example of good practice in this area was the former Public Health Laboratory Service, which allowed public health doctors to log individual cases of infectious disease to monitor epidemic spread, while identifiable case data was not shared with information systems outside the PHLS. This approach assured confidentiality: there were no recorded cases of data leaks.
To uphold the ethical practice of medicine, it is essential that clinicians see themselves as primarily serving the interests of their patients and do not allow themselves to metamorphose into government informers.
Speaker notes 2
A tale of two care-records: the English Summary Care Record and the Scottish Emergency Care Record.
The English Summary Care Record (SCR) has never had a clearly defined use case. It was designed with no idea who the users would be or what tasks they would use it for. It was built on what might uncharitably be called a Field of Dreams brief: “If you build it, they will come.” Connecting for Health maintain that SCR will grow over time and meet a number of, as yet unspecified, needs. Nobody knows what it will grow into, nor whether this is the right kernel from which to grow.
The designers don’t understand how the record will be used. As a result the system in general, and the security measures in particular, have some obvious design flaws. For example, the system mandates that individuals will have their own access controls and passwords, based on their roles. These security measures are not fit for purpose: sign-on takes up to 90 seconds each time, and there is no guarantee that the role-based access will allow doctors and nurses to see the patient information that they need. Faced with a choice between loss of data and loss of life, in 2007 the A&E department of South Warwickshire General Hospitals Trust declared that they would knowingly break the rules and leave one senior-level smartcard in the computer for the whole shift.
In contrast, the Scottish Emergency Care Record (ECR) started with a specific use case in mind: emergency medicine. There is a small number of users, and specific settings in which the data is used: telephone triage nurses use it to provide medical advice, and to determine whether a patient should be sent to hospital. The nurse must ask the patient permission to view the record, and notification of each access is automatically sent to the patient via their GP. Security measures operate on the principle that the patient should be informed every time their record has been viewed, and by whom.
The background to the criticisms of the Department of Health’s (DoH) systems is disagreement about the conditions for appropriate access to a patient’s personal records. The DoH has maintained the view since around 1995 that circulation can take place on a need-to-know basis. This is contrary to the view of the General Medical Council, British Medical Association and patient groups that records should only be shared with a patient’s consent.
The privacy-friendly view is now recognised by the European Court of Human Rights. A 2008 ruling under Article 8 of the European Convention on Human Rights, which guarantees a right to a private life, made clear that a citizen will have a cause of action where their personal data is unprotected. The Court found there is an unqualified right to opt out of central and local data collection. This authoritative principle should be paramount when reviewing and commissioning future systems.
Unfortunately, as shown by the ongoing drafting of the information governance principles for the General Practice Extraction Service (GPES), the opt-out standard is typically ignored in the UK in the context of medical records. The principles, in development for around 18 months before being discussed with patients’ representatives, entirely fail to require patients’ consent for information sharing. In addition, the system requirements do not specify appropriate levels of information security.
Such basic flaws fundamentally undermine the relationship between doctor and patient – which should be at the heart of healthcare planning – for scant if any public benefit. Best practice is for data subjects to be told when their data is being used and for what purpose. Patients also need a reasonable degree of confidence their records will not be leaked.
There seems to be a range of institutional and cultural difficulties preventing these basic, top-level requirements from translating into working, delivered solutions. The typically high standards of project analysis and leadership pursued by the private sector are missing in the public sector. Just why standards are appreciably lower is unclear, but the question demands urgent consideration.
Further, and more fundamentally, control over personal information should be returned to individual citizens. This model is now being used successfully by the commercial sector in, for example, Microsoft HealthVault and Google Health. Such an architecture dispenses with the most expensive and cumbersome centralised systems, bringing substantial cost savings and reduced intrusion by the state into the private lives of citizens.
A closely-related concern is Government at all levels’ tendency to make arbitrary decisions. Researchers and advisors are simply unable to provide constructive solutions if problems are deemed politically infeasible. If their managers, particularly those popularly elected, are decided on a course of action, there is a preference to tinker at the edges rather than recommend and implement the kind of wholesale changes recommended by the Database State report.
This conservatism is evident in official responses to the broadly constructive comments made as part of Government-commissioned studies of health systems. Evidence-based concerns identified by information security researchers are typically ignored if they do not fit the political view of a project’s development.
Another explanation for this failing is that competing interests – healthcare professionals, suppliers, civil servants and patient groups – force projects in too many directions. Without appropriate coordination, these divergent interests encourage money-wasting stasis rather than clear decision-making.
Certainly the Government is making encouraging noises and there are signals that progress is being made. But, as noted in the GPES example, current systems development still exhibits the same old mistakes and few signs of learning after decades of sustained criticisms from civil society and Government-commissioned reports.
Finally, if civil society is to engage more effectively and better achieve its goal of securing improvements in Government practice, there may be a need for a shift in the emphasis of criticisms. Rather than pointing only to failings in practice, it may be worth recognising the systems’ benefits and highlighting where they do reflect our intentions. Of course, this more balanced approach is difficult to achieve when the focus of criticism lacks even basic principles of security engineering, particularly when the public interest groups concerned have severe resource constraints and minimal opportunity to engage with decision-makers.
The Database State seminars: law enforcement and personal data
Speaker notes 1
The police service needs clear direction and decisions from Government and politicians about the use of DNA profiling and how the National DNA Database should operate in the light of the ECHR Judgment. It is important to recognize that the DNA Database is an operational database: it generates a line of inquiry that can then be used to initiate an investigation; it doesn’t prosecute or convict people.
The heart of the issue in the ECHR Judgment is the retention of DNA from those who are arrested and not convicted. Even though the research in this area is not as detailed as we would like, those who are arrested and not convicted do generate DNA matches. There will be occasions where individuals who are arrested and not convicted are truly innocent, but others where, for example, a victim will not proceed with a prosecution even though there is strong evidence against an individual and they pose a public protection risk. The balance to be struck between civil liberty and public protection is at the heart of the debate. The police service supports an approach that is based on evidence and assessing risk. Within this debate the safeguards on the use of DNA profiling and the operation of the NDNAD need to be recognized. Removing someone’s DNA profile has limited practical impact if the arrest record is retained.
It is also important to recognize that the technique used analyses redundant, or ‘junk’, DNA, not genetic information in terms of ancestry or human characteristics. Of all the forensic and police databases, the NDNAD has some of the strictest safeguards, if not the strictest: it is overseen by the DNA Strategy Board, which includes the Chair of the DNA Ethics Group, who reports direct to Home Office Ministers; a Forensic Science Regulator, who also has a separate line to the Home Office Minister; and a representative of the Human Genetics Commission. The Information Commissioner is also an observer on the Strategy Board. The Strategy Board is very clear in limiting the use of the NDNAD to the purpose of investigating and detecting crime, and has resisted attempts to gain access for, for example, paternity tests – attempts that have been taken all the way to the High Court.
Speaker notes 2
The number of piles of data law enforcement can fish in is going to continue to grow – it will be ten times the piles now. That is all the more reason to think about policy right now. The question is, and will remain, what observations (data) it is within law and policy for an organization to collect and make sense of. The trouble is that if an organization makes a decision based on certain data and then throws this data away, it is difficult for the organization to prove it did the right thing – making for an accountability problem.
We did a piece of work to help reunite displaced loved ones after Hurricane Katrina. In the contract we included a provision which stated that at the end of the process the data amassed for reunification would be destroyed. We felt this was taking the high ground to ensure the data would not later be repurposed. Three years later, after the data had been purged, I realised that if the system had inadvertently matched people it shouldn’t have (e.g., debt collectors with debtors, the mob with people placed under the witness protection programme) we would not have been able to prove what had happened as the data had been destroyed. Similarly, supposing you find out the DNA algorithm was flawed and you want to rerun it to make sure a match is correct?
We need to be thinking about data tethering and accountability. A police agency that gives a record to another agency, particularly if it’s derogatory information, has to know what data it gave to whom, so the data can be recalled or changed downstream. If the holder of the data takes that person off the list, the change has to flow through.
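This idea of data tethering can be illustrated with a short sketch. The code below is our own simplification, not a description of any real police system: it simply registers every onward disclosure so that a later correction or withdrawal can be propagated to every recipient. The record identifiers, agency names and the notify function are hypothetical stand-ins.

```python
# Minimal sketch of data tethering: record who received which data, so a
# downstream correction or withdrawal can reach every recipient. Agency
# names are hypothetical; notify() stands in for a real secure channel.
from collections import defaultdict

disclosures: dict[str, list[str]] = defaultdict(list)  # record id -> recipients

def share(record_id: str, recipient: str) -> None:
    disclosures[record_id].append(recipient)   # remember who got what

def notify(recipient: str, record_id: str, change: str) -> None:
    print(f"notify {recipient}: {record_id} - {change}")

def amend(record_id: str, change: str) -> None:
    for recipient in disclosures[record_id]:   # the change flows through
        notify(recipient, record_id, change)

share("intel-88", "neighbouring-force")
share("intel-88", "border-agency")
amend("intel-88", "withdrawn: source discredited")
```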
With respect to data quality, I think it’s important for systems to be able to support dissent – don’t throw away the disagreeing piece just because it doesn’t agree; for example, don’t just suppress and throw away contrarian data showing that a suspect was reported to be at a location which was impossible when taking into account other facts.
As with other sectors, there has been an explosion in database building in the context of law enforcement. This growth is taking place despite decisions at the European level constraining the practice of data retention. This indicates a need for more responsible innovation from the authorities deploying new systems. Yet this will only happen as a product of better oversight.
At the root of this issue is the question of whether it is right and proper in a liberal democracy to pursue mass profiling of citizens. Claims of increasing detection and conviction rates must be balanced against the changes caused to citizens’ behaviour. For example, the ubiquitous CCTV camera is not solely an observer. It also has a powerful influence on the behaviour of both criminals and innocents.
The likelihood of unintended consequences makes these questions fiendishly difficult to answer in the context of system design. Even experts cannot predict just what piece of data will or won’t be useful in time, whether viewed by itself or in combination with other intelligence. And once gathered, information has a capacity to replicate ad infinitum. Taken together with the exponential growth in processing speeds, it is increasingly difficult for law enforcement authorities to resist the temptation to stockpile data.
The expectation is that data will be mined by future techniques to serve a useful purpose. But it is more likely that the purpose will be malicious, whether personally or politically motivated. In this context, one fail-safe that must be introduced is the immutable audit log, which would enable traceability and publicity of any interaction with data. Citizens would then be able to ascertain which officers have viewed data pertaining to them, and for what purpose. Immutable audit logs provide accountability that can only benefit law enforcement professionals in the course of their duties, although they do present significant ongoing costs in secure data management.
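One established way to make an audit log tamper-evident is to chain its entries with a cryptographic hash, so each entry commits to everything before it. The sketch below is a minimal illustration of that principle, not a production design (a real system would also anchor the chain externally and protect the log store itself); the field names are assumptions made for the example.

```python
# Minimal sketch of a tamper-evident (hash-chained) audit log: each
# entry includes a hash over its content plus the previous entry's
# hash, so any after-the-fact edit breaks verification. Field names
# are illustrative assumptions.
import hashlib
import json

GENESIS = "0" * 64

def append_entry(log: list, officer: str, record_id: str, purpose: str) -> None:
    prev = log[-1]["hash"] if log else GENESIS
    body = {"officer": officer, "record": record_id,
            "purpose": purpose, "prev": prev}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append({**body, "hash": digest})

def verify(log: list) -> bool:
    prev = GENESIS
    for entry in log:
        body = {k: entry[k] for k in ("officer", "record", "purpose", "prev")}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != digest:
            return False
        prev = entry["hash"]
    return True

log: list = []
append_entry(log, "officer-7", "record-123", "burglary inquiry")
assert verify(log)
log[0]["purpose"] = "personal curiosity"   # tampering after the fact...
assert not verify(log)                     # ...is detected
```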
Further complexity arises because our sense of security operates more as a product of psychology than of reality. Because we do not typically process risks to our person and family on a rational basis, judgements about appropriate responses are usually made regardless of the available evidence. We are thus more likely to accept compromises of our civil rights for threats that are perceived as serious than for those that actually are. This is particularly so where decisions are made by elected politicians, keen to appeal to our sense of safety and protectionist interests.
While the principles of necessity and proportionality, the keystones of our human rights discourse, are a useful backstop and guideline, they will be eschewed for operational need. This is particularly so here because the nuances of their application cannot be coded with any degree of accuracy into a system’s design. There should therefore be greater emphasis on, and acceptance of, these principles – particularly at the level of front-line service delivery – especially with regard to information security and data protection requirements.
In parallel, the processes enabling individuals to secure the removal of their personal data must be made more transparent and simple. It is not enough that citizens have the right to scrub their data from a system; the public service provider should offer a clear route, and help where appropriate, for individuals to reclaim, for example, their DNA.
New databases should not be established without a proper statutory basis, to ensure strict Parliamentary oversight, and must be subjected to a rigorous Privacy Impact Assessment. The initial statutory basis is also an important limiter of future mission creep.
In terms of subsequent oversight of information systems, it is important that bodies are independent of law enforcement authorities to avoid institutional capture. Governance systems should also be subject to Freedom of Information laws, to encourage active engagement with interested public communities, and answerable to Parliament, to ensure rigorous challenges by our elected representatives.
It really is vital that we address these difficult issues now. As time progresses and technology advances further, the capacity for data to be compiled on significant scales will be acquired by individuals as well as institutions. Rogue agents of the state and members of the public will develop data caches on the scale of the systems currently causing concern. Regulating these actors will, of course, be far more difficult than regulating institutional operators.
The Database State seminars: transport and personal data
Speaker notes 1
These major central government projects are a nightmare for a security professional. The ongoing policies of centralisation and aggregation of data fly in the face of common-sense security. We see so many examples where security’s been ignored throughout the specification phase: take ContactPoint, for example. How can access controls possibly be managed when a database has up to 480,000 authorised users?
The problem is compounded by the highly prescriptive and often inflexible security standards that come out of CESG. It has traditionally been very hard to get products accredited for government use, so government security managers are fighting with one hand tied behind their backs. And the risk assessment processes don’t respond fast enough to emerging threats. Two years after the HMRC data loss, CESG is still discussing draft approaches for how to assess the value of personal information.
We need government to rethink how it delivers policy objectives without being so prescriptive about the technology approach. Time and again, centralisation has been shown not to work. Rather than come up with a system design and then try to secure it, wouldn’t it be better to allow good security principles to frame the basic system design, and then let the security managers get on with their jobs using the best possible tools and techniques?
Speaker notes 2
Time-Distance-Position road pricing is potentially – but does not have to be – highly invasive. From what we can see so far, the DoT is reluctant to take on board the privacy issue as something that needs to be considered from the outset.
Anything that involves technology and money, especially in the public space, deserves to be thoroughly scrutinised through a security and privacy study, as we have tried to get DoT to do with respect to TDP. A lot of people try to tack on security and privacy at the end of system design, but the earlier you think about them the better.
Our project, Trusted Driver, sought to implement a privacy-enhancing TDP system. An in-car device has a secure microcontroller (like a chip and PIN card) and GPS but no map database. Periodically, the car sends segments of its journey to a government server to be priced; the returned prices (but not travel data) are stored in the secure microcontroller. Periodically, the car also communicates with a different server to report accumulated charges so the driver can be billed. The two systems do not interact, and there is no data trail for tracking drivers. All communications and data are encrypted and authenticated using public-key cryptography.
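The separation described here can be shown in outline. The sketch below is our own simplification, not the Trusted Driver implementation: the tariff, segment names and accounts are hypothetical, and the public-key encryption and authentication of both channels are omitted. The point is only that the pricing side never learns who travelled, while the billing side never learns where.

```python
# Minimal sketch of the two-server separation: the pricing server sees
# anonymous road segments; the billing server sees an identity and a
# total. Segment names, tariff and accounts are hypothetical.

def pricing_server(segments: list[str]) -> list[float]:
    """Prices road segments; never learns whose journey this is."""
    tariff = {"A40-j1": 0.12, "A40-j2": 0.15}   # illustrative tariff
    return [tariff.get(s, 0.10) for s in segments]

class InCarDevice:
    """Stands in for the secure microcontroller: keeps charges, not journeys."""
    def __init__(self) -> None:
        self.total = 0.0
    def record_journey(self, segments: list[str]) -> None:
        # Prices come back to the device; the travel data is not retained.
        self.total += sum(pricing_server(segments))
    def report_charges(self) -> float:
        total, self.total = self.total, 0.0
        return total

def billing_server(account: str, amount: float) -> str:
    """Bills a known driver a total; never sees where they drove."""
    return f"{account} owes {amount:.2f}"

car = InCarDevice()
car.record_journey(["A40-j1", "A40-j2"])
print(billing_server("driver-42", car.report_charges()))   # driver-42 owes 0.27
```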
We were convinced that our system was sound from the cryptographic point of view because we had it checked by the security group at Cambridge University. But we did not proceed further when there appeared to be little prospect that the system would be moved into a full trial.
Transport systems are becoming a focus for concerns about the state’s mishandling of our personal data. Civil servants and politicians perceive significant economic and environmental benefits in implementing technologies that will enable closer control of public transport infrastructure. Such benefits are not so apparent to citizens and consumers, who more immediately see the potential to misappropriate or simply lose collected data.
These technologies, such as the Oyster Card and other local smart cards, also have the potential to function as tools of surveillance. The combination of location, identification and billing data is a particularly dangerous cocktail. For example, road-pricing systems are recognised to carry significant privacy risks – by creating and storing records of where and when vehicles use the road, which are in turn made available to administrative authorities. Yet these risks are not insurmountable.
As demonstrated in 2007, when 1.7m people signed an online petition against road-pricing suggestions, this is sensitive territory. The strength of public feeling stems both from the notion of yet another stealth tax and from the huge value associated with the sense of freedom provided by cars and the open road. Exactly where privacy enters this hot-button debate is not always apparent, but Government needs to tread carefully and avoid trampling on public expectations of long-held freedoms.
Privacy should be stated as a headline concern in requirements documents when procuring transport systems. These additional requirements will increase the cost of the final system in the short term. But taken across the lifetime of a system, introducing these concerns at a later stage would involve greater costs in terms of PR damage, mistakes and remedial action.
The appropriate means of considering these concerns is a privacy impact assessment (PIA). Although the Department has accepted this recommendation, it has taken too long to finalise these documents. The focus should be on using personal information only with a data subject’s consent, which must be given as part of an informed choice. Providing users with nuanced control over what information goes where is another prerequisite.
Also, to have a meaningful bearing on the proposed systems, PIAs should call for a ‘default’ setting of anonymous citizen profiling. Rather than presume users of systems will want to share personal information, systems should enable minimal sharing of identifiable data.
This ‘privacy first’ approach is necessary in order to realise the associated benefits. The public has, in general, little confidence in officials’ ability to deliver large-scale technology projects and, in particular, to supply a new generation of transport systems. Demonstrating a commitment to privacy, and thereby to customer service, would be a significant step towards winning back public trust.
A second default requirement is that data collected and processed for transport purposes should not be appropriated elsewhere. Blanket access for secondary purposes, for example security concerns, is simply unacceptable and would be fiercely resisted. Of course, warranted access, as required and overseen by the judiciary, is a different matter and will be appropriate in particular circumstances.
The Database State seminars: children and personal data
Speaker notes
There is a very important distinction between child protection concerns and welfare concerns. In some cases it can be difficult to decide which category a family belongs to, but this cannot be resolved by treating all family referrals as child protection. Yet this seems to be at the heart of government policy, with its emphasis on professionals sharing information about all concerns. Sidelining parents and having professionals talk to each other, part of the goal of ContactPoint’s design, should be a last resort – yes, if you’re talking about child abuse, but not if you’re worried that the child isn’t doing well at school.
The problems we’ve seen in child abuse cases have been lack of wisdom, lack of professional competence, or lack of time. Sharing information has been standard practice for decades. In the cases of Victoria Climbie and Baby Peter, there was a wealth of data, but it was not looked at by somebody who knew how to put it together correctly. It is more sensible to save the money on databases and spend it on improving professional expertise and on services to children and parents. Mental illness, drug abuse and domestic violence are the three biggest problems for parents. Services targeted at adults are key to improving parenting.
A danger in the government’s use of a standardised assessment framework (the CAF) and of performance indicators for child development is that it treats any deviation from the average or conventional as problematic. The idea that you can have a standard production line process for raising children is to me really repellent. It’s out of the oddballs that you can get the brilliant ideas.
The idea is to make the whole children’s sector workforce interchangeable. They have this idea that the database has all the relevant information and of course it doesn’t – effective help for families relies on the relationship you have with the family. This is a relationship-based service, so computers don’t do the job.
Speaker notes
For the stated goals I would not choose ContactPoint. I would prefer there to be an efficient way of actually communicating and sharing data between the agencies who need to know about vulnerable children – not about all children. But anybody who is vulnerable may well be shielded on ContactPoint, so any concerned doctor or teacher who logs on will not be able to see their data.
I would say that the database ought to start from the premise that, if there is a proper concern about a child, it is up to the designated child protection contact (I am one of these) initially to investigate and, if there is a genuine cause for concern, to contact social services.
ContactPoint offers a get-out clause: since every child is on it no one has to make difficult decisions about who should be on it. We’re going to end up with an edifice that will make it harder to find vulnerable children.
The database is only as good as the people inputting the data. At our school we spend a lot of time cleansing our data, checking it, making sure it’s not corrupted and not incorrect, because a lot of people have access to it internally. What we really need is people out there working directly with families and children. You need to be able to provide places where three- and four- year-olds can go with struggling parents to day care – that’s a good use of money. It’s all very well having the database, but if you don’t have a large enough skilled workforce able to effect good as a result of it, then what is the point?
Computer systems have been deployed on a national scale to help public services achieve their stated objectives of securing better outcomes for children. The most infamous is ContactPoint, which has the stated aim of helping professionals reach children at risk. Critics argue the system instead puts children at greater risk. The basic concern, then, is that these systems do not actually lead to better outcomes. In practice they encourage carers to defer responsibility to the technology and to colleagues. In the US, studies tentatively suggest the systems’ main use is to allow carers to shift the blame for their failings. (The UK has as yet no reliable data on this point.)
What is clear is that they reduce the amount of face-to-face time that carers actually spend with vulnerable children. The systems mandate an over-reliance on data entry and pooling, which means time with a machine rather than with care recipients. The emphasis should instead be on quality, timely interventions. Ensuring the right information is available to the right people is one factor that enables such interventions, but these computer systems are only one tool of the trade and not a total solution.
In most cases, gathering the data is useless, because most children will not require an intervention, whereas those most at risk may not exhibit any clear indicators showing which low-level problems will escalate into the most serious matters. It is in these cases that carers should be speaking to the children, their families and teachers to get a sense of what they can do to help.
The data that is stored is largely subjective, relying on individual conceptions of appropriate or inappropriate behaviour. Permanent storage of such opinions is inherently risky, particularly as it could be leaked or otherwise made available for, say, employment purposes. There is also the likelihood that negative impressions will become fodder for gossip around the school yard.
The children themselves are already cynical about engaging with authorities. As they become aware of the risks associated with these systems they will be less likely to open up to carers. The distance created puts vulnerable children at greater risk of harm.
It is very much a case of going back to the drawing board with these systems. Front-line practitioners should be consulted at the pre-design phase to understand their requirements and workflows. The systems should complement rather than dominate these practices if they are to bring genuine benefits, rather than merely additional financial and administrative expense.
It is at this experimental stage that problems are simplest and cheapest to iron out. Once deployed on a national scale, faults are far more difficult to fix than under test conditions. That said, once a system has been implemented, those directly engaged in child protection and care should be given full opportunities to state whether the system actually helps or hinders their work.
Emphasis on how a system works for its users – which means professionals and the children – is an important counterweight to bureaucratic momentum, with its preference for targets and reports. An emphasis on lean, usable systems is also vital, given that the intended users may not be confident or familiar with these devices.
One of the primary habits that should be ingrained in the systems is to force the question of whether interference will actually achieve results. This should sit alongside the recognised concerns from human rights dialogue.
[i] See Database State Seminar report, attached as appendix
[ii] See Database State Seminar report, attached as appendix
[iii] Database State Seminar Briefing Pack, Guest Speaker notes
[iv] http://www.telegraph.co.uk/finance/yourbusiness/8319582/David-Cameron-creates-Dragons-Den-for-small-business-suppliers.html
[vi] See Database State Seminar Transport report in Appendix
[vii] See Database State Seminar report, attached as appendix