MoJ Data Protection call for evidence
Question: How will these proposals impact you or the bodies you represent?
Wherever possible we would like quantifiable costs and benefits, and real-life examples of the potential impact of the proposals.
Overall, we view the Commission’s proposals in a very favourable light. Below we outline some of the ways in which each of the key proposals would have positively affected issues we have been involved in.
ORG’s role is to press for the protection of citizens’ privacy and fundamental rights, as well as their rights as users of digital services under consumer law. We are often approached for advice, sometimes by whistleblowers, and liaise with the ICO to try to get data protection complaints resolved.
We represent 1,300 paying supporters and over 30,000 individuals who have chosen to help in our campaigns for free speech and privacy online.
We start from the position that technology has the potential to enhance as well as threaten fundamental rights, but that the dynamics of increasingly powerful and cheap processing and storage of data make the way that we develop privacy laws and practice especially important if we are to avoid creating a surveillance culture.
1 Cross-EU applicability and consistency
The Principle
This is a good idea and very much needed. There is no reason why a UK citizen should have to guess at their rights according to which EU member state their data is held in. They should be able to rely on their rights being the same across EU member states.
Current practice creates a race to the bottom, distorts the single market and allows companies to engage in jurisdiction shopping.
Real world examples
Varying rights in relation to large Internet companies
Companies on the Internet are the most obvious place where rights may vary. If an EU citizen uses, for instance, amazon.de, should they expect the same rights and penalties as at amazon.co.uk? In practice, as the law stands today, they do not.
EU vs Facebook
A further example of the abuse of this variation can be seen in the decision of a number of Internet companies to locate in Ireland, raising the suspicion among privacy groups that this is in large part because data protection is seen as relatively weak in Ireland. This is unacceptable in a matter of fundamental rights.
Nevertheless, Austrian and other EU citizens did pursue Facebook for their data, and complained when they found that some of the data they expected was missing.[i] While the complaint was investigated, the complaining citizens were at a disadvantage: they had to understand the Irish version of data protection and its remedies in order to stand a full chance of having their rights enforced. This places Facebook at a double advantage: it chooses the data protection regime it believes best suits it, and it then understands that regime much better than any of its customers.
Consensus on harmonisation
Nevertheless, whatever advantage individual companies may seek in choosing a DP regime, they clearly feel that the lack of a single regime is overall a disadvantage. It has been striking that all parties have sought harmonisation, from businesses through the Commission to consumer and privacy advocates.
2 Applicability of EU law
The test for applicability in the draft is clearer: whether a company supplies goods or services to EU citizens. This helps both non-EU companies and EU citizens. Essentially, citizens should not have to worry about whether a company has a registered address in Europe, but rather whether that company has chosen to do business with Europeans. There are sensible limits around the size of the company and whether it only occasionally does business with EU citizens.
3 Right to be forgotten
Many commentators have focused on this as a controversial proposal. However, it essentially strengthens existing rights: Article 12 of the existing Directive already provides a right to have data deleted after the purpose for which they were collected has been fulfilled. The proposal attempts to extend that principle in a meaningful way. The new right must nevertheless be drafted so that it cannot be used as a tool for censorship.
Real world examples
Social networks terms and conditions
The terms and conditions of social networks and online services often make broad claims to users’ data and photos, especially on copyright grounds. They do this, it seems, to protect themselves from liability and to allow themselves to extend their services and share the material with third parties as they like.
The effect of these terms often appears to be to give the service a perpetual grant over that data. The contracts themselves are of course not negotiated. A right for the user to terminate the contract and remove their data therefore seems only reasonable. Without it, it is difficult to see how users could protect themselves from a company making whatever claims it felt were necessary.
For instance, the current Facebook Terms of Service reads:
For content that is covered by intellectual property rights, like photos and videos (IP content), you specifically give us the following permission, subject to your privacy and application settings: you grant us a non-exclusive, transferable, sub-licensable, royalty-free, worldwide license to use any IP content that you post on or in connection with Facebook (IP License). This IP License ends when you delete your IP content or your account unless your content has been shared with others, and they have not deleted it.[ii]
Although Facebook does at least try to give you the right to withdraw your agreement, the apparent effect of these terms, it seems to us, is that data shared with Facebook may be permanently sub-licensed, transferred and shared, and may not be deletable. While a right to ask Facebook to delete data should have sensible limits, an agreement should not write the service a blank cheque to share your data within and beyond the service on purely contractual IP grounds. Data that has been inappropriately shared is precisely the kind of data most likely to be the target of a deletion request, yet that might be precluded by the terms above.
4 Data portability
The Principle
This is a very sound idea. It is important that customers have an exit route. It promotes competition as well as privacy by allowing people to vote with their feet. By being able to get their data back, and take it to another service, they potentially do not have to lose their historic data.
Several companies, including Google and Facebook, have recognised this as an important part of competition in the online environment, where they might otherwise have been expected to resist it; both have, for instance, offered data export tools (Google’s Data Liberation initiative and Facebook’s “Download Your Information” feature).
However, data portability is only part of the story. It will need to be backed up by promoting greater interoperability between services, or it may not have the desired effect. Services will still wish to create their own artificial “walled gardens”. The inability to communicate between Facebook and non-Facebook users, for instance, provides a serious barrier to a user who wishes to leave Facebook, even if they can retrieve some or all of their information. Some attempt to improve interoperability should be included in the Regulation.
Additionally, the article seems poorly drafted at present, implying that data will only be retrievable if the service happens to store it in a commonly used format; the obligation should instead be to supply the data in a commonly used format.
Real world examples
Google and Facebook changes to terms of service
Both Google’s and Facebook’s recent privacy policy changes represent points at which users expressed a desire to leave those services. Users experience unwanted changes and wish to migrate. Data portability gives them a means to do so, when combined with competing comparable services and interoperability.
Mobile phones tied to privacy policies
Google’s terms of service present a particular and wider example of the tying of products to a service, in the case of Android phones. These are potentially crippled without the Google account to which they are tied. Data portability and interoperability may allow users to escape Google without losing their investment. Similar issues exist with many other smartphones: Apple’s phones, for instance, require users to sign agreements with Apple’s App Store in order to activate them and install extra applications.
5 Privacy by design and default
The Principle
This is a good idea. The draft article reads:
The controller shall implement mechanisms for ensuring that, by default, only those personal data are processed which are necessary for each specific purpose of the processing and are especially not collected or retained beyond the minimum necessary for those purposes, both in terms of the amount of the data and the time of their storage.
We think this tightens the idea of prior consent and data minimisation. New services and purposes for the use of data are constantly expanding, and the collection and retention of data become ever cheaper and easier, creating a dynamic in which businesses may simply keep data on the basis that it might prove useful. This is poor privacy practice, and the article seeks to counter it by insisting that the collection of data has a purpose. We believe, however, that Privacy Impact Assessments are needed to strengthen it.
Viviane Reding explains that this principle should be useful in clarifying issues of consent:
The “privacy by default” rule will also be helpful in cases of unfair, unexpected or unreasonable processing of data – such as when data is used for purposes other than for what an individual had initially given his or her consent or permission or when the data being collected is irrelevant. “Privacy by default” rules would prevent the collection of such data through, for example, software applications. The use of data for any other purposes than those specified should only be allowed with the explicit consent of the user or if another reason for lawful processing exists.[iii]
Real world examples
Facebook: defaults, Netflix and Spotify
Facebook has consistently changed its privacy setting defaults to open up sharing of user data.[iv] Many users are not aware of how widely their data is being shared. Additionally, new applications are not forced to be clear with their users about the way data is used. Both Netflix[v] and Spotify[vi] now allow Facebook users to log into their services, and then share the users’ viewing and listening habits with everyone on Facebook. There is a means to “opt out”, but most users do not notice it at the time they sign up. This is at the very least an annoyance, and might be more revealing than is wanted. Whether through a “privacy by default” principle or clear notions of explicit consent, users should not face problems like this.
Google’s new privacy policy
The recent changes to Google’s privacy policy, which provide for data to be combined and used in ways the user did not expect, beyond the needs of collection, and without explicit consent, argue for this principle. While the changes may be illegal under current data protection rules,[vii] because of a lack of clarity and consent, reinforcing those rules is a good idea, not least because the end user has no negotiating power with Google. The sheer inconvenience of moving is likely to result in users putting up with the new contract. Legal protections are therefore helpful in creating a balance between the user and the service provider.
Oyster and TfL, transport and smart meter systems
There are major holes in the Oyster contract from a privacy point of view: TfL state that they will hand data to law enforcement without any notification to the Oyster user or reference to the courts. This places law enforcement operations outside normal legal scrutiny. The privacy policy states:
In certain circumstances, TfL may also share your personal information with the police and other law enforcement agencies for the purposes of the prevention or detection of crime.[viii]
TfL state:
Each police request is dealt with on a strictly case by case basis to ensure that any such disclosure is lawful and in accordance with the DPA.
Assuming this is accurate, it gives little comfort. The DPA gives data controllers wide discretion to hand data to law enforcement.
From a privacy point of view, there are much better ways to design such systems than collecting all the data at a central point. Journey data, if needed at all, can often be collected and retained by the user.
There are similar issues with most transport payment systems and also new systems like smart meters. Viewing privacy as a data collection and creation issue, rather than just a permissions issue, is fundamental to understanding the idea of data minimisation and privacy risk reduction. This is why we advocate privacy risk assessments as a necessary tool to support privacy by design.
6 Data breach notification
The Principle
We believe this is a very sound idea. People have the right to know when their privacy has been breached, and what the risks to them are. How else can they avoid the potential consequences of lost passwords, bank or credit card details, or trace possible misuse of address or email data?
We believe there is an additional need for data protection authorities to monitor breaches and collect evidence. A central register of breaches should be kept in order to establish where and why data is being leaked.
Real world examples
ACS:Law[ix]
Around 15,000 people had their data lost by this company as the result of a number of incompetent practices. The data became publicly available after the company’s web server suffered a denial of service attack: email records containing the large data files became open to the public. At the most basic level, the files with personal data should not have been emailed, but this appears to have been the default means by which data was shared between BT, ACS:Law and others, and also internally at ACS:Law.
The data revealed included names and addresses of BT and Sky Internet subscribers who had apparently (and frequently inaccurately) been identified by ACS:Law’s contractors as sharing video games and hardcore pornography. Other information included correspondence, denials, payments and admissions. The information was disseminated on peer-to-peer networks. One website even offered a search facility for people to see who near them had received allegations, and for what.
ACS:Law held data from a number of companies, some of which may not have notified their customers after the breach. BT did notify their customers. But in any case, ACS:Law were the culprits and should have notified the people whose data was revealed, telling them exactly what data had been exposed. BT was not in a position to say what exactly had been published without mounting a forensic investigation, and thereby further breaching people’s privacy. And each company that had dealt with ACS:Law may have operated a different policy on notifying its customers.
7 Transfer of data to non-EU countries
The Principle
Unfortunately, the approach the draft regulation takes has been rendered useless. The idea was to restrict US authorities and others from taking EU citizens’ data purely on the basis of its being held by a US company.
Article 42 now requires only that the controller or processor has “adduced appropriate safeguards with respect to the protection of personal data in a legally binding instrument”.
EDRi, however, states: “the US currently uses instruments such as the Foreign Intelligence Surveillance Act (FISA) and the Patriot Act to retrieve data on (e.g.) the political activities of foreign individuals, who may have no links whatsoever with the USA, via companies with US offices. This legal vacuum was meant to be addressed by article 42. It has not been.”[x]
This is a key concern for the Open Rights Group, as it has previously been for the UK Parliament in the case of the census.
Real world examples
In the UK, Parliament intervened over the running of the census by Lockheed Martin because of worries about the use of the data in the USA.[xi] While the problem was recognised in the context of the UK government’s own operations, the same issues exist in relation to other data sets. It is in our view inconsistent to recognise the potential for abuse of data in regard to the UK census and not recognise the same problem in relation to a company such as Facebook, which probably holds more or less the same data (date of birth, workplace, personal beliefs) and much more besides.
8 Fining powers
The Principle
It is important that DP authorities have a range of sanctions. Basing fines on the ability to pay is correct, so turnover is the right measure. It is not clear why 2% was arrived at, rather than the 5% stated in earlier leaked drafts, except through commercial pressure.[xii]
Real world examples
In the UK, powers to fine are limited to £500,000. What does £500,000 mean to a company like Google or Facebook? Currently, fines deter data breaches only for small companies, who might even face fines equal to or exceeding their turnover.
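To illustrate the scale of the difference, using rough, publicly reported figures rather than anything in the draft: Google’s 2011 revenue was in the region of $38 billion. A fine of 2% of turnover would therefore be on the order of $760 million, while the current UK maximum of £500,000 amounts to roughly 0.002% of that turnover, a difference of around three orders of magnitude.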
9 The right for organisations to represent people
Article 73 would potentially aid our work greatly. Currently we can advise the ICO of a problem, but we cannot complain on anyone’s behalf or build expertise in bringing complaints to them.
This would provide us with a strategy to seek remedies and establish principles and the limits of what is acceptable.
The potential to use the courts is also highly desirable, as it would stop incompetent or weak authorities from underplaying the legal duties of companies, as we suspect happens in the UK presently. Combined with the consistency of a regulation, this could help in the many cases where a large number of people do not know exactly how badly they are affected, or want action but lack the resources to do much about it.
This would also be very useful in international cases, particularly where language barriers would otherwise be an issue.
Real world examples
Google Streetview
Many people were each slightly affected, first by the publication of Streetview images and then by Google’s collection of personal data as it mapped wifi hotspots. It would be impossible, or at least unlikely, for many of those people individually to complain, mount challenges or co-ordinate action. An organisation able to complain on their behalf would address exactly this situation.
[vii] http://www.cnil.fr/english/news-and-events/news/article/googles-new-privacy-policy-raises-deep-concerns-about-data-protection-and-the-respect-of-the-euro/
[viii] http://www.tfl.gov.uk/termsandconditions/12321.aspx#page-link-does-tfl-receive-requests-from-the-police-for-disclosure-of-information-about-the-use-of-individual-oyster-cards-
[xi] PASC examined the issue in 2009, for instance. http://www.parliament.uk/business/news/2009/06/committee-looks-at-2011-census-preparations/