19 Oct 2016 Jim Killock Privacy
Fig leaves for privacy in Age Verification
Yesterday we published a blog detailing the lack of privacy safeguards for Age Verification systems mandated in the Digital Economy Bill. Since then, we have been offered two explanations as to why the regulator designate, the BBFC, may think that privacy can be regulated.
The first and most important claim is that Clause 15 may allow the regulation of AV services, in an open-ended and non-specific way:
15 Internet pornography: requirement to prevent access by persons under the age of 18
A person must not make pornographic material available on the internet on a commercial basis to persons in the United Kingdom except in a way that secures that, at any given time, the material is not normally accessible by persons under the age of 18
[snip]
The age-verification regulator (see section 17) must publish guidance about—
(a) types of arrangements for making pornographic material available that the regulator will treat as complying with subsection (1);
However, this clause seems to regulate publishers who “make pornographic material available on the internet”, and what is regulated in 15 (3) (a) is the “arrangements for making pornographic material available”. It does not mention age verification systems, which are not really “arrangements for making pornography available” except inasmuch as a publisher uses them to verify age correctly.
AV systems are not “making pornography available”.
The argument, however, runs that the BBFC could, under 15 (3) (a), tell websites what kinds of AV systems, with which privacy standards, they may use.
If the BBFC sought to regulate providers of age verification systems via this means, we could expect it to face legal challenge for exceeding its powers. A court might well consider it unfair for the BBFC to impose new privacy and security requirements on AV providers or website publishers when those requirements are not spelled out in the Bill, and when such providers are already subject to separate legal regimes such as data protection and e-privacy.
This clause does not provide the BBFC with enough power to guarantee a high standard of privacy for end users, as any potential requirements are undefined. The bill should spell out what the standards are, in order to meet an ‘accordance with the law’ test for intrusions on the fundamental right to privacy.
The second fig leaf towards privacy is the draft standard for age verification technologies produced by the Digital Policy Alliance. This is being edited by the British Standards Institution, as PAS 1296. It has been touted as the means by which commercial outlets will produce a workable system.
The government may believe that PAS 1296 could, via Clause 15 (3) (a), be stipulated as a standard that Age Verification providers abide by in order to supply publishers, thereby giving a higher standard of protection than data protection law alone.
PAS 1296 provides general guidance and has no strong means of enforcement against companies that adopt it. It is a soft design guide that offers broad principles for producing these systems.
Contrast this, for instance, with the hard and fast contractual arrangements the government’s Verify system has in place with its providers, alongside firmly specified protocols. Or card payment processors, who must abide by strict terms and conditions set by the card companies, where bad actors rapidly get switched off.
The result is that PAS 1296 says little about security requirements, data protection standards, or anything else we are concerned about. It stipulates that age verification providers cannot be sued for losing your data. Rather, you must sue the website owner, i.e. the porn site that contracted with the age verifier.
There are also several terminological gaffes, such as referring to PII (personally identifiable information), which is a US legal concept, rather than the EU and UK’s ‘personal data’. This suggests that PAS 1296 is very much a draft, and in fact appears to have been hastily cobbled together.
However you look at it, the proposed PAS 1296 standard is very generic, lacks meaningful enforcement and is designed to tackle situations where the user has some control and choice, and can provide meaningful consent. This is not the case with this duty for pornographic publishers. Users have no choice but to use age verification to access the content, and the publishers are forced to provide such tools.
Pornography companies, meanwhile, have every reason to do age verification as cheaply as possible, and possibly to harvest as much user data as they can, to track and profile users, especially where that data may in future, at the flip of a switch, be used for other purposes such as advertising-tracking. This combination of poor incentives has plenty of potential for disastrous consequences.
What is needed are clear, spelt-out, legally binding duties for the regulator to provide security, privacy and anonymity protections for end users. To be clear, the AV Regulator, or BBFC, does not need to be the organisation that enforces these standards: there are powers in the Bill for it to delegate the regulator’s responsibilities. But we are in a very dangerous situation if these duties do not exist.