13 June 2013 | Jim Killock | Digital Privacy, Censorship
Website filtering problems are a ‘load of cock’
The motion tabled by Labour says:
That this House deplores the growth in child abuse images online; deeply regrets that up to one and a half million people have seen such images; notes with alarm the lack of resources available to the police to tackle this problem; further notes the correlation between viewing such images and further child abuse; notes with concern the Government’s failure to implement the recommendations of the Bailey Review and the Independent Parliamentary Inquiry into Online Child Protection on ensuring children’s safe access to the internet; and calls on the Government to set a timetable for the introduction of safe search as a default, effective age verification and splash page warnings and to bring forward legislative proposals to ensure these changes are speedily implemented.
The “1.5m” statistic has been debunked elsewhere, but the alarming point here is the deliberate conflation of child abuse images with legal material that children might access. The motion slips from talking about child abuse images to ‘safe searches’ meant to protect children from seeing adult material. Just as worrying is Labour’s adoption of a position in favour of default blocking. You can read a transcript of the debate on Hansard.
This is a symptom of a wider problem with this debate – a failure to properly distinguish between different categories of content, and the different methods of dealing with them. That requires at least some understanding of the technology – the details matter.
A further problem is an unwillingness among some MPs to appreciate, or even acknowledge, the problems with technical solutions. In the debate on Tuesday, I tried to outline the problems with filtering, including the over- and under-blocking of content.
Claire Perry helpfully described such problems as a “load of cock”. Helpfully, because such a comment would very likely be caught by a filter, causing the page it appears on to be blocked, while not, of course, being pornographic.
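To make the over- and under-blocking point concrete, here is a minimal sketch of the kind of naive keyword filter at issue. The blocklist and matching rule are hypothetical illustrations, not any real ISP’s configuration:

    # Hypothetical keyword filter, for illustration only.
    BLOCKED_KEYWORDS = {"cock", "porn", "xxx"}

    def is_blocked(page_text: str) -> bool:
        """Return True if any blocklisted keyword appears in the text."""
        words = (w.strip(".,!?;:\"'“”‘’") for w in page_text.lower().split())
        return any(w in BLOCKED_KEYWORDS for w in words)

    # Over-blocking: a news report quoting the MP is caught,
    # though it is plainly not pornographic.
    print(is_blocked("The MP dismissed the concerns as a 'load of cock'."))  # True

    # Under-blocking: explicit material that avoids the listed
    # words sails straight through.
    print(is_blocked("Adult content described entirely in euphemisms"))  # False

Real filters are more sophisticated than this, but the failure modes are the same: innocuous pages are caught by crude matching, while determined publishers route around the lists.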
Claire also got applause for suggesting that blocked websites were simply collateral damage, necessary to protect children. This is the kind of woolly thinking that was, thankfully, rejected by her own government, which recognised, for instance, the economic harm that stems from blocking legitimate websites. After all, if you can protect children and avoid blocking content for adults, why not? Can some balance not be struck?
Unfortunately, in the eyes of many MPs, arguing for balance is betraying children. If any child can access porn that we could technically have prevented, then we have failed. Of course, filters don’t always work and can easily be got round but, the argument runs, if our solution helps a bit, surely that is better than nothing?
These positions, once you examine them, are pretty incoherent. Filters that don’t work well will probably get switched off. Defaults that block too much may encourage people to remove the filters altogether. Parents may assume their children are safe when filters are switched on. Software design is iterative, not legislative; yet legislation is often favoured over engagement with industry.
The child protection debate over the last two years has won Claire Perry many friends, who believe she has raised the profile of an issue and got results. The fact that ISPs are building network-level filters certainly points to this, but I was intrigued by a question at the debate on Tuesday. Apparently children are installing Chrome because, it was suggested, this helps them reach porn sites and get round filters – presumably because some device-level filtering tools only hook into a particular browser.
We did try to tell Claire this kind of thing would happen before she persuaded ISPs to spend millions of pounds on network filters. Even with filters in place, children left with admin privileges on their computers will be able to defeat blocks trivially. Some MPs in the debate suggested that only ‘very clever’ folk would be able to get round filtering. This isn’t true: most children will find it easy.
Which leaves us with harms on all sides – to websites, adults and children – without the supposed benefits.
Labour have essentially made the same mistake as the Culture Secretary, Maria Miller, whose letter inviting Internet companies to a proposed ‘summit’ said:
Recent horrific events have again highlighted the widespread public concern over the proliferation of, and easy access to, harmful content on the internet. Whether these concerns focus on access to illegal pornographic content, the proliferation of extremist material which might incite racial or religious hatred, or the ongoing battle against online copyright theft, a common question emerges: what more can be done to prevent offensive online content potentially causing harm?
It is clear that dangerous, highly offensive, unlawful and illegal material is available through basic search functions and I believe that many popular search engines, websites and ISPs could do more to prevent the dissemination of such material.
The debate and the letter confuse legal, illegal and potentially harmful content, each of which requires very different tactics to deal with. Without a greater commitment to evidence and rational debate, poor policy outcomes will be the likely result. There’s a pattern here, much the same as with the Digital Economy Act or the Snooper’s Charter.
Start with a moral panic; dismiss the evidence; legislate; and finally, watch the policy unravel, either delivering unintended harms – in this case even to children – or simply failing altogether.
ORG, Index on Censorship, English PEN and Big Brother Watch have written to the Culture Secretary Maria Miller demanding that civil society be present at her ‘summit’, to make sure these issues are addressed. We have yet to receive a reply.
Read more about ORG’s ‘Check if your website is being blocked by filters’ campaign.
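For readers curious how such a check might work in practice, here is a hypothetical sketch, not the campaign’s actual tool: fetch a URL and look for the simplest signs of an ISP block page. Real filters intervene in many different ways (DNS tampering, TCP resets, HTTP redirects), and the marker strings below are made up, so this illustrates only the most basic case:

    # Hypothetical block-page check, for illustration only.
    import urllib.request

    BLOCK_PAGE_MARKERS = ["site blocked", "parental controls", "access denied"]

    def looks_filtered(url: str) -> bool:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                body = resp.read(65536).decode("utf-8", errors="replace").lower()
        except OSError:
            # Timeouts and connection resets can themselves indicate
            # network-level filtering, though they have innocent causes too.
            return True
        return any(marker in body for marker in BLOCK_PAGE_MARKERS)

    print(looks_filtered("http://example.com"))

Reliable results need the same URL tested from many different ISPs’ connections, which is exactly the kind of comparison a campaign tool has to do.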