The International Forum for Responsible Media Blog

“Snooper’s Charter” Consultation – Paul Bernal

The draft Communications Data Bill [pdf] – the ‘Snoopers’ Charter’ – is currently up for consultation before a specially convened Joint Parliamentary Committee. The consultation period has been relatively short – it ends on 23rd August – and comes at a time when many people are away on holiday and many others have been enjoying (and being somewhat distracted by) the Olympic Games.

Even so, it’s very important – not just because what is being proposed is potentially highly damaging, but because it’s a field in which the government has been, in my opinion, very poorly advised and significantly misled. There is a great deal of expertise around – particularly on the internet – but in general, as in so many areas of policy, the government seems to be very unwilling to listen to the right people. I’ve blogged on the general area a number of times before – most directly on ‘Why does the government always get it wrong?’.

All this means that it would be great if people made submissions – for details see here.

Here is the main part of my submission, reformatted for this blog.

————————————————-

Submission to the Joint Committee on the draft Communications Data Bill

The draft Communications Data Bill raises significant issues – issues connected with human rights, with privacy, with security and with the nature of the society in which we wish to live. These issues are raised not by the detail of the bill but by its fundamental approach. Addressing them would, in my opinion, require such a significant re-drafting of the bill that the better approach would be to withdraw the bill in its entirety and rethink the way that security and surveillance on the Internet is addressed.

As noted, there are many issues brought up by the draft bill: this submission does not intend to deal with all of them. It focusses primarily on three key issues:

1) The nature of internet surveillance. In particular, that internet surveillance means much more than ‘communications’, partly because of the nature of the technology involved and partly because of the many different ways in which the internet is used. Internet surveillance means surveilling not just correspondence but social life, personal life, finances, health and much more. Gathering ‘basic’ data can make the most intimate, personal and private information available and vulnerable.

2) The vulnerability of both data and systems. It is a fallacy to assume that data or systems can ever be made truly ‘secure’. The evidence of the past few years suggests precisely the opposite: those who should be most able and trusted with the security of data have proved vulnerable. The approach of the draft Communications Data Bill – essentially a ‘gather all then look later’ approach – is one that not only fails to take proper account of that vulnerability, but actually sets up new and more significant vulnerabilities, effectively creating targets for hackers and others who might wish to take advantage of or misuse data.

3) The risks of ‘function creep’. The kind of systems and approach envisaged by the draft Bill makes function creep a real and significant risk. Data, once gathered, is a ‘resource’ that is almost inevitably tempting to use for purposes other than those for which its gathering was envisaged. These risks seem to be insufficiently considered both in the overall conception and in the detail of the Bill.

I am making this submission in my capacity as Lecturer in Information Technology, Intellectual Property and Media Law at the UEA Law School. My research is in internet law, specialising in internet privacy from both a theoretical and a practical perspective. My PhD thesis, completed at the LSE, examined the impact that deficiencies in data privacy can have on our individual autonomy, and set out a possible rights-based approach to internet privacy. The draft Communications Data Bill therefore lies precisely within my academic field. I would be happy to provide more detailed evidence, either written or oral, if that would be of assistance to the Committee.

1 The nature of internet surveillance

As set out in Part 1 of the draft bill, the approach adopted is that all communications data should be captured and made available to the police and other relevant public authorities. The regulatory regime set out in Part 2 concerns accessing the data, not gathering it: gathering is intended to be automatic and universal. Communications data is defined in Part 3 Clause 28 very broadly, via the categories of ‘traffic data’, ‘use data’ and ‘subscriber data’, each of which is defined in such a way as to attempt to ensure that all internet and other communications activity is covered, with the sole exception of the ‘content’ of a communication.

The all-encompassing nature of these definitions is necessary if the broad aims of the bill are to be supported: if the definitions did not cover any particular form of internet activity (whether existing or under development), then the assumption would be that those whom the bill intends to ‘catch’ would simply use that form. That the ‘content’ of communications is not captured (though it is important in relation to more conventional forms of communication such as telephone calls, letters and even emails) is of far less significance in relation to internet activity, as shall be set out below.

1.1 ‘Communications Data’ and the separation of ‘content’

As noted above, the definition of ‘communications data’ in the bill is deliberately broad. On the surface, it might appear that ‘communications data’ relates primarily to ‘correspondence’ – bringing in the ECHR Article 8 right to respect for privacy of correspondence – and indeed communications like telephone calls, emails, text messages, tweets and so forth do fit into this category – but internet browsing data has a much broader impact. A person’s browsing can reveal far more intimate, important and personal information about them than might be immediately obvious. It would tell which websites are visited, which links are followed, which files are downloaded – and also when, and for how long, sites are perused. This kind of data can reveal habits, preferences and tastes, and can uncover, to a reasonable probability, religious persuasion, sexual preferences, political leanings and so forth, even without what might reasonably be called the ‘content’ of any communications being examined – though what constitutes ‘content’ is itself contentious.

Consider a Google search, for example. If RIPA’s requirements are to be followed, the search term itself would be considered ‘content’ – but would links followed as a result of a search count as content or as communications data? Who is the ‘recipient’ of a clicked link? If the data is to be of any use, it would need to reveal something of the nature of the site visited – and that would make it possible to ‘reverse engineer’ back to something close enough to the search term used to recover the ‘content’. The content of a visited site may be determined simply by following the link – without any further ‘invasion’ of privacy. When slightly more complex forms of communication on the internet are considered – e.g. messaging or chatting on social networking sites – the separation between content and communications data becomes even less clear. In practice, as systems have developed, the separation is to most intents and purposes a false one. Whether or not ‘content’ data is gathered is therefore of far less significance: focussing on it is an old-fashioned argument, based on a world of pen and paper that is largely one of the past.
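To make the point concrete, here is a minimal sketch (in Python, with illustrative URLs – not a description of any real surveillance system) of how little ‘reverse engineering’ is actually needed: the URL that makes up a ‘communications data’ record very often describes its own content.

```python
from urllib.parse import urlparse

# Hypothetical browsing records of the kind the bill treats as mere
# 'communications data': destination URLs only, with the search term
# (the 'content') deliberately excluded.
visited_urls = [
    "https://www.nhs.uk/conditions/clinical-depression/",
    "https://www.samaritans.org/how-we-can-help/",
]

for url in visited_urls:
    parts = urlparse(url)
    # The host and path alone describe the topic of the page visited,
    # effectively recovering the 'content' of the original search.
    topic = parts.path.strip("/").replace("-", " ").replace("/", " / ")
    print(f"{parts.netloc}: {topic}")
```

Nothing here touches the ‘content’ of any communication as the bill defines it, yet the output plainly reveals a health concern.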

What is more, analytical methods through which more personal and private data can be derived from browsing habits have already been developed, and are continuing to be refined and extended, most directly by those involved in the behavioural advertising industry. Significant amounts of money and effort are being spent in this direction by those in the internet industry: it is a key part of the business models of Google, Facebook and others. It is already advanced but we can expect the profiling and predictive capabilities to develop further.
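The mechanics need not be sophisticated. A deliberately crude sketch follows (the domain names and category mapping are invented for illustration; real profilers use far larger taxonomies and subtler statistical models) showing how even a bare list of visited domains supports probabilistic inferences of exactly this kind:

```python
from collections import Counter

# Invented mapping from domains to sensitive categories, for illustration.
DOMAIN_CATEGORIES = {
    "prayer-times.example": "religion",
    "party-members.example": "political leaning",
    "sexual-health.example": "health",
}

def profile(browsing_history):
    """Infer sensitive attributes from a bare list of visited domains."""
    hits = Counter(
        DOMAIN_CATEGORIES[domain]
        for domain in browsing_history
        if domain in DOMAIN_CATEGORIES
    )
    total = sum(hits.values())
    # Repeated visits within a category raise the inferred probability.
    return {category: count / total for category, count in hits.items()} if total else {}

history = ["prayer-times.example", "prayer-times.example",
           "party-members.example", "news.example"]
print(profile(history))
# e.g. {'religion': 0.66..., 'political leaning': 0.33...}
```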

What this means is that by gathering, automatically and for all people, ‘communications data’, we would be gathering the most personal and intimate information about everyone. When considering this Bill, that must be clearly understood. This is not about gathering a small amount of technical data that might help in combating terrorism or other crime – it is about universal surveillance and profiling.

1.2 The broad impact of internet surveillance

The kind of profiling discussed above has a very broad effect, one with a huge impact on much more than just an individual’s correspondence. It is possible to determine (to a reasonable probability) individuals’ religions and philosophies, their languages used and even their ethnic origins, and then use that information to monitor them both online and offline. When communications (and in particular the internet) are used to organise meetings, to communicate as groups, to assemble both offline and online, this can become significant. Meetings can be monitored or even prevented from occurring, groups can be targeted and so forth. Oppressive regimes throughout the world have recognised and indeed used this ability – recently, for example, the former regime in Tunisia hacked into both Facebook and Twitter to attempt to monitor the activities of potential rebels.

It is of course this kind of profiling that can make internet monitoring potentially useful in counterterrorism – but making it universal rather than targeted will impact directly on the rights of the innocent, rights that, according to the principles of human rights, deserve protection. In the terms set out in the European Convention on Human Rights, there is a potential impact on Article 8 (right to private and family life, home and correspondence), Article 9 (freedom of thought, conscience and religion), Article 10 (freedom of expression) and Article 11 (freedom of assembly and association). Internet surveillance can enable discrimination, contrary to ECHR Article 14 (prohibition of discrimination), and even potentially automate it – a website could automatically reject visitors whose profile does not match key factors, or change the services available or the prices charged on the basis of those profiles.
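To illustrate how easily such discrimination could be automated (a hypothetical sketch – the profile fields, values and thresholds are invented), a server needs only a few lines to gate access or vary prices on a profile:

```python
def serve_page(profile):
    """Illustrative automated gating and price discrimination
    based on an inferred visitor profile (hypothetical fields)."""
    if profile.get("political_leaning") == "activist":
        return "403 Forbidden"  # silently turn away unwanted visitors
    price = 100
    if profile.get("inferred_income") == "high":
        price = round(price * 1.2)  # quote profiled-rich visitors more
    return f"200 OK - price: {price}"

print(serve_page({"political_leaning": "activist"}))  # 403 Forbidden
print(serve_page({"inferred_income": "high"}))        # 200 OK - price: 120
```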

2 The vulnerability of data

The essential approach taken by the bill is to gather all data, then to put ‘controls’ over access to that data. That approach is fundamentally flawed – and appears to be based upon false assumptions. Most importantly, it is a fallacy to assume that data can ever be truly securely held. There are many ways in which data can be vulnerable, both in theory and in practice. Technological weaknesses – vulnerability to ‘hackers’ and so on – may be the most ‘newsworthy’ at a time when hacker groups like ‘Anonymous’ have been attracting publicity, but they are far from the most significant. Human error, human malice, collusion and corruption, and commercial pressures (both to reduce costs and to ‘monetise’ data) may matter more – and the ways in which all these vulnerabilities can combine makes the risk greater still.

In practice, those groups, companies and individuals that might be most expected to be able to look after personal data have been subject to significant data losses. The HMRC loss of child benefit data discs, the MOD losses of armed forces personnel and pension data, and the numerous and seemingly regular data losses in the NHS highlight problems within those parts of the public sector which hold the most sensitive personal data. Swiss banks’ losses of account data to hacks and data theft demonstrate that even those with the highest reputation and need for secrecy – as well as the greatest financial resources – are vulnerable to human intervention. The high-profile hacks of Sony’s online gaming systems show that even those with access to the highest level of technological expertise can have their security breached. These are just a few examples, and whilst in each case different issues lay behind the breach, the underlying issue is the same: where data exists, it is vulnerable.

Designing and building systems to implement legislation like the Bill exacerbates the problem. The bill is not prescriptive as to the methods that would be used to gather and store the data, but whatever method is used would present a ‘target’ for potential hackers and others: where there are data stores, they can be hacked, where there are ‘black boxes’ to feed real-time data to the authorities, those black boxes can be compromised and the feeds intercepted. Concentrating data in this way increases vulnerability – and creating what are colloquially known as ‘back doors’ for trusted public authorities to use can also allow those who are not trusted – of whatever kind – to find a route of access.

Once others have access to data – or to data monitoring – the rights of those being monitored are even further compromised, particularly given the nature of the internet. Information, once released, can and does spread without control.

3 Function Creep

Perhaps even more important than the vulnerabilities discussed above is the risk of ‘function creep’ – that when a system is built for one purpose, that purpose will shift and grow, beyond the original intention of the designers and commissioners of the system. It is a familiar pattern, particularly in relation to legislation and technology intended to deal with serious crime, terrorism and so forth. CCTV cameras that are built to prevent crime are then used to deal with dog fouling or to check whether children live in the catchment area for a particular school. Legislation designed to counter terrorism has been used to deal with people such as anti-arms trade protestors – and even to stop train-spotters photographing trains.

In relation to the Communications Data Bill this is a very significant risk – if a universal surveillance infrastructure is put into place, the ways that it could be inappropriately used are vast and multi-faceted. What is built to deal with terrorism, child pornography and organised crime might creep towards less serious crimes, then anti-social behaviour, then the organisation of protests and so forth. Further to that, there are many commercial lobbies that might push for access to this surveillance data – those attempting to combat breaches of copyright, for example, would like to monitor for suspected examples of ‘piracy’. In each individual case, the use might seem reasonable – but the function of the original surveillance, the justification for its initial imposition, and the balance between benefits and risks, can be lost. An invasion of privacy deemed proportionate for the prevention of terrorism might well be wholly disproportionate for the prevention of copyright infringement, for example.

The risks associated with function creep in relation to the surveillance systems envisaged in the Bill have a number of different dimensions. There can be creep in the types of data gathered: as noted above, the split between ‘communications data’ and ‘content’ is already contentious, and as usage develops it is likely to become more so, with the category of data treated as protected ‘content’ likely to shrink. There can be creep in the uses to which the data can be put: from the prevention of terrorism downwards. There can be creep in the authorities able to access and use the data: from those engaged in the prevention of the most serious crime down to local authorities and others. All these dimensions represent important risks: all have happened in the recent past, both to legislation (e.g. RIPA) and to systems (e.g. the London Congestion Charge CCTV system).

Prevention of function creep through legislation is inherently difficult. Though it is important to be appropriately prescriptive and definitive in terms of the functions of the legislation (and any systems put in place to bring the legislation into action), function creep can and does occur through the development of different interpretations of legislation, amendments to legislation and so forth. The only real way to guard against function creep is not to build the systems in the first place: a key reason to reject this proposed legislation in its entirety rather than to look for ways to refine or restrict it.

4 Conclusions

The premise of the Communications Data Bill is fundamentally flawed. By its very design, innocent people’s data will be gathered (and hence become vulnerable) and their activities will be monitored. Universal data gathering or monitoring is almost certain to be disproportionate at best, highly counterproductive at worst.

This Bill is not just a modernisation of existing powers, nor a way for the police to ‘catch up’. It is something on a wholly different scale. We as citizens are being asked to put a huge trust in the authorities not to misuse the kind of powers made possible by this Bill. Trust is of course important – but what characterises a liberal democracy is not trust of authorities but their accountability, the existence of checks and balances, and the limitation of their powers to interfere with individuals’ lives. This bill, as currently envisaged, does not provide that accountability and does not sufficiently limit those powers: precisely the reverse.

Even without considering the issues discussed above, there is a potentially even bigger flaw in the bill: it appears very unlikely to be effective. The people it might wish to catch are the least likely to be caught – those expert in the technology will be able to find ways around the surveillance, or ways to ‘piggy-back’ on other people’s connections, drawing more innocent people into the net. As David Davis MP put it, only the incompetent and the innocent will get caught.

The entire project needs a thorough rethink. Warrants (or similar processes) should be put in place before the gathering of the data or the monitoring of the activity, not before the accessing of data that has already been gathered, or the ‘viewing’ of a feed that is already in place. A more intelligent, targeted rather than universal approach should be developed. No evidence has been made public to support the suggestion that a universal approach like this would be effective – it should not be sufficient to just suggest that it is ‘needed’ without that evidence, nor to provide ‘private’ evidence that cannot at least qualitatively be revealed to the public.

That brings a bigger question into the spotlight, one that the Committee might think is the most important of all: what kind of a society do we want to build – one where everyone’s most intimate activities are monitored at all times just in case they might be doing something wrong? That, ultimately, is what the draft Communications Data Bill would build. The proposals run counter to some of the basic principles of a liberal, democratic society – a society where there should be a presumption of innocence rather than of suspicion, and where privacy is the norm rather than the exception. Is that what the Committee would really like to support?

Dr Paul Bernal

Lecturer in Information Technology, Intellectual Property and Media Law, UEA Law School

This post originally appeared on Paul Bernal’s blog and is reproduced with permission and thanks.

1 Comment

  1. Brian Ridgway

    This submission elegantly encapsulates all that is wrong with the approach taken by the draft CDB. I hope that it receives not only wide circulation but also serious consideration by the joint committee.

Given the lamentable performance of Parliament in passing the Digital Economy Act in 2010 without any meaningful scrutiny, I would hope that any bill that emerges from the draft process attracts critical public attention.

In his response to my concerns about the draft CDB, my MP indicated his belief that the bill only covers emails and phone calls. If this is typical of the level of understanding of most MPs, then a lot more concerted public dialogue will be required to ensure an informed debate.
