“The way to hell is paved with good intentions” is a well-known English saying. I am convinced that it applies to the claim that one can have “security and privacy” by adopting Privacy by Design (PbD) techniques.
For example, Dr Cavoukian in her document on the use of whole body scanners at airports (see last week’s blog - 6th Jan) states that a PbD implementation “can satisfy security requirements without sacrificing (and perhaps enhancing) passenger privacy”. At one level this is correct. Obviously anonymous scanning is far less invasive than having someone physically pat you down or give you a strip search.
But the more you think about it, serious worries emerge. I raised some of them at the end of September (see Blog "Do privacy enhancing technologies have an Achilles Heel?"; Sept 30th), but I have to admit that Dr Cavoukian’s bold assertion really “did it for me”.
I think PbD accelerates a gradual, step-by-step progression that I am calling “The descent of privacy” (or “The ascent of security”, depending on your viewpoint). The decline (or ascent) has ten steps, which are as follows:
(1) PbD techniques can make airport scanners privacy neutral – assume Dr Cavoukian’s starting premise is correct.
(2) Why are airports special? – What about main bus and rail termini or ports? These locations also need these privacy protecting scanners. Note that assertive and authoritative privacy reassurances can be uttered (e.g. such all-body scanning is more “privacy enhancing” than the alternative).
(3) A market for all-body scanners develops – they become cheaper, quicker, more reliable and more innovative and designed for specific locations or tasks.
(4) Screening gradually becomes an accepted norm (e.g. for travelling on public transport between major cities).
(5) The use of scanners has thus expanded. As there is (supposedly) no risk to privacy, everybody queuing to enter certain places can be screened on an increasingly cost-effective basis.
(6) As costs fall, screening can be further extended to reduce the security risks at other locations: especially at gatherings where orderly queues of people form (e.g. sporting events, theatres, entrances to government or iconic buildings).
(7) If there is no screening at a public event where people queue, then people begin to think there is a security risk. The demand for screening increases irrespective of whether the evidence shows that screening is effective.
(8) Authorities install screening because:
(a) as installation becomes relatively inexpensive, the political desire to respond to demands of the electorate in Step 7 above can be satisfied;
(b) there is a genuine desire to be seen to be doing all one can to reduce the risks – the fact that there is privacy protection is a bonus;
(c) it provides the cover of having done “everything possible” if there were to be a terrorist attack and the press/public/politicians want to find “someone to blame”; and
(d) politicians reduce the risk of a serious attack “on their watch”.
(9) People become screened regularly and become conditioned to expect to be screened.
(10) People are deemed a risk, or are not trusted, unless they have been screened.
So the end-point of the “privacy enhanced” descent is Step 10. Even though the process is protective of privacy one has arrived at a position that can be rewritten in a more familiar guise: “If you have nothing to hide, you have nothing to fear”.
Now here comes the important bit. The anonymity associated with PbD cannot be guaranteed “for ever and ever, amen”; the anonymity guaranteed by PbD will always hang by a tenuous thread. Harold Macmillan, UK Prime Minister in the 1950s, was once asked by a journalist: “what was most likely to blow governments off course?”. He responded: “Events dear boy, events”.
So it is with PbD. Once implemented in its full “privacy protected” glory, it is only a matter of waiting for an event that allows the argument for removing anonymity to prevail. The implementation of the infrastructure had been anonymous – but after the pivotal event, and for all future time, it isn’t.
To be clear, I am not saying that full body screening at airports is wrong – nor am I saying that every surveillance activity ends at Step 10 - nor even that PbD should not be considered.
What I am saying is that the promotion by privacy and security specialists of the idea that one can have both privacy and security is seriously misguided. It will be seized upon by those who want security at the expense of privacy and who also know that a future event is likely to make the case for removing anonymity. In other words, PbD is merely a catalyst that can accelerate the descent down these 10 Steps: or “the way to hell”, if you prefer.
Final comment: Even without employing PbD techniques, the surveillance associated with CCTV is arguably around Step 8 or 9; vetting against criminal records is at Step 9 or possibly Step 10. In reality, I think PbD cannot address what is the main problem: the serious democratic deficit in the system of data protection and human rights. See “Nine principles for assessing whether privacy is protected in a surveillance society (Parts 1 and 2)”, which can be accessed at http://www.amberhawk.com/policydoc.asp.