Dr. Ann Cavoukian, the Information and Privacy Commissioner of Ontario, has been championing Privacy by Design for years. I can remember an International Commissioners’ conference over a decade ago where she presented her paper on Privacy Enhancing Technologies, showing that there were technical solutions to real data protection problems. Now she has published two new documents: the first sets out seven principles for Privacy by Design; the second explores the possibility of using untraceable biometrics. Readers with a strong interest in this field are encouraged to read both; she normally leads other Privacy Commissioners by years, mainly because she is that rarity: a regulator who understands the technology.
Dr. Cavoukian sets out Seven Principles that define the Privacy by Design (PbD) approach. These are that the PbD approach:
1. Is proactive, not reactive; preventative, not remedial: PbD does not offer remedies for resolving privacy infractions once they have occurred; it aims to prevent them from occurring in the first place.
2. Sets privacy as the default: if individuals do nothing, their privacy still remains intact (a short illustration of this follows the list).
3. Embeds privacy into the design: PbD is built into the design and architecture of IT systems and business practices; it is not bolted on as an add-on after the fact.
4. Offers full functionality: PbD seeks to accommodate all legitimate interests and objectives, rejecting a dated, zero-sum approach in which unnecessary trade-offs are made. PbD avoids the pretence of false dichotomies, such as privacy versus security, demonstrating that it is possible to have both.
5. Permits end-to-end lifecycle protection: PbD, having been embedded into the system before the first element of information is collected, can offer cradle-to-grave privacy protection.
6. Is visible and transparent: PbD seeks to assure all stakeholders that, whatever the business practice or technology involved, it is in fact operating according to the stated promises and objectives, subject to independent verification. Its component parts and operations remain visible and transparent to users and providers alike, and it operates by the mantra “trust but verify”.
7. Respects user privacy: PbD requires the design to be user-centric, offering measures such as strong privacy defaults, appropriate notice, and empowering, user-friendly options.
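To make Principle 2 concrete, consider how “privacy as the default” might look in code. The sketch below is my own hypothetical illustration (Dr. Cavoukian’s papers describe the principle, not an implementation, and the setting names are invented): every data-sharing option defaults to off, so a user who never opens the settings screen shares nothing.

    from dataclasses import dataclass, fields

    @dataclass
    class PrivacySettings:
        # Principle 2: every sharing option defaults to "off", so doing
        # nothing leaves the individual's privacy intact.
        share_location: bool = False
        share_usage_analytics: bool = False
        allow_marketing_email: bool = False
        profile_publicly_visible: bool = False

    # A user who does nothing gets the most protective configuration;
    # any data sharing requires an explicit, affirmative choice.
    defaults = PrivacySettings()
    assert not any(getattr(defaults, f.name) for f in fields(defaults))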
Also published is a discussion document on “Untraceable Biometrics” (UB), a term developed by Dr. Cavoukian to define a new class of emerging privacy-enhancing technologies that embody standard fair information practices and provide for user control, data minimization, and a high degree of data security. UB has five distinguishing features (a code sketch illustrating them follows the list):
1. There is no storage of a biometric image or conventional biometric template;
2. The original biometric image/template cannot be recreated from the stored information, thereby rendering it untraceable;
3. A large number of untraceable templates for the same biometric may be created for different applications;
4. Untraceable templates from multiple applications cannot be linked together; and
5. An untraceable template may, however, be revoked or cancelled.
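Here is a minimal sketch of one way such a template could be built: a per-application random salt keyed into a one-way HMAC. The construction and function names are my own illustration, not Dr. Cavoukian’s design, and it glosses over the hard part of real biometric encryption, namely matching noisy readings (practical schemes need fuzzy extractors or similar error-tolerant constructions rather than an exact hash).

    import hashlib
    import hmac
    import os

    def enroll(biometric_features: bytes, app_id: str) -> dict:
        """Create a per-application template; the raw biometric is never stored."""
        salt = os.urandom(32)  # fresh secret per (user, application) pair
        template = hmac.new(salt, app_id.encode() + biometric_features,
                            hashlib.sha256).digest()
        return {"app_id": app_id, "salt": salt, "template": template}

    def verify(record: dict, biometric_features: bytes) -> bool:
        # Recompute the keyed digest from a fresh reading; without the salt,
        # the stored template reveals nothing about the biometric itself.
        candidate = hmac.new(record["salt"],
                             record["app_id"].encode() + biometric_features,
                             hashlib.sha256).digest()
        return hmac.compare_digest(candidate, record["template"])

    def revoke_and_reissue(record: dict, biometric_features: bytes) -> dict:
        # Feature 5: revocation is just re-enrolment under a fresh salt; the
        # old template becomes useless while the biometric itself is unharmed.
        return enroll(biometric_features, record["app_id"])

Under this construction the stored record exhibits the first four features directly: no image or conventional template is kept, the digest cannot be inverted to recover the biometric, each application gets its own salt and hence its own template, and digests created under different salts cannot be linked together.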
Commentary
So will it work? Of course it will; I am sure of that. However, my own view is that most data controllers and governments start from the instinctive position that they want to exploit their personal data assets to make savings, or a profit, or to deliver policy objectives. Hence they are unlikely to trust a technology that is perceived to be designed to protect data subjects by restricting that potential for exploitation. This is especially the case if policy-makers see their choice as being, for example, between leaving policy options open through data retention and closing them off through data minimisation.
In addition, I am sure the spooks and police forces will simply love the concept of “untraceable biometrics”; as soon as it causes a problem, they will pressurise Ministers for laws that require the untraceable to become traceable. That is what the UK Government is doing at the moment: it wants telecommunications companies to capture communications data that they don’t collect now.
Dr. Cavoukian’s counter-argument is that her technical solutions can deliver both “security and privacy” (see her fourth principle). There is “security” because the untraceable can be traced if need be, and “privacy” because those who do not need to be traced remain untraceable. I worry about this argument. If one postulates that there are circumstances in which the anonymous becomes the identifiable, the untraceable becomes the traceable and the encrypted becomes the unencrypted, then why not allow the authorities to collect more personal details, safe in the knowledge that these extra data will only be accessed if there is a pressing need?
In addition, if there is “security and privacy”, there has to be a mechanism that transforms “privacy” into “security”. This in turn leads to the laws that govern this transformation, the political system that enacts those laws, the way privacy laws work in practice and the role of the privacy regulator. That in turn leads to detailed questions such as: “Do we have a judicial warrant to allow access to the encryption algorithm, or will a billet-doux signed by a senior police officer suffice?”; “Should the regulator report to Parliament or the Home Secretary?”; and “What powers should the regulator have?”. Very familiar territory, isn’t it?
I think that a cultural change is needed. Currently, the data protection regime is often seen by data controllers as a barrier to the “proper” use of personal data. Indeed, I wish I had £10 for every time Data Protection Officers have been asked to be “innovative” in an attempt to get round a data protection requirement. It is this cultural view that explains why Commissioners’ pleadings to data controllers to implement privacy-enhancing solutions to protect data subjects usually fall on deaf ears.
In my view, that culture will only change when individual data subjects in the marketplace demand a change. When consumers or voters begin to say “my choice of supplier or government depends on how well they protect my privacy”, then the political and managerial system will consider responding with the implementation of privacy-enhancing solutions. That change in culture is developing slowly on the Internet, aided by some lackadaisical security practices; however, I think we are years or even decades away from that user-centric privacy nirvana.
In short, Dr. Cavoukian’s ideas do allow privacy advocates to argue a convincing privacy case to data controllers; however, we should realise that it is ultimately the attitude of data subjects, forcing data controllers to respond to their expectations, that will deliver the eventual cultural change.