
Amberhawk


11/01/2010

Listed below are links to weblogs that reference Privacy by Design can accelerate the decline of privacy:

Comments


Dr Cavoukian's comments that these scanners would enhance privacy were fundamentally flawed in that she assumed the scanners would be the only method of security employed at airports.

This is not the case: they will be used alongside the traditional pat-down methods. So all of her comments that this would remove visual bias (such as security guards patting someone down because they look 'shifty') betray a complete misunderstanding of how these scanners will be used.

This, in my opinion, made her entire article completely moot.

This analysis is less convincing than it should be for two reasons: first, PbD can be better represented throughout, and second, the conditions behind the slippery slope argument presented have not been fully elucidated.

To embrace PbD simply means that a system or technical implementation will not have privacy layered on as an afterthought, in response to external input such as complaints or regulatory duties -- but rather that the controls and functionality required to assure privacy will be inherent. Those from the information security side have long known that this is the preferred manner to implement controls. If one wishes a system to provide privacy, PbD says this should be a design-level consideration.

How building such controls into systems raises a "descent of privacy" fear is beyond me; in real-world work with system designers and implementers, design-level privacy efforts have in fact considerably raised the profile of privacy, and put it in a more actionable context for such an audience. It also shifts privacy from a policy/procedures-driven exercise to one more integral to the system in place. Again, it's not clear how technically bolstering privacy capability or making controls inherent to the system places us in a "descent of privacy" situation.

The idea that PbD will somehow embolden or empower those who want "security at the expense of privacy" is strange; even a cursory look at the lay of the land shows that folks who see things this way require no such assistance with their agenda, and have much of the buy-in they'll ever need as things stand now.

The risk and cost analysis that's supposedly taking place during the slippery slope portion of this post needs to be examined in greater detail. There is nothing about the use of PbD techniques that alters the threat environment, nor inherently reduces the cost of systems with potential privacy impact. All of that analysis must still take place prior to implementation, and the piece here does not convince that PbD itself would skew the cost-benefit analysis behind enhanced adoption of such systems. There is also a litany of other factors that would have far more direct influence on such analysis. As the post here mentions, CCTV has proliferated without the help of any such PbD effort -- which is precisely why CCTV is a prime candidate for privacy enhancement (something that Cavoukian has spoken about in the past). Why not improve upon existing systems?

Finally, that those in control may make the argument to "remove anonymity" in response to events is not only irrelevant to older systems already in place, but even more trivially accomplished within a policy/procedures model for these systems vs. a PbD model where the system is built to offer inherent controls. If cost is being held up here as a powerful deciding factor in adoption, then surely abandoning simple policy and practice is less expensive than re-engineering the system itself.

You are assuming that the cost of the scanner is the major limiter, and as it falls, scanners will become ubiquitous.

The cost of the salary for the person watching the screen is important too, if the goal is to stop attacks in real time.

I certainly agree with the argument that applying privacy controls to a fundamentally invasive system does not make it privacy-friendly, but then I don't think that is what Privacy by Design is about, and moreover I'm not convinced that whole body scanning is a valid case study - Dr Cavoukian may have picked a poor example here.

To suggest that these body scanners undermine the broader principles of Privacy by Design is unreasonable, since there is no evidence of Privacy by Design in use here. Privacy by Design is about building systems that fundamentally respect and preserve personal information rights throughout every aspect of their processing. A machine that is designed to undress people is clearly never going to be privacy-friendly no matter how one tries to apply controls: it is, however, possible to minimise the risk of information leaking outside of the operating environment, being stored, or being aggregated with other information.

Furthermore, the system is a poor example because most reasonable individuals would tolerate a degree of privacy loss to preserve their safety in an environment such as air travel, where we generally have an imbalanced perception of risk because of the terrible visual impact of the likes of 9-11.

I would almost (but not quite) like to see these systems operated without privacy controls, so that public outrage forces politicians to accept the unpalatable (and unglamorous) fact that the only effective way to catch terrorists in airports is to look for the terrorists, not their bombs. Detailed profiling, targeting of security inspections, watching body language - these skills have proven time and again to be much more useful than machines. But they're much less camera-friendly, and there will of course always be individuals who are unfairly targeted by profile-driven security checks.

The choices are tough. But this particular case doesn't herald the end of Privacy by Design.

If one had to choose between being scanned by one of these scanners or strip searched at the check-in desk, the scanner is the less bad option. I have no wish for a physical internal examination, still less one in full view of everyone traveling that day. Mind you, I would prefer not to be scanned at all, and I would like to be able to travel with a pair of nail scissors in my washbag.

If there is truly a need for such things to be done, I am relieved that someone is trying to make them less unpleasant than they might be. That said, less bad is less bad: it should not be confused with being good.

The Center for Democracy and Technology (CDT), in its Dec 21 testimony and submission to the FTC Roundtable on Consumer Privacy, has strongly endorsed Dr. Cavoukian's Privacy by Design approach. Details:

Overview: www.cdt.org/blogs/erica-newland/privacy-design

Submission: www.cdt.org/files/pdfs/20091221_ftc_comments_privacy_design.pdf


All materials on this website are the copyright of Amberhawk Training Limited, except where otherwise stated. If you want to use the information on the blog, all we ask is that you do so in an attributable manner.