Following an FOI request instigated by yours truly, it has emerged that GCHQ are keeping tabs on FOI requestors. Even those requestors who have merely asked public authorities for a copy of GCHQ’s widely distributed, declassified, IT security documentation. The circumstances of my request illustrate why the FOIA exemption (Section 23) pertaining to “national security” is ridiculously wide, and raise questions as to what “national security” now means.
The information that was the object of my desires is sexily entitled: “HMG IA Standard No.6: Protecting Personal Data and Managing Information Risk”. Each page is headed with the capital letters “NOT PROTECTIVELY MARKED” (which shows that there are no national security considerations with the content) and a footer which states “This information is exempt under the Freedom of Information Act 2000 (FOIA) and may be exempt under other UK information legislation”. This is followed by “Refer any FOIA queries to GCHQ...” (contact details provided).
How do I know the document says this? Well, I obtained this unclassified document from a public authority website, and it is relatively easy for readers to obtain their own “samizdat” copy.
The reason for this relative ease is that GCHQ’s “HMG IA Standard No.6...” has been widely distributed, by GCHQ, to Central Government, Local Authorities, NHS bodies and a host of other public sector organisations that have to implement the HMG Security Framework (which is freely available, and a recommended download, on the Cabinet Office website). Private sector contractors, approved by GCHQ, are also given copies of HMG IA Standard No.6.
The publicly available Security Framework documentation refers to the secret “HMG IA Standard No. 6...” as a key document in helping to implement and understand the HMG Security Policy in the context of the processing of personal data. As this document (and other IA Standards like it) is not officially in the public domain, my FOI request was made with the simple intention of making it so. The objective is a simple one: how can the advice in the Security Framework be followed by other organisations if the supporting declassified documentation is not made available?
Despite the “NOT PROTECTIVELY MARKED” label, I discovered the document is exempt from the FOI Act because of the wide nature of the national security exemption in Section 23. This exemption requires only that the requested information “was directly or indirectly supplied to the public authority” by GCHQ, a position that was upheld by an internal review which, I hasten to add, had input from GCHQ.
So, suppose GCHQ supplied public bodies with its “Review of the collective works of Enid Blyton with a special emphasis on the roles of Big Ears and Noddy”. Such a document would be exempt from FOIA, as the test applied by the S.23 exemption is whether GCHQ supplied the text – not whether the exemption is needed in connection with any national security objective. As the S.23 exemption is an absolute exemption, there is no public interest test as to whether or not the document, if released, would prejudice national security.
This is unlike the national security exemption under the Data Protection Act, which provides for an exemption from the “fair processing” requirements of the Act if an exemption is necessary for “safeguarding national security”. As the document I requested is “NOT PROTECTIVELY MARKED”, my asking for it clearly does not raise any national security concern. It is therefore difficult to see why disclosure of personal data to GCHQ is (a) needed for presumably a national security purpose or (b) has to be kept secret from the data subject (i.e. the FOI applicant).
That is why I believe that, in the context of my kind of request, the disclosure of an FOI requestor’s details to GCHQ in the circumstances described above is a likely breach of the First Principle (unfair) and Third Principle (excessive), and also cannot be legitimised in terms of Schedule 2 of the DPA.
By contrast, if the requestor asked for content that did carry a “confidential” or “secret” or “top secret” classification, then one could see arguments that such a disclosure could be justified. Indeed, one wonders whether this is routine practice already, and details of certain FOI requests and requestors are routinely disclosed to national security agencies. Who knows?
The handling of my FOIA request for an innocuous document provides evidence that the national security agencies will apply whatever exemption they can get their hands on, whenever they can, irrespective of the circumstances of the case. The mantra appears to be “if it can be kept secret, it will be kept secret”.
This is a worrying conclusion because the meaning of “national security” is expanding almost as fast as the Universe. In the latest WikiLeaks episode, the press were full of reports that a “snake venom facility” in Australia was a national security concern for the USA. Are we to assume, therefore, that in the UK “national security powers” could be applied to justify the processing of personal data in connection with the production of snake venom?
It is interesting to note that the Intelligence Services Act 1994 provides the legal basis for GCHQ’s role in the provision of “bog standard”, declassified, IT security advice. The Act states that GCHQ’s role is “to provide advice and assistance about ...the terminology used for technical matters, and cryptography and other matters relating to the protection of information and other material” to central government and any other organisation chosen by the Prime Minister.
That explains why “national security” now embraces “providing advice and assistance on the security of communications and electronic data....”, business continuity and resilience planning (e.g. against Acts of God) and delivering “information assurance policy services” to public bodies. In fact, all situations envisaged by the Civil Contingencies Act 2004 are arguably now matters of “national security”.
The inclusion of areas such as IT security is an example of “national security function creep”. These wider dimensions add to the creep that has already arisen in the area of policing, as both the Intelligence Services Act 1994 and the Security Service Act 1996 extended the functions of the national security agencies to additionally support “the activities of police forces and other law enforcement agencies in the prevention and detection of serious crime”.
So when these national security agencies process personal data to support the police, these agencies do not process personal data for a “crime purpose” (which is the obvious purpose when the processing is to assist the police in connection with serious crime) but rather for the “national security purpose”. This position was upheld by successive Home Secretaries in the previous administration (see reference).
Why the difference? Well both exemptions are designed to protect the processing interests of the respective bodies, and they have successfully achieved this objective. For example, I have not heard the police arguing that the Data Protection Act stops them processing their criminal intelligence. However, the policing exemption in Section 29 of the Data Protection Act is fully subject to the Information Commissioner’s independent oversight, whilst the broader “national security purpose” in Section 28 is not. Hence the inevitable conclusion that the objective of morphing the “national security” purpose to include the “policing purpose” is to minimise and avoid independent supervision by the Commissioner.
I should add that this position cannot be effectively challenged, because Section 28 states that if a Minister signs a certificate that equates the “national security purpose” with the “policing purpose”, then that is the end of the matter.
The dilemma associated with balancing the national security purpose with individual privacy and transparency concerns is not a new issue. Back in 1979, the Lindop Report into Data Protection (Cmnd 7341, paras 23.21-23.24) stated that the national security agencies should be subject to a data protection Code of Practice that was independently supervised.
The Report concluded that it was important to take the national security agencies out of their “hermetically sealed” environment in order to ensure that these agencies would be "open to the healthy - and often constructive - criticism and debate which assures for many other public servants that they will not stray beyond their allotted functions".
So I end this blog by asking a basic question. Do you think that the national security agencies have “strayed”? Does an expanding definition of “national security”, supported by a lack of transparency and accountability, carry the risk of encouraging such “straying”?
Currently, the national security agencies effectively decide for themselves how far the “national security purpose” stretches: this risks increasing the “national security function creep” and establishing a position that is less and less accountable by the day.
In short, I think the time has come urgently to revisit how the exemptions in the Data Protection Act and FOI Act should apply to the national security agencies.
References: See “Human Rights Legislation and Government Policy towards national security - 2006” which explores data protection in the context of weak regulation, a lack of Parliamentary and judicial scrutiny, and the national security purpose: http://www.amberhawk.com/policydoc.asp
Advert: We are running several sets of data protection courses next year. We are starting a set of the 7-day DP course in London (beginning 18th January) and running the 5-day intensive course in Edinburgh (beginning 24th February) and in Leeds (beginning 3rd March). These courses cover the DP ISEB syllabus and prepare delegates for the examination in April 2011, although you do not need to be seeking the qualification to attend.
Our next FOI course starts in Manchester on 26th January. As with Data Protection, these courses cover the FOI ISEB syllabus for the examination in April 2011, although you do not need to be seeking the qualification to attend.
Details on the “brochure” section of the Amberhawk website (www.amberhawk.com)
Is an IP address or a URL personal data? Should it be personal data? Anyway, given the Europe-wide review of the Data Protection Directive, I have decided to put my head above the parapet and describe what I think the definition of personal data should look like. Of course, readers might disagree – but all I would say is that if we don’t have any idea of what the definition should look like, then Governments and the European Commission will happily impose a definition of personal data that suits them.
For convenience I shall amend the definition found in the UK Act (below). Paragraph (c), in italics, has been added, and the final part of the Act’s definition (from “and includes...”) has been removed. My proposed definition thus reads:
“personal data” means data which relate to a living individual who can be identified-
(a) from those data, or
(b) from those data and other information which is in the possession of, or is likely to come into the possession of, the data controller, or
(c) from the data and other information which has been provided by, or is likely to be provided by, the data subject
[struck out:] and includes any expression of opinion about the individual and any indication of the intentions of the data controller or any other person in respect of the individual;
The paragraph that has been struck out removes the legal arguments underpinning the controversial Durant judgment from the Court of Appeal. In that judgment, the Court ignored the Parliamentary record which explained why the text about “opinions” and “intentions” had been included, and decided to interpret the definition without that input.
In Durant, the Court determined that the fact that the definition of “personal data” specified “intentions” and “opinions” as components of personal data meant that the definition had to be a narrow one. This was because, if the definition of personal data were a broad one, there would be no need to specify “intentions” and “opinions” as part of it. Having rejected a broad definition of personal data, the Court then concluded that the words “relate to” had to be construed narrowly, so that personal data had to be “focused” on the data subject or had to possess “biographical significance” for the data subject.
That is why removal of the text that begins with “... and includes...” should confine Durant to the dust-bin of data protection history.
The paragraph (c) I have added (in italics) states that if a data subject provides the relevant identifying information to the service provider (e.g. name, IP address, URL, date, time of use of service etc.) then the data processed by the data controller become personal data.
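The three limbs of the proposed definition can be sketched as a simple predicate. This is an illustrative sketch only, not a statement of how any regulator would apply the test: all function and parameter names are hypothetical, and the `identifies` helper stands in for the (genuinely difficult) legal question of whether a set of sources identifies a living individual.

```python
# Illustrative sketch of the proposed definition of "personal data".
# All names here are hypothetical, invented for this example.

def is_personal_data(data, controller_info, subject_provided_info, identifies):
    """Return True if `data` relates to an identifiable living individual.

    `identifies(*sources)` is a hypothetical helper that decides whether
    the combined sources identify a living individual.
    """
    # (a) identifiable from the data alone
    if identifies(data):
        return True
    # (b) identifiable from the data plus other information in the possession
    #     of, or likely to come into the possession of, the data controller
    if identifies(data, controller_info):
        return True
    # (c) identifiable from the data plus other information provided by,
    #     or likely to be provided by, the data subject (the new limb)
    if identifies(data, subject_provided_info):
        return True
    return False
```

On this sketch, an IP address held by a service provider flips from anonymous to personal data the moment limb (c) is satisfied, i.e. when the data subject supplies (or is likely to supply) the linking information himself.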
So, for example, if Google Street View published details of say, a house, and if the individual concerned provided Google with details of the URL link to that house, and identifying details about himself, then the data becomes personal data and regulated by the data protection regime.
This linkage then engages the data protection regime and allows the data subject to raise the issue of whether it is legitimate for Google to continue to process his personal data (e.g. whether the data subject’s right to object to the processing should prevail). Similarly, service providers who process IP addresses in connection with marketing-related activities would, if the mechanism above is used, need to satisfy the right to object to processing for marketing purposes.
I accept there is a philosophical objection to a process that requires the data subject to identify himself to the ISP or Google or whoever. Many would find such an idea anathema; after all, many hold it as axiomatic that Internet use should be anonymous.
However, all I ask is that such readers put that objection to one side for the moment, and consider the impact of the words “....or is likely to be provided by, the data subject” in the definition I propose.
This covers the situation where the data controller has not got the identifying details from the data subject but there is a reasonable expectation that the controller may be furnished with such details in future. So, if more and more individuals contact a service provider in order to disclose identifying details as described above, the more a service provider should anticipate that further identifying details are “likely” to come into his possession.
In other words, there will come a time when the number of contacts made by data subjects with the service provider will be such that a service provider will be obliged to treat the data as personal data on ALL users (i.e. without the need for any further contacts from any other data subject).
For example, suppose a supplier has a database of 1,000,000 IP addresses and there is only one data subject contact. One would argue that the frequency of contact from data subjects is such that a future contact from another specific individual is “possible” rather than "likely". However, suppose there were 100,000 data subject contacts – one would then say that contact from any specific individual in future is “likely” to occur. If you agree with this conclusion, then the data controller will be processing personal data (even though 900,000 data subjects have not identified themselves).
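The worked example above can be sketched as a simple threshold test. To be clear, this is my own illustrative sketch: the 10% threshold is entirely hypothetical (the argument does not fix a number at which "possible" becomes "likely"), and the function name is invented for this example.

```python
# Illustrative sketch of the "likely to be provided" threshold argument.
# The 10% threshold is entirely hypothetical; the text does not fix one.

def future_contact_is_likely(total_records, subject_contacts, threshold=0.10):
    """Decide whether contact from any further data subject is 'likely'.

    total_records:    number of records (e.g. IP addresses) held
    subject_contacts: number of data subjects who have already made contact
    """
    if total_records == 0:
        return False
    return (subject_contacts / total_records) >= threshold

# Worked example from the text: 1 contact in 1,000,000 records is merely
# "possible"; 100,000 contacts in 1,000,000 makes further contact "likely",
# at which point ALL 1,000,000 records would be treated as personal data.
```

Whatever the threshold chosen, the design point stands: once it is crossed, the controller must treat the whole database as personal data, not just the records of those who have made contact.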
Note that the definition empowers data subjects – they decide what happens. The frequency of contact will depend on how all users of an Internet service view a particular service; the greater the suspicions about a service, the greater the number of contacts from users. The definition thus includes an implicit encouragement to service providers to arrange their processing affairs transparently, so that users do not provide their details and engage the Act. Indeed, I would also expect consumer and privacy advocates to develop software applications to allow users to compile a log of the necessary URLs and IP addresses so they can be furnished to data controllers.
I think the approach I suggest is better than a definition of personal data that includes a description of technology. Just look at the UK’s new Privacy and Electronic Communications Regulations (PECR) that are under discussion at the moment (see blog of 23 September 2010; “Coalition Government chooses to minimise privacy protection against spammers and behavioural advertisers”).
This is the third time that these PECR provisions have been enacted since 1997, mainly because the law in this area has had to be refreshed in order to play “catch up” with new developments in technology. My solution is not dependent on future technical development; the data subject can assume a degree of control whenever they want to.
James Callaghan, a Labour Prime Minister of the 1970s, once told a new Minister that if he behaved like a doormat, he should expect to be trodden on. I think this sentiment also applies to the privacy of data subjects. They are responsible for their own privacy – no-one else is.
It follows that the law should provide individuals with the means to protect themselves whenever they feel an organisation has trodden on their privacy – and that is what my definition of “personal data” delivers.
Advert: We are running several sets of data protection courses next year. We are starting a set of the 7-day DP course in London (beginning 18th January) and running the 5-day intensive course in Edinburgh (beginning 24th February) and in Leeds (beginning 3rd March). These courses cover the DP ISEB syllabus and prepare delegates for the examination in April 2011. Our courses are structured so they are also suitable for those who do not seek the ISEB qualification. See the “brochure” section of the Amberhawk website (www.amberhawk.com)
All materials on this website are the copyright of Amberhawk Training Limited, except where otherwise stated. If you want to use the information on the blog, all we ask is that you do so in an attributable manner.