I thought I would write a blog on how the current GDPR applies to the tracing of people via the APP being promoted by NHSX. There is a dearth of data protection detail (NHSX has yet to publish a DPIA), and I think this could be a useful contribution to the public debate (especially as Ministers are promoting the APP heavily). Note: the DPIA for the Isle of Wight was published on May 8 (see references for a link).
So please comment as appropriate.
How the APP works
NHSX claim that every operation related to the APP is voluntary (e.g. to download, to communicate to a central database if need be). Users can choose to delete the APP at any time; this would also delete the data on the phone but not any data previously uploaded to NHSX’s central database.
The APP does not use any identifier already on the phone (e.g. phone number, SIM card identifier) or location details. However, if you download the APP it generates a random ID number for your phone which is exchanged with other phones that have also downloaded the APP. A record of any contact with APP-loaded phones (e.g. proximity and time) is kept on both phones.
So, suppose someone catches COVID19. They can volunteer to upload their ID number to NHSX’s central database, together with a record of all the interactions with other APP-enabled phones. This data then goes through a risk assessment to calculate whether the other APP phones need to be contacted with a warning message for the user (presumably something like: you have been in contact with someone exposed to COVID19; isolate or get a COVID test).
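To make the mechanism concrete, here is a minimal sketch of the flow just described: random ID generation, a local contact log kept on each phone, and a central risk assessment. All names, data shapes and thresholds (2 metres, 15 minutes) are my own illustrative assumptions, not the actual NHSX implementation.

```python
import time
import secrets

def generate_app_id() -> str:
    """Each install generates a random ID unrelated to the phone number or SIM."""
    return secrets.token_hex(16)

class Phone:
    def __init__(self):
        self.app_id = generate_app_id()
        self.contact_log = []  # local record of encounters with other APP-loaded phones

    def record_contact(self, other_id: str, proximity_m: float, duration_s: int):
        """Both phones keep a record of any encounter (proximity and time)."""
        self.contact_log.append({
            "id": other_id,
            "proximity_m": proximity_m,
            "duration_s": duration_s,
            "timestamp": int(time.time()),
        })

def risk_assess(contact_log, close_m=2.0, long_s=900):
    """Central-server sketch: pick out the IDs that warrant a warning message.
    The thresholds are assumptions for illustration only."""
    return [c["id"] for c in contact_log
            if c["proximity_m"] <= close_m and c["duration_s"] >= long_s]
```

On a volunteered upload, the server would run something like `risk_assess` over the uploaded contact log and message the matching IDs; the point of the sketch is that the ID, not the phone number, is what gets singled out.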
The NHSX website (see references) makes the following claims:
- “The APP doesn’t have any personal information about you, it doesn't collect your location and the design works hard to ensure that you can’t work out who has become symptomatic.
- NHSX systems don’t build a social graph in the traditional sense, although they do have pairwise proximity events for anonymous identities.
- The design makes sure that it’s hard to use the APP to track you by being physically close to you - although again there are balances to be struck.
- The back end is built to be as secure as is practical, but remember it holds only anonymous data and communicates out to other NHS systems through privacy preserving gateways, so data in the app data can't be linked to other data the NHS holds.”
Finally, the first half of the postcode is collected from APP users; this is to alert local hospitals of any surge in COVID19 cases. I understand that the make and model of the phone is also retained to facilitate communications. The intention is to allow the data to be used for “pandemic research purposes” (or lessons learnt for “future pandemic management”).
Definitional issues
There are three definitional issues: (a) is personal data processed? (b) if so, who is the controller processing personal data? and (c) if there is a controller, does the voluntary nature of the APP equate to “data subject consent”?
First to the claim that the APP has no “personal information about you”: is that the same thing as processing “no personal data as defined by the GDPR”? The answer is “NO”; I am confident that personal data are processed.
To get to this position, you need to consider the definition of identifiable living individual.
An identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person. (my emphasis)
So is the NHSX system singling out a particular individual via a specific ID number to send an alert message to a specific phone because of the exposure to COVID19? Answer: “YES”. When the individual responds to that alert message (e.g. makes contact with NHS 111 online), is it reasonable to expect that they will identify themselves and say they have had a message via the APP? Answer: “YES”.
Does the message relate to the health of the individual? Answer: “most likely” (as I have no detail of the message sent to the individual’s phone).
So I am pretty sure the APP is processing personal data and special category personal data. As such it needs an Article 6 lawful basis and a condition that overcomes the prohibition in Article 9(1) on processing health personal data.
So who is the Controller? Well, it appears that there are joint controllers, as the NHSX website states that “NHS England and NHS Improvement and the Department of Health and Social Care are delivering NHSX together”. Article 26 requires that joint controllers “shall in a transparent manner determine their respective responsibilities” for the processing and make that arrangement transparent to the data subject. This has yet to be done.
Is there "data subject consent" for the processing given that the APP is voluntary? Well, the NHS text on the website is very careful to avoid the use of the “C word” and it is easy to see why. For instance, suppose the following:
- If members of the public are desperate to get back to work and generate a household income, will they feel obliged to download the APP?
- Is it reasonably likely that sometime in the future, if someone is claiming benefits related to COVID and has not downloaded the voluntary APP, they will be asked why they have not done so?
- Is it possible that if the phone is lawfully seized by the authorities (e.g. in rape cases), the content of COVID messages will be studied?
In addition, Recital 43 states that with respect to consent:
- “consent should not provide a valid legal ground for the processing of personal data in a specific case where there is a clear imbalance between the data subject and the controller” (is there such imbalance do you think?)
- “in particular where the controller is a public authority and it is therefore unlikely that consent was freely given in all the circumstances of that specific situation.”
So, back to the question: does voluntary mean “consent”? At best the jury is out, but I strongly suspect the answer is "NO". Indeed, I am very uncomfortable with consent as the lawful basis, especially as there is likely to be increasing pressure from employers, Government and the mere economic circumstances of data subjects to download this “voluntary” APP.
At best, I suspect, we can say “voluntary for the moment but perhaps not for ever”.
What does voluntary mean?
The next question to ask is: “what stops the current voluntary approach to the APP morphing into something like the “voluntary” arrangements made for the ID Card in 2006?”.
For those who don’t recall, the ID Card holder could “volunteer” to show their Card to any service provider but that service provider could decide to deny a service if the ID Card holder refused to “volunteer”. So if you wanted to interact with any public service, you chose to “volunteer”.
Lawful basis of the processing
The first comment is that any interference by a public authority (NHSX) is unlikely to infringe Article 8(1) of the European Convention on Human Rights. This is because Article 8(2) permits legislation to be enacted by a Parliamentary process (e.g. in the Coronavirus Act 2020) which sets aside the A.8(1) right in limited circumstances (e.g. any interference deemed necessary “for the protection of health”). It all hangs on the word “necessary”; it is either necessary or it isn’t.
If there is reliance on data subject consent (which I think is unreliable), then there is also no A.8 breach (so long as consent is properly formed).
Turning to the GDPR/DPA2018, if the processing is “necessary” it will have an Article 6 lawful basis (e.g. candidates are A.6(1)(c) or A.6(1)(e); A.6(1)(d) could be used in a case-by-case emergency). If there is need for an A.9 condition that lifts the prohibition on processing of health personal data, the candidates are A.9(2)(c), A.9(2)(g), A.9(2)(i), or Schedule 1, paragraphs 2, 3 or 6 of the DPA2018.
Similarly, any research purpose has to have a lawful basis unless the data are truly anonymous.
If there is reliance on consent, or on necessity for a legal obligation or public task (or even the necessary legitimate interests of the controller/Third Party), the controller has to specify publicly (A.13; A.14) what happens to the personal data if there is withdrawal of consent or exercise of the right to object to the processing.
Finally in this section, can I repeat my long-expressed view that the ICO should take up Article 8 ECHR cases, on the grounds that “necessary” as used in the GDPR has the same meaning as in Article 8 (see references).
Data Protection Principles and rights
In summary, I can’t find a general exemption in Schedules 2 to 4 that could apply in these COVID APP circumstances - but as we shall see, this could change.
Currently, data subject rights are not exempt (e.g. the right to be informed, the right of access), nor are the Principles that relate to data minimisation, storage limitation, purpose limitation and security. This means the processing has to be transparent to data subjects (A.13; A.14); must use the minimum personal data; must retain the personal data for the minimum length of time; and has to be secure. The purpose of the processing is limited to contacting people in an era of COVID19 or perhaps the related research purpose (if that occurs).
Any change of purpose (i.e. function creep) has to be identified to data subjects (see A.13(3); A.14(4)). The research purpose (if not using anonymous data) has to be transparent, have a lawful basis, and subject to security and data minimisation obligations.
In summary, the GDPR provisions do a good job for data subjects – assuming they are adopted and enforced appropriately.
Centralised or decentralised?
A major debate is whether the APP should be supported by a central database or decentralised around the user’s phone. NHSX prefers the centralised approach (at the moment), claiming it offers more flexibility re COVID19 monitoring. The ICO clearly prefers a decentralised model, as a centralised database would be more difficult to secure and more open to later function creep.
In summary, a centralised approach is vulnerable to the future legislative mood of Government; a decentralised approach would reduce that risk.
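The architectural difference can be sketched in a few lines. In the centralised model the infected user's contact log is uploaded and the server decides whom to notify; in the decentralised model the server only publishes infected IDs and each phone matches against its own local log, so the contact graph never leaves the device. Function names and data shapes below are illustrative assumptions, not any real implementation.

```python
def centralised_match(server_contact_logs, infected_id):
    """Centralised model: the SERVER holds the uploaded contact logs and
    works out which IDs to send a warning to."""
    return set(server_contact_logs.get(infected_id, []))

def decentralised_match(local_contact_ids, published_infected_ids):
    """Decentralised model: the server merely publishes infected IDs; each
    PHONE checks its own local log, so no contact graph is held centrally."""
    return set(local_contact_ids) & set(published_infected_ids)
```

The privacy point follows from the signatures alone: `centralised_match` requires the server to hold everyone's contact logs (a repurposable asset), whereas `decentralised_match` needs only a public list of infected IDs, leaving far less for future legislation to repurpose.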
For instance, suppose a future law defined some health related function creep (e.g. a move to direct identifiability of data subjects). This could fall within the exemption in A.23(1)(e) which allows health exemptions for:
“other important objectives of general public interest of the Union or of a Member State, in particular an important economic or financial interest of the Union or of a Member State, including monetary, budgetary and taxation matters, public health and social security” (my emphasis)
If such an exemption were to be enacted, then Article 6(4) would disapply the Purpose Limitation Principle and potentially all data subject rights (e.g. the right to be informed). In addition, because of Brexit, powers in the European Union (Withdrawal) Act 2018 could be used to modify any UK_GDPR provision without recourse to Parliamentary scrutiny. This could involve modifying any Principle or obligation that is not already subject to GDPR flexibility.
It could be tempting (e.g. to reduce the pressures on the public purse) for Government to enact legislation that makes certain processing of personal data compulsory. For instance, to prove entitlement to a COVID related benefit, there could be a requirement to show that you have downloaded the APP and have received the COVID warning message.
Note that if direct identification were ever developed, then law enforcement function creep would be inevitable (e.g. following a severe terrorist incident, if an identifiable track and trace capability had been established).
The above explains why a centralised approach requires complete trust in our legislators not to legislate for function creep. By contrast, a decentralised model based on each user’s phone would make it far more difficult to get the technology to work for such function creep purposes.
Other issues?
Should there be a separate commissioner for the APP? My answer is “NO”, if the ICO commits to enforcing “necessary” and the data protection issues relating to Article 8 of the Human Rights regime (as has been done twice before; see references). This is because the word “necessary” permeates Articles 6 and 9 and two Principles in Article 5.
I should add that the concept of fairness and lawfulness additionally involves other Human Rights obligations (e.g. not to be discriminated against on grounds of race, religion or sexuality).
Is specific APP legislation needed instead of DP legislation? In theory, if the GDPR fully applies then I think this step is unnecessary.
Will the APP work as planned?
The Government primarily wants the APP to get the economy moving again. So suppose someone who is desperate to remain employed receives a message on their phone that they should be tested, knowing that if the test is positive, they should self-isolate.
Will such a person adhere to the message, or would they continue to work, especially if they are symptomless or have experienced minimal symptoms? Would an employer be sympathetic to more time off work? Will the benefit or furlough system require proof of the message or download of the APP in order to receive a benefit? Is it easier for that person to delete the APP to avoid the above hassle?
Could the APP work in a contrary way? For instance, someone more youthful than me might argue: “I don’t need to social distance anymore because I have loaded the APP, and I will be warned of any danger. Pubbing and clubbing – here I come!”
Important questions yet to be answered. Hope you found this useful.
References
Details of the APP: https://www.ncsc.gov.uk/blog-post/security-behind-nhs-contact-tracing-app and https://www.nhsx.nhs.uk/covid-19-response/nhs-covid-19-app/
Recording of Human Rights Committee meeting (ICO involved) re the APP https://parliamentlive.tv/event/index/6f0f52cf-9fda-4785-bf63-af156d18b6c7
Actual Human Rights Committee Report (published 3 days after the blog): https://committees.parliament.uk/committee/93/human-rights-joint-committee/news/146351/report-on-the-contact-tracing-app-published/
NHSX privacy policy: https://www.nhsx.nhs.uk/privacy-policy/
DPIA: (added 8th May) accessible from : https://faq.covid19.nhs.uk/article/KA-01043/en-us
Information Commissioner’s enforcement proceedings links Article 8 ECHR to unlawful processing: https://amberhawk.typepad.com/amberhawk/2012/11/information-commissioners-enforcement-proceedings-links-article-8-to-unlawful-processing.html
Upcoming Data Protection Courses (in London)
Obviously COVID19 has put a spanner in the training works, but hopefully the following courses will be running from late June (fingers crossed).
All courses lead to the relevant BCS qualification:
- Data Protection Foundation: July 7-9 (3 days)
- Data Protection Practitioner: July 14-16 and September 8-10
- Data Protection Upgrade Practitioner: June 23-24 (2 days)
Full details on www.amberhawk.com or by emailing [email protected]
In addition to the data protection aspects, as usual well explored by Chris, the potential for abuse is huge.
Given that users will self-diagnose, a desire to remain off work or school could induce someone to report falsely that they are symptomatic. This would spread quickly and exponentially. It could be engineered to extend absence by 14-day intervals through repeated "exposure" (a challenge shared by the Apple/Google API).
"Genuine" false positives could abound, and data could be exchanged between adjacent flat-dwellers: imagine two back-to-back bedrooms where the occupants sleep with their phones on their bedside tables a foot or two apart.
And there's a big question over its adoption: only some 20% of Singapore's residents downloaded that country's app.
Then there is the issue of incompatibility and the potential for the app not to be acceptable to other countries.
A typical f**k-up by HMG/PHE/NHSX/whoever, familiar to me from my days in 1991 as Technical Architect for Security on the NHS-Wide Networking programme (abandoned a few years ago after effectively wasting £10b). [You will remember those days, Chris.]
[Keep well; keep sane; and keep up the good work.]
Posted by: Michael B | 06/05/2020 at 12:01 PM
How would a SAR work in the context that the Controllers for the app have no means to identify you and in order to satisfy the SAR you’d have to provide the information to identify you?
Posted by: Anthony Weaver | 06/05/2020 at 02:47 PM
Excellent analysis from Chris (no less than we would expect from him); it certainly reflects very clearly the concerns that the privacy community will have considered, and supports our reservations with clear legal argument. The risk of associated mis-use and, of greater concern, purpose creep eating away at civil liberty is one we should all be very cognisant of.
Thank you Chris
Posted by: Cindy Paul | 10/05/2020 at 08:19 AM