The ICO’s enforcement (or lack of enforcement, depending on your view) in the Royal Free/DeepMind case has divided the data protection community. The ICO found that the Royal Free had breached four data protection principles and had breached the medical confidentiality of 1.6 million patients, but concluded that such breaches warranted only an Undertaking.
Reaction from many data protection specialists has often been along the following lines:
- “If a breach on this scale, involving the Sensitive Personal Data of millions of patients, does not warrant a Monetary Penalty Notice, what does?”
- “At the very least, the ICO should have served an Enforcement Notice to require the deletion of the personal data.”
- “This Undertaking is a major let-off and will undermine enforcement in general, as well as confidence in an ICO that appears weak and ineffective.”
By contrast, reaction from commentators in the press and the research community has often been of the kind: “The enforcement is an interference with valuable medical analytics that could prove to be immensely beneficial to patients”.
For example, David Aaronovitch, in his Times column (6 July), remarked that these circumstances arose because many privacy fundamentalists were also Brexiteers (i.e. they obviously have too many loose screws), and then lost his argument by claiming that DeepMind did not breach the Act (not understanding that, as a data processor, DeepMind had no obligations under the Act).
In this blog, I explain why many data protection specialists were too hasty to condemn, and why the Undertaking was the only course of action.
I also think that the Royal Free could have rejected the Undertaking and called the ICO’s bluff, to see whether she would serve an Enforcement Notice which (as I explain below) would be likely to be overturned if appealed to the Tribunal.
Finally, I think the Royal Free Undertaking exposes a serious problem with the law in relation to Enforcement and Monetary Penalty Notices in general.
Is DeepMind a data processor?
The first thing to understand is that the ICO’s analysis limits itself to the testing phase, whereas the Undertaking considers the live deployment of the system. The Undertaking, because the data controller accepts it, can focus on such further processing by the Royal Free; if the ICO had gone down the route of formal enforcement, the focus would have been only on the testing purpose.
In the ICO Letter that accompanies the Undertaking, the Commissioner writes: “For the avoidance of doubt, the investigation has proceeded on the basis that the Royal Free is the data controller under the Act” (i.e. DeepMind is firmly a data processor). An unusual phrase, that “avoidance of doubt”; it suggests, perhaps, that DeepMind might not be a processor but a controller.
In the live deployment phase, I think that for much of the processing currently undertaken by DeepMind, it acts as a data controller. For instance, DeepMind was engaged to develop and deploy a new clinical detection, diagnosis and prevention application for the Trust. The Trust merely provided identifiable information on patients who had presented for treatment at the Trust in the previous five years for pathology tests, together with data from the Trust’s existing radiology and electronic patient record systems.
So who is applying the skills that determine the purpose and manner of the processing: what personal data are processed, how, and for which clinical diagnosis, detection etc. purposes? Well, I think it is DeepMind and not the Trust.
By keeping the analysis at the “testing” level, the ICO’s Undertaking does NOT need to consider this complicated “who is a data controller?” aspect. It can happily accept that DeepMind is a data processor.
So why no Enforcement or Monetary Penalty Notice?
Consider the Enforcement Notice arrangements in a context that is limited to this “clinical testing” purpose. To serve an Enforcement Notice, the Commissioner has to be satisfied that there has been, or is, a breach of a Principle (or Principles). Well, that’s OK: there has been a breach of four Principles.
However, the ICO must then consider whether damage or distress has been, or is likely to be, caused to any person by the processing.
I agree that data subjects were not informed about the “clinical testing” purpose and the data processor contract was, to put it politely, “deficient in many areas”. However, does this testing purpose damage the interests of data subjects or cause distress (or is it likely to do so)? Is there any evidence of data subjects queuing up to claim distress because some kind of secret testing routine was underway?
If you look at the actual testing arrangements in the Undertaking, the health personal data did not leave the Trust’s control (the Undertaking says the Trust is the only data controller, remember). Additionally, the ICO raised no issue concerning a risk of data loss or unauthorised access leading to the personal data being used for other purposes. The only relevant purpose (as far as a putative Enforcement Notice is concerned) would be “clinical testing”.
The same problem arises with a Monetary Penalty Notice. Here again, the test is that there has been a serious contravention of a Principle (yep, we have that) and that the contravention was of a kind likely to cause substantial damage or substantial distress (oops).
If damage and distress are difficult to establish for an Enforcement Notice, then the higher threshold of substantial damage or substantial distress certainly won’t be met for a Monetary Penalty Notice. If you remove these two from the enforcement equation, you are left with an Undertaking.
Now look at what would happen if a formal Notice were served about the testing phase. Would there be an Appeal?
I can just imagine a Tribunal hearing where Professors and Nobel laureate researchers rock up to say: if you can’t test a clinical system to see whether it improves patient outcomes (i.e. processing 100% in the interests of the data subject), what can you do?
Then they would say “yes”, there were some minor technical errors (e.g. a missing sentence or two in a fair processing notice or in a data processor contract), but these textual omissions do not deserve a formal Notice. After all, the personal data never left the Trust, there were no security concerns, and there were no complaining data subjects.
And if there were to be an Appeal would DeepMind call on Google’s legal resources to assist behind the scenes? You bet they would. To paraphrase in a Trumpian style: I think the ICO would lose bigly.
The Undertaking itself
Because there are no statutory rules governing an Undertaking, its content can extend to all operational aspects of the Royal Free/DeepMind relationship (e.g. when DeepMind could operate as a controller – even though the contracts in the public domain maintain the data processor pretence).
There again, to avoid this processor/controller issue the Undertaking refers to the “arrangement with DeepMind”. In relation to this “arrangement” the ICO wants:
- A PIA covering any future application development and use of personal data subject to the “arrangement”;
- The identification of a condition for processing personal data under Schedule 2 and a ground under Schedule 3 in relation to its “arrangement” with DeepMind. In practice, this means explicit patient consent or a Statutory Instrument, approved by Parliament, to authorise the processing.
- Details of how the Royal Free will comply with its duty of confidence to patients (there again, release from an obligation of confidence requires patient consent or statutory authority).
- An audit, undertaken by a Third Party, of the current processing “arrangement” between the data controller and DeepMind, covering all aspects of the processing of the personal data. The Commissioner will approve the choice of auditor, agree the terms of reference, and retain the discretion to publish parts or all of the audit findings as appropriate. Roughly translated, this means the audit has to be undertaken by someone who knows about data protection, using terms of reference that the ICO knows will cover the main data protection issues.
In summary, the Undertaking requires the Royal Free to answer all the main questions, with the implication that, if it cannot do so, the processing has to cease.
Contractual issue
Can I add something to the above?
I always get suspicious when I see “arrangements” which use both “Data” and “Personal Data”, because you have to distinguish the two when reading the clauses. So when a provision states something like “we shall return or destroy all Personal Data at the end of the contract”, it can be interpreted as “we are not obliged to return or destroy the Data associated with the contract”.
This becomes important when the case of Source Informatics (see references) is considered. In that case, obiter remarks were made about the Data Protection Directive 95/46/EC (as the 1998 Act was not yet in force). These had the effect of implying that the processing of personal data in order to produce anonymised data (i.e. the processing that strips off all patient identifiers so as to produce anonymous data) was not subject to the data protection regime.
This conclusion was reached because the purpose of the Directive was to protect privacy; in addition, the purpose of anonymisation is to protect privacy. So, if some processing of personal data is undertaken to protect privacy (e.g. to produce anonymous data), then this processing attains the purpose of the Directive.
It follows that the Directive does not apply to that processing which is undertaken to protect privacy. For instance, one does not need a Schedule 2 ground to transform personal data into anonymous data (e.g. by deleting the name, address and patient identifiers from the personal data).
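To make this concrete, below is a minimal sketch, in Python, of the kind of transformation that, on the Source Informatics reasoning, falls outside the data protection regime: deleting the direct identifiers from a patient record. The field names are invented for illustration; they are not taken from the Royal Free/DeepMind agreement.

```python
# Direct identifiers to be deleted before a record is treated as "anonymous".
# These field names are hypothetical and purely illustrative.
DIRECT_IDENTIFIERS = {"nhs_number", "name", "address", "postcode", "date_of_birth"}

def strip_identifiers(record: dict) -> dict:
    """Return a copy of the record with all direct identifiers removed."""
    return {field: value for field, value in record.items()
            if field not in DIRECT_IDENTIFIERS}

# A fictional pathology record, before and after the transformation.
patient = {
    "nhs_number": "943 476 5919",
    "name": "Jane Doe",
    "address": "1 Example Street, London",
    "postcode": "NW3 2QG",
    "date_of_birth": "1954-03-02",
    "creatinine_umol_l": 182,    # clinical result retained
    "test_date": "2015-11-04",   # clinical data retained
}

print(strip_identifiers(patient))
# -> {'creatinine_umol_l': 182, 'test_date': '2015-11-04'}
```

Note that, on the obiter reasoning above, this transformation is itself processing of personal data and yet escapes the regime entirely; and stripping direct identifiers says nothing about what the resulting data can then be used for, which is where the next paragraph takes us.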
So, as far as I can see, the DeepMind contract could allow the transformation of a million patient records into anonymous data which then have a potential use in profiling. For example, the life insurance industry could identify data subjects who apply for insurance and who, because of their circumstances, can be excluded from insurance (because the data held by DeepMind show a probability that life expectancy is poor, so the data subject presents a high risk of creating an unprofitable relationship – i.e. there is a pay-out on early death).
Such profiling can be used to cherry-pick those who are insured.
Undermining the enforcement mechanism in the DPA
Consider some processing of personal data that is “beneficial” to the data subject. For example, one of the requirements for data sharing in the Digital Economy Act 2017 is for any data sharing to be for “..the facilitation of the provision of a benefit (whether or not financial) to individuals…” (Section 35(9)(b)).
So, let us assume that some processing achieves this beneficial objective but wholly ignores the provisions in data protection legislation and related Codes of Practice.
Well, if the processing is 100% “beneficial” to data subjects, how can an Enforcement Notice or Monetary Penalty Notice be served (recall that the former needs the ICO to consider likely damage or likely distress; the latter requires consideration of likely substantial damage or substantial distress)? As the processing is 100% beneficial, it follows that no damage or distress occurs; the result is that the whole enforcement regime in the DPA collapses.
I now wonder whether this problem also contributed to the reason why the Royal Free escaped formal enforcement.
I have never been a fan of fixed penalty notices for transgressions of data protection legislation, but the Royal Free case has helped change my mind on that. The UK’s implementation of the GDPR must have provisions which close this loophole.
Concluding comments
As far as I can see, the Royal Free:
- has, without statutory authority from Parliament or the consent of the data subjects, decided to share personal data about 1.6 million data subjects via an “arrangement” with another company owned by Google. This in itself can generate all kinds of suspicions, wild rumours and scare stories.
- has, in the absence of consent or a statutory ground (or even approval from an ethics committee), undertaken processing that could breach medical confidentiality.
- has not informed the data subjects of the DeepMind “arrangement”, nor has it explained the objectives and purposes associated with the “arrangement”.
- has signed a contract with DeepMind that could permit the retention of data about medical outcomes which could be used for later profiling.
- has assumed the ends justify the means.
Medical researchers often state that they want patients to trust them with medical data. Ignoring the requirements of the Data Protection Act, to the extent that has occurred at Royal Free, has undermined that trust.
It would be nice if some medical researchers actually acknowledged this.
References
Royal Free Undertaking: https://ico.org.uk/action-weve-taken/enforcement/royal-free-london-nhs-foundation-trust/
Data sharing/data processor agreement at the time of the ICO investigation: https://www.whatdotheyknow.com/request/royal_free_nhs_trust_google_deep
DeepMind (including new contract): https://deepmind.com/applied/deepmind-health/working-nhs/how-were-helping-today/royal-free-london-nhs-foundation-trust/
Source Informatics cases: R v Department of Health ex p Source Informatics [2001] QB 424 (CA) which reversed Latham J. in [1999] EWHC 510 (Admin).
David Aaronovitch column: https://www.thetimes.co.uk/edition/comment/don-t-let-privacy-fears-halt-a-health-revolution-z2jkcknh8