The current A level results debacle has raised two data protection questions: does the right not to be subject to automated decision-making (A.22 of the GDPR) apply, and is the exemption for exam results (Schedule 2, paragraph 25 of the DPA2018) fit for purpose? The answer to both questions, in my view, is “NO”.
Despite yesterday’s Government U-turn, this blog shows that the two questions above are inter-related; for example, transparency of the processing of personal data associated with the current A level exams can be subject to the exam script exemption.
In summary, I think there has been no breach of transparency arrangements, no automated decision taking and there is no right of access to exam scripts. This blog explains why.
This position could threaten the UK’s adequacy determination from the European Commission because the justification for the exam script exemption appears to be wholly inconsistent with a CJEU judgement. That is a polite way of saying “complete rubbish”.
However, Ofqual’s processing of personal data can be challenged on the basis that the outcome of the processing is unfair but not much else.
Is there an automated decision?
First to the “automated decision” controversy, which has abated given yesterday’s Government U-turn. The ICO, who has dodged several bullets because of this volte-face, stated in her press statement of 14 August 2020 that:
“The GDPR places strict restrictions on organisations making solely automated decisions that have a legal or similarly significant effect on individuals. The law also requires the processing to be fair, even where decisions are not automated.”
“Ofqual has stated that automated decision making does not take place when the standardisation model is applied, and that teachers and exam board officers are involved in decisions on calculated grades.”
If one assumes that the ICO is satisfied that there has been no automated decision taking based solely on automatic processing of personal data, then it is important for the public to understand why. It would have been nice if the ICO had provided a modicum of reasoning for this view in her statement.
Possible reasons I can see for no automated decision are as follows:
- There is an appeals process, so a decision about admission is neither final nor is it based solely on automatic processing of personal data.
- It is not Ofqual as controller that is making the automated decision about University admission; Ofqual is processing the personal data, but it is the University that is making the decision about admission. Most data subject rights are satisfied by the controller that processes the personal data (e.g. the right of access is directed to the controller doing the processing). That is not the case here: the controller making the decision is not the controller doing the processing (and Ofqual is not a joint controller with each University).
- Evidence for this is the fact that different Universities are making different decisions with respect to student applications (i.e. some Universities gave space for an appeal under the old arrangements, some did not).
- The raw grade estimate for each candidate is not the estimate of one examiner. There are a number of possible human interventions on each estimated grade that are reviewed before input into an algorithm which merely standardises the grades across all candidates. In other words, the algorithm does not make an automated decision; the essence of any decision about final grades, obtained by a candidate, is taken prior to input into the algorithm.
- With respect to the processing of personal data, students were told months ago that they needed certain grades to obtain a place at the University of their choice; a decision making process that takes months is hardly “automatic”.
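To make the distinction in the bullets above concrete, the kind of “standardisation” step described can be sketched in a few lines of code. This is a hypothetical illustration only: Ofqual’s actual model was far more elaborate, and the function name, the 0–100 mark scale and the shift-to-historical-mean approach are all my assumptions. The point it illustrates is that the human-made estimates come first, and the algorithm merely adjusts them.

```python
# Hypothetical sketch of a standardisation step (NOT Ofqual's model):
# teacher-estimated grades go in, and the algorithm shifts them so the
# cohort average matches the school's historical average. The algorithm
# adjusts grades; it does not originate them.

def standardise(estimated_grades, historical_mean):
    """Shift estimated grades (0-100 marks) towards a historical mean."""
    cohort_mean = sum(estimated_grades) / len(estimated_grades)
    shift = historical_mean - cohort_mean
    # Clamp each adjusted grade to the valid 0-100 range.
    return [max(0, min(100, g + shift)) for g in estimated_grades]

# Teacher estimates for a small cohort (human judgment, made first):
estimates = [72, 65, 80, 55]          # cohort mean is 68
adjusted = standardise(estimates, historical_mean=60)
print(adjusted)                        # each estimate shifted down by 8 marks
```

On this view, any “decision” about an individual grade is embedded in the human estimates before the algorithm runs; the algorithm applies the same arithmetic adjustment to everyone.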
I am not saying there is no breach of the GDPR; Ofqual’s algorithm is vulnerable to claims that the outcome of its processing is unfair. That, to my mind, is a better bet than making a complaint in respect of automated decision making.
Assume an automated decision
If we assume there has been an automated decision where the A.22 right applies, another problem surfaces. Does Ofqual’s automated decision need to be explicitly authorised in legislation? In the UK, I suspect the answer is again “NO” contrary to the GDPR requirements.
A.22(2)(b) permits an automated decision to be taken if it is “authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests” (my emphasis).
So what does “authorised by Member State law” mean? Does it mean that the “automated decision making” has to be explicitly specified in new legislation, or can it be authorised by existing legislation? If the latter, what legislation authorises Ofqual’s automated decision taking?
Recital 71 suggests that automated decisions “should be allowed where expressly authorised by … Member State law”. Existing law is unlikely to expressly authorise automated decision taking; however, existing law is likely to refer to general powers which can be interpreted as permitting the controller to take automated decisions if needed (e.g. for Ofqual’s functions).
Consider the recent case relating to the police’s use of automated facial recognition via CCTV (the Bridges case concerning South Wales Police). The Court of Appeal held that the police’s use of facial recognition was unlawful because the legal basis was too vague, with the result that there was a breach of the A.8 ECHR right (no law permitting the interference).
As a result, new legislation expressly authorising public bodies like the police to take automated decisions via facial recognition CCTV is likely to be enacted by Parliament. Such legislation would be an example of automated decision processing that is expressly authorised in legislation.
However, this still leaves open the question of whether automated decisions can be authorised implicitly by general powers in the UK.
Well, the Government has slipped in an extension to the automated decision taking provisions and, in the UK, automated decisions can be taken if “required or authorised by law” (see S.14(3)(b) of the DPA2018). In summary, I suspect the GDPR (thanks to Recital 71) expects “automated decision making” to be expressly specified in new legislation but the UK Government could well expect that such decisions can merely be “required” by existing legislation.
In other words, the UK’s DPA2018 could permit a major departure from the standard of protection associated with the A.22 data subject right as expressed in the GDPR.
Finally if new legislation is needed, don’t assume it will be scrutinised by Parliament in a meaningful way. This is because the legislation could take the form of a Statutory Instrument in order to minimise Parliamentary consideration.
Indeed, the Home Office has a worrying track record of using obscure Order making powers. For example, powers in telecommunications legislation of 1984 (enacted at the same time as the DPA1984) were used to allow bulk personal database acquisition by the national security agencies for three decades without any debate or scrutiny (until the Snowden revelations hit the fan; see references).
Exam scripts (Schedule 2, paragraph 25)
The exemption for exam scripts has been fashioned without regard to a CJEU decision in 2016 which has been ignored/overlooked by the Government; the choice of words (ignored or overlooked) I leave to you as it depends on how charitable you want to be.
The judgment in question is the case of Peter Nowak (Case C-434/16; see references). It determined, under Directive 95/46/EC, that the answers given by a candidate within an exam script constituted personal data of the candidate, as did the comments made on the script by an examiner with respect to a candidate’s answers.
The Court added that, as a result, the candidate (i.e. the data subject) could have the right of access to these personal data but not the right to rectification (e.g. when a candidate gave a wrong answer to a question, that answer could not be corrected!). However, the CJEU did state that the data subject rights did not extend to the questions set in the exam, as an exam question does not constitute personal data (paragraph 58 of the judgment).
However, the UK’s approach adopted in the exemption in paragraph 25(1) of Schedule 2 states that the right of access and the right to be informed “do not apply to personal data consisting of information recorded by candidates during an exam”. You can see now why meaningful information about the logic associated with any automated decision taking, which is part of the right to be informed, is exempt.
The ICO’s advice on exemptions, which states that “… candidates do not have the right to copies of their answers to the exam questions”, omits to state with the same forcefulness that there is no need to provide information about the processing as required by A.13 and A.14 (e.g. the logic associated with automated decision taking).
Now, the only official explanation for this exemption I have found appears in the documents the Government has published with respect to an adequacy assessment for the UK (see references). Document “E3 (Schedule 2 Restrictions)” states:
“…the provision [in paragraph 25] aims to protect the integrity of exams by ensuring that exam scripts cannot be accessed outside established processes. This is necessary to protect the confidentiality of the questions so that awarding organisations can re-use questions where appropriate. This means they are then able to build exam papers and conduct multiple assessments in the year, which is crucial for many professions.
The reuse of questions is widely acknowledged as being best practice for certain types of assessment and essential to the way in which standards are set and maintained in these assessments.”
So the exemption in paragraph 25 exists in order to protect exam questions (which are not even personal data following the CJEU decision in Nowak!).
The result is an over-engineered, exam-script exemption that has the potential for great detriment for data subjects as follows:
- A candidate can have access to the examiner’s comments about their script but not the script itself; completely bonkers in my view. It is interesting to note that when the European Commission intervened in Nowak it argued that the script was personal data but the examiner’s comments were not (i.e. the complete opposite of the position established by the exemption).
- Without access to the script, a candidate cannot see whether they have been unfairly assessed. How can a candidate make a case of unfair treatment if they cannot gain access to their script, the examiner’s comments and the marks?
- A candidate is not informed about the processing. How do you begin to exercise any data subject right if you don’t know what personal data are processed or the logic involved in automated processing?
- Any exemption authorised by Member State law under Article 23 has to be necessary and proportionate. The Government’s claim is that the exemption is necessary and proportionate in order to protect information about exam questions, which is not even personal data. Complete rubbish.
Concluding comments
I can see the need to defer access to personal data for a few months whilst an exam board is marking hundreds of thousands of scripts; however, an everlasting, absolute exemption from the right of access, or to information about the processing, is clearly over the top.
In other words, the exam script exemption is not necessary and joins the exemptions associated with immigration and confidential references as a significant diminution of data subject rights. The exemption has been implemented in the UK to protect information which is not even personal data. It is a clear breach of A.23 requirements.
The European Commission is assessing the adequacy of the UK’s data protection regime; one need not go further than this exam script exemption, and the UK’s approach to automated decision taking, to see that three data subject rights of the GDPR have been compromised.
Upcoming Data Protection Courses (in Autumn)
Obviously COVID19 has put a spanner in the training works, but the following courses are scheduled for the Autumn now lockdown is unlocked.
All courses lead to the relevant BCS qualification:
- Data Protection Practitioner: London, Starts Sept 22 (6 days)
- Data Protection Foundation: London, Oct 13-15 (3 days)
- Data Protection Practitioner: Edinburgh, Starts Nov 23 (5 days)
Full details on www.amberhawk.com or by emailing [email protected]
References
ICO exemptions document https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/exemptions/
Case number C-434/16 of Peter Nowak -v- Data Protection Commissioner (just google “C-434/16 Nowak” for a host of links)
Explanatory Framework for Adequacy Discussions - Section E3: Schedule 2 Restrictions; https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/872235/E3_-_Schedule_2_Restrictions.pdf
EC’s view on C-434/16. Scroll down to the table for 2016 https://ec.europa.eu/dgs/legal_service/submissions_cour_en.htm
Section 94 of the Telecommunications Act 1984: a warning from history. https://amberhawk.typepad.com/amberhawk/2015/11/section-94-of-the-telecommunications-act-1984-a-warning-from-history.html?