The Home Office’s (very truncated) consultation on a revised Code of Practice involving overt surveillance of public places (e.g. the use of facial recognition CCTV, Automatic Number Plate Recognition (ANPR)) ends in early September.
In summary, the draft Code contains too many general platitudes for my liking and is deficient on important detail. So much so that one wonders whether the Home Office is taking this public consultation seriously.
The two main deficiencies are: omission of key elements of the UK_GDPR as it applies to overt CCTV used in public place surveillance, and the suggestion that A.8 of the Human Rights Act can be set aside if needed.
This blog reviews the draft Code, which updates a dozen overt surveillance Principles, most of which overlap with data protection requirements.
The Draft Code of Practice
The draft Code only applies to a “surveillance camera system” (as defined by Section 29(6) of the Protection of Freedoms Act 2012) that is operated by a “relevant authority” (as listed in Section 33(2)). Relevant authorities are mainly local authorities, the police and any other body the Secretary of State nominates by Order (e.g. the Civil Nuclear Constabulary); these public authorities have to appoint a data protection officer (a point raised later).
There is no obligation for private sector CCTV systems to adhere to the Code, but organisations that are not “relevant authorities” are “encouraged to adopt this code and its guiding principles”.
Covert surveillance is also not covered by the draft Code; such covert surveillance is subject to the Codes produced under the Regulation of Investigatory Powers Act 2000 (RIPA). This means the draft Code is silent when an overt surveillance camera system has a covert functionality attached (e.g. when a City Centre CCTV system used for public safety includes a covert facial recognition functionality used for a law enforcement purpose; see references for examples).
Similarly, the RIPA Codes apply when there is covert functionality added to a private surveillance camera system (e.g. an overt CCTV system at a football club scanning the crowd at the turnstiles for safety reasons is also covertly being used on behalf of law enforcement).
This joint overt/covert functionality needs a far more intensive and detailed public debate than is on offer in this speedy consultation; for example, should overt camera systems monitoring the London Underground have a facial recognition functionality used covertly by law enforcement and other public bodies?
Who decides whether it is reasonable to commence this joint functionality? Who is monitored and why? When are the surveillance personal data deleted?
Can we rely, for example, on RIPA legislation enacted over two decades ago (when facial recognition functionality did not exist) to resolve these questions, or is specific legislation on surveillance camera systems needed? The previous Surveillance Camera Commissioner thinks it’s the latter.
The dangers are real: what could the Chinese authorities in Hong Kong do with a facial recognition CCTV system monitoring its public transport system? These worries explain why there are calls for facial recognition systems to be banned.
In summary, City Centre CCTV has to consider the draft Code whilst private shopping mall CCTV does not, even though the CCTV issues are the same. ANPR used by the police to monitor traffic is subject to the Code, but ANPR for a supermarket car-park is not.
I don’t think this differentiation is rational or indeed reassuring; there are too many gaps.
Article 8: can it be set aside?
Article 8(2) of the European Convention on Human Rights (given effect in the UK by the Human Rights Act 1998) describes when a public authority can interfere with the right to respect for private and family life. It states that:
“There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others”
Note two things:
(a) the interference has to be “in accordance with the law” (this explains why there are specific calls for legislation covering the use of overt surveillance camera systems);
(b) the purposes for which interference can be justified form an exhaustive list: “national security, public safety, economic well-being…” etc.
However, Principle 1 of the Code does not follow these two factors (“Use of a surveillance camera system must always be for a specified purpose which is in pursuit of a legitimate aim and necessary to meet an identified pressing need”). In the draft Code:
- the law which legitimises the interference by overt CCTV is NOT specified in the Code (the relevant law is specified but only with respect to covert surveillance).
- the purposes go beyond those specified in A.8(2) by the introduction of a non-exhaustive list. Paragraph 1.1 states that “Surveillance camera systems operating in public places must always have a clearly defined purpose or purposes … (which) … include national security, public safety, the economic well-being of the country …” etc.
It can be seen that the word “include” anticipates purposes NOT specified by A.8(2); the Code therefore anticipates a breach of A.8 if one is required.
Missing data protection issues
The draft Code has the following general omissions:
- that facial recognition functionality involves the processing of special category personal data, with all the attendant legislative baggage that this requires (e.g. identification of a Schedule 1 condition; the changes to the A.30 Record of Processing Activities (ROPA) introduced by Schedule 1, Part 4);
- that “third party service providers” are processors under the UK_GDPR; the detail of the controller-processor requirements in A.28 and A.29 is thereby omitted;
- that “where a system is jointly owned or jointly operated”, the bodies concerned are likely to be joint controllers (and the joint controller obligations in the UK_GDPR are missing);
- that CCTV images are likely to be stored on the Internet (i.e. possibly processed outside the UK), so the data protection issues associated with transfers of personal data need to be addressed.
Specific data protection omissions include:
- reference to the data sharing code of practice when a relevant authority discloses the product of CCTV surveillance to third parties;
- the fact that unauthorised obtaining or disclosure of CCTV images (e.g. posting on YouTube by a CCTV operator) can be an offence under the DPA2018;
- the requirement to consider data protection by design and by default when procuring new surveillance systems (e.g. to use pixelation when needed);
- the fact that a relevant authority will have a Data Protection Officer who can advise on DPIAs, CCTV procurement and operations etc;
- the fact that it is not just the right of access which is an issue (the rights to object, to restriction and other rights also apply);
- although certification against quality management criteria is mentioned, the Code fails to mention certification against the data protection criteria being rolled out by the ICO.
Now, I don’t want to go on and on, though I could. I think you follow the drift of the above: data protection in this draft Code has simply gone AWOL. I should add that in June (well before this consultation) the ICO published an opinion on the use of facial recognition technology in public places, which the consultation ignores.
Concluding comment
Section 128 of the DPA2018 allows the Secretary of State to require the ICO to produce a statutory code of practice. I suggest that the Home Office draft Code (which does not relate to most CCTV camera systems) is ditched in favour of a comprehensive Code of Practice to be produced by the ICO. At least with the ICO delivering a Code, it will include data protection – unlike the current Home Office offering.
In addition, the ICO has to be consulted about this Home Office Code; hopefully she will recommend either that her office produces a comprehensive CCTV Code or that fresh legislation is needed to cover the use of overt and covert surveillance camera systems.
Data Protection Practitioner Course (September)
Because of the Indian variant and the continuing COVID “pingdemic” uncertainty, the course can be attended in person, via Zoom, or as a mixture if you get pinged (it's up to you). The Data Protection Practitioner Course is in London and starts on Tuesday 7 September (6 days).
Full details on www.amberhawk.com/StandardDP.asp or by emailing info@amberhawk.com
References
Published 13 August 2021; “This consultation closes at 11:45pm on 8 September 2021”: https://www.gov.uk/government/consultations/surveillance-camera-code-of-practice
My blog on overt TfL congestion charge cameras being also used for covert national security purposes: https://amberhawk.typepad.com/amberhawk/2014/01/data-protection-day-home-secretary-signs-a-national-security-certificate-to-permit-the-unacceptable.html
A similar use of overt/covert CCTV was tried in Birmingham: https://www.theguardian.com/uk/2010/aug/27/birmingham-police-inquiry-muslim-cctv
The ICO's opinion "The use of live facial recognition technology in public places" (dated 18 June 2021) is on the ICO website.