Other Information Law

Facial recognition CCTV excluded from new data protection law by definition of “personal data”

I have come to the conclusion that the new definition of personal data in the Data Protection and Digital Information No.2 Bill (“No.2 Bill”) only applies to facial recognition CCTV if the data subject is on a watch-list.

If the individual is not on a watch-list, and the camera images are deleted immediately after checking the watch-list, then personal data are not processed and there are no data protection obligations (e.g. no transparency).

It then follows that if the watch-list, for example, involves details of criminals wanted by the police, many facial recognition systems will process personal data in total secrecy (i.e. no transparency).

This blog explains how the No.2 Bill’s definition of personal data moves such facial recognition CCTV systems into the twilight zone.

Suppose a facial recognition CCTV system is linked to a watch-list of suspects whose facial features have already been digitised (perhaps using the digitised biometric images derived from passport photographs).

Obvious examples of such watch-lists are: (a) a list of those who are of interest for law enforcement purposes or (b) those who commit a crime if they enter a specific area (e.g. those who have been excluded from entering a specific shop).

Suppose further that such a CCTV system is set up in a shopping centre and performs facial recognition on everybody who passes any of the system’s cameras.  The system sends a warning message to the camera operator if someone on the watch-list passes a camera; the system also retains the relevant images and other details of the encounter.  Only personal data of those on the watch-list are retained.

The system is so designed that the details of those who are NOT on the watch list are not retained (i.e. deleted instantaneously) after the check against the watch-list is complete.
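The retention logic described above can be sketched in a few lines of Python.  This is a purely illustrative sketch: the function names, the hash-based “template”, and the watch-list contents are all hypothetical stand-ins for real biometric matching.

```python
import hashlib

def extract_features(image: bytes) -> str:
    """Stand-in for biometric feature extraction (a real system would
    derive a face template; a hash suffices for illustration)."""
    return hashlib.sha256(image).hexdigest()

# Watch-list built from pre-digitised images (e.g. passport photographs).
SUSPECT_PHOTO = b"digitised-passport-photo-of-suspect"
WATCH_LIST = {extract_features(SUSPECT_PHOTO): "wanted by police"}

def process_frame(image: bytes):
    """Check a captured frame against the watch-list.

    On a match, the encounter is retained and an alert returned;
    otherwise the image and its template are discarded immediately
    and nothing is retained.
    """
    template = extract_features(image)
    reason = WATCH_LIST.get(template)
    if reason is not None:
        # Watch-list hit: retain the image and details, alert the operator.
        return {"alert": True, "reason": reason, "image": image}
    # Not on the watch-list: delete instantaneously, retain nothing.
    del image, template
    return None
```

The design choice at issue in this blog is the final branch: for anyone not on the watch-list, nothing survives the check.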

Bridges v South Wales Police

The system described above is the one that featured in the Court of Appeal judgment (Bridges v South Wales Police: paragraph 9, see references).  In this case, the data protection regime (in the DPA1998 and the pre-Brexit GDPR/DPA2018) was fully engaged because the Court determined that “personal data” were being processed about every individual whose image was captured by a facial recognition CCTV camera.

In the Bridges case, the Court’s interpretation of the definition of personal data was crucial.  This was derived by reference to Recital 26 of the pre-Brexit GDPR (or to Recital 26 of Directive 95/46/EC when the DPA1998 applied). These Recitals specified that if the information processed related to a living individual who was being singled out, then “personal data” were being processed.

Paragraph 46 of the Bridges judgment makes this point clear.  It states:

  46. The primary point of dispute before the Divisional Court under the DPA 1998 was the extent to which using AFR Locate [the name used for the Facial Recognition CCTV system] entails processing personal data, SWP [South Wales Police] contending that the only personal data processed is the data of persons on the watchlist on the ground that it is only those persons that SWP can identify by name.

  Having referred to the judgment of the Court of Appeal in Vidal-Hall v Google Inc [2015] EWCA Civ 311, [2016] QB 1003, and to the decision of the CJEU in Case C-212/13 Rynes v Urad [2015] 1 WLR 2607, the Divisional Court concluded (at [122]) that the processing of the Appellant’s image by the AFR Locate equipment was processing his personal data because the information recorded by AFR Locate individuated him from all others, that is to say it singled him out and distinguished him from all others.  (My emphasis).

As we shall see, “singling out” is NOT found in the new statutory definition of “personal data” in the No.2 Bill, and it is this omission that has significant consequences (i.e. SWP’s arguments in the first part of paragraph 46 as quoted above become correct).

Recitals are not law

The definition of personal data in the No.2 Bill is at first sight unchanged: “Personal data” means any information relating to an identified or identifiable living individual.  However, the No.2 Bill introduces the concept of direct or indirect identification of the individual into the statutory definition.

This new statutory definition, which also does not meet the standard set in 1981 by Council of Europe Convention No 108 (see references), ensures that the interpretation in Recital 26 (as used in Bridges) is largely redundant.

This is because the Recitals themselves have no legal effect in the UK.  This was established in the case of Sanso Rondon v LexisNexis (see references), when the Court invalidated a requirement in Recital 80 (namely “the designated representative [of a controller or processor established outside the UK (or EU)] should be subject to enforcement proceedings in the event of non-compliance by the controller or processor”).

The Court came to this conclusion because the requirement was found only in Recital 80 and was not explicitly linked to any obligation in any Article of the GDPR.  That is why the Court concluded that, in relation to the liability of the representative for non-compliance (e.g. by a controller):

“…if the GDPR had intended to achieve 'representative liability' [for non-compliance as specified in Recital 80] then it would necessarily have said so more clearly in its operative provisions [i.e. the Articles of the GDPR]…” (paragraph 101).

In other words, the Recitals are there to help interpret the legislation if there is doubt as to what the statutory requirements in an Article mean.  If a provision described in a Recital is not echoed in a relevant Article, then the interpretation in the Recital can be invalid.

In summary, the requirement for “singling out”, although it remains in Recital 26, is not supported by any “operative provisions” found in the new statutory definition of “personal data”.  On the contrary, the new “operative provisions” of the definition work to exclude any prospect of “singling out”.

DPDI definition of “personal data”

The No.2 Bill additionally requires that to be “personal data”, there has to be a living individual who is directly or indirectly identifiable; in the latter case, indirect identification has to be at the time of the processing and by using reasonable means.

This raises three additional questions to be answered before the processing of “personal data” by a CCTV system, similar to that under discussion, can be confirmed.

The first question is:  ‘Are individuals identified “directly”?’.  The No.2 Bill requires that “An individual is identifiable from information “directly” if the individual can be identified without the use of additional information”.

Clearly, in the case of facial recognition CCTV, the answer is NO.  This is because additional information has to be used to perform the identification of those images captured by a facial recognition CCTV system (e.g. the pre-recorded details of those on the watch-list).

In other words, direct identification is not relevant because identification of a living individual (on or off the watch-list) requires "the use of additional information" (contrary to the definition of direct identification).

The next question is: ‘Are individuals identified “indirectly” as defined?’.  The No.2 Bill states that “An individual is identifiable from information “indirectly” if the individual can be identified only with the use of additional information”.

Clearly, the answer is YES as additional information is available as part of the CCTV system to identify those individuals on the watch-list.

But what about those NOT on the watch-list?

This question is answered by the third question in the series: ‘Is indirect identification easy to do “at the time of the processing” and by “using reasonable means”?’  (Clause 3A(2) requires that “the living individual is identifiable .… by the controller or processor by reasonable means at the time of the processing”).

In this case, for those not on the watch-list, there are no reasonable means of achieving identification available at the time of the processing (i.e. the interval from capture of an image by a camera, through the check against the watch-list, to its instantaneous deletion).

There are three main reasons for this conclusion:

  • first, there is no “additional information” available by “reasonable means… at the time of the processing”, either in the CCTV system itself or held by some other person;
  • second, the image of the individual and the related biometrics have been deleted almost instantaneously, so there is nothing left against which additional information could be used to perform an unwanted indirect identification;
  • finally, it would be unreasonable to require the facial recognition system to retain details of individuals not on the watch-list. The far more reasonable proposition is to protect the interests of such individuals by deleting their personal details instantaneously, as such deletion guarantees their privacy.  In addition, the captured details cannot then be further processed for purposes other than the original check of “is this individual on the watch-list?”.
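The three-question test just described can be expressed as a short decision sketch.  This is an illustration of my reading of Clause 3A only: the function and parameter names are invented for this blog, and nothing here is legal advice.

```python
def is_personal_data(directly_identifiable: bool,
                     additional_info_available: bool,
                     reasonable_means_at_time: bool) -> bool:
    """Illustrative reading of the No.2 Bill's definition of personal data."""
    # Q1: direct identification needs no additional information.
    if directly_identifiable:
        return True
    # Q2 + Q3: indirect identification requires additional information
    # AND reasonable means of identification at the time of the processing.
    return additional_info_available and reasonable_means_at_time

# Facial recognition always uses additional information, so Q1 is always No.
on_watch_list = is_personal_data(False, True, True)        # True
not_on_watch_list = is_personal_data(False, False, False)  # False
```

On this reading, only the watch-list case satisfies the definition; for everyone else all three questions fail.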

In addition, Article 11 of the UK_GDPR specifically states that additional information is not required to be collected.  It says:

A.11(1) “If the purposes for which a controller processes personal data do not … require the identification of a data subject by the controller, the controller shall not be obliged to maintain, acquire or process additional information in order to identify the data subject for the sole purpose of complying with this Regulation.”  (My emphasis to show that this is the case when an individual is not on the watch-list.)

Hence, there are no data protection obligations placed on data concerning those whose images are collected and then instantaneously deleted because the individual is NOT on the watch-list.  Quite simply, such a CCTV system does not process personal data (as newly defined) in these circumstances.

The data protection regime of the No.2 Bill only applies to those on the watch list.

Why secret CCTV facial recognition systems?

As a facial recognition system is not processing personal data if an individual is NOT on the watch-list, there is no need to be transparent, as the UK_GDPR does not apply.  There is no need for a lawful basis, nor for a condition permitting the processing of Special Category Personal Data, as the information processed is not personal data.

Transparency is only needed for data subjects on a watch-list.  However, there are a number of exemptions in the DPA2018 that relieve the controller of the need to be transparent if transparency would prejudice crime prevention, taxation, immigration or law enforcement purposes etc. (e.g. Schedule 2, paragraphs 2 or 4; Section 44 of the DPA2018).  Hence there is no need to be transparent in these cases either.

It follows that many facial recognition CCTV systems can be installed without any transparency obligations and used more or less in secret.  As stated at the beginning of the blog, they enter the data protection twilight zone.

As this is the case, expect the number of facial recognition systems to mushroom after the enactment of the No.2 Bill.


Edward Bridges v The Chief Constable of South Wales Police: Neutral Citation Number: [2020] EWCA Civ 1058

Sanso Rondon v LexisNexis; Neutral Citation Number: [2021] EWHC 1427 (QB)

Definition of “personal data” in DPDI No 2 Bill results in non-compliance with CoE Convention No.108: https://amberhawk.typepad.com/amberhawk/2023/04/definition-of-personal-data-in-dpdi-no-2-bill-results-in-non-compliance-with-coe-convention-no108.html

Data Protection Courses (Summer 2023)

An all-day Zoom workshop (10.00-4.30) on the Data Protection and Digital Information No 2 Bill will be held on Thursday 13 July 2023, hopefully to include changes made during the Committee stage of the Bill.  The fee will be £250+VAT.  Email [email protected] for the workshop agenda or to reserve a place on this session.

The following BCS Practitioner or Foundation courses can be attended in person, or via Zoom, or as a mixture (i.e. part Zoom, part attendance just in case “stuff happens on the day”).

  • The next Data Protection FOUNDATION Course is on Zoom only: June 20-22, 2023 (Tuesday to Thursday, 3 days: 9.45am to 5.00pm).
  • The next Data Protection PRACTITIONER Course is in London on Monday, 24 July 2023 to Friday, 28 July 2023 (5 days: 9.30am to 5.30pm).

Full details are on the new Amberhawk website (www.amberhawk.com) or can be obtained by emailing [email protected].

All materials on this website are the copyright of Amberhawk Training Limited, except where otherwise stated. If you want to use the information on the blog, all we ask is that you do so in an attributable manner.