This is the second instalment of my list of nineteen areas where the Data Use and Access Bill (DUAB) provisions act to the detriment of data subjects.
In Part 1 (blog of 25 Feb 2025), I listed a summary of the issues that would cause problems with renewing the Adequacy Agreement with the European Commission. This blog gives more details of the problems associated with the regulator (the ICO and the replacement Information Commission).
The summary points concerning DUAB are:
- the justification for Third Party Marketing via the use of the legitimate interests lawful basis risks creating a spammers’ charter;
- the Digital Verification Services (Part 2 of DUAB) take no account of the Nine Identity Assurance Principles developed by a committee of privacy experts established by the Cabinet Office;
- the Equality and Human Rights Commission does not have to apply economic considerations before using its enforcement powers. As data protection is integrally linked to the human rights regime (e.g. A.8 ECHR), the equivalent economic considerations imposed on the ICO should be removed;
- the proposed Information Commission can remain beholden to the control of the Secretary of State; DUAB makes it worse because all voting members of the Commission can be chosen by the SoS (perhaps for obedience);
- to ensure balance, the proposed Information Commission should contain members tasked with looking out for the interests of data subjects in its decision-making processes; this includes the approval of Codes of Practice or Codes of Conduct;
- the current ICO’s policy of minimal enforcement raises the question of whether:
- there should be an offence associated with deliberate breach of the UK_GDPR provisions by controller or processor;
- the Secretary of State should use existing powers to enable privacy NGOs (e.g. Open Rights Group, Liberty) to act in the interests of data subjects without a data subject mandate; and
- data subjects should be able to appeal to the Information Tribunal, in appropriate cases, where the ICO has failed to act in the data subjects’ interests.
Spammers’ charter legitimised
With respect to the processing of personal data for marketing, direct marketing on behalf of a controller or a Third Party should not be legitimised in terms of legitimate interests in all circumstances. DUAB overturns a 35-year-old data protection standard that Third Party marketing requires consent (i.e. a standard in place since the DP Registrar v Linguaphone Tribunal decision under the DPA1984).
See the Appendix on page 10 of my submission to the Bill Committee (see references), which simply shows an extract of the ICO’s guidance on direct marketing and the data protection standards that prevailed under the DPA1998 (and which are being overturned by DUAB).
Many company email addresses for staff are of the form [email protected]. These can be scraped off the Internet and then used indiscriminately for marketing spam via the “legitimate interests” provisions in DUAB (new Article 6, paragraph 11(a)). This use of “legitimate interests” for Third Party marketing thus risks creating a spammers’ charter.
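To illustrate how little effort such harvesting takes, the short Python sketch below pattern-matches name-based staff addresses on a public web page. It is purely illustrative and is not drawn from the Bill or from ICO guidance; the page URL and the name-pattern regular expression are assumptions for the example.

```python
# Minimal sketch: harvesting "firstname.surname@company" style addresses
# from a single public web page. Illustrative only; the URL below is
# hypothetical and the regex is an assumed pattern for name-based addresses.
import re
import urllib.request

# Matches addresses of the form firstname.surname@domain.tld
EMAIL_PATTERN = re.compile(
    r"\b[a-z]+\.[a-z]+@[a-z0-9.-]+\.[a-z]{2,}\b", re.IGNORECASE
)

def scrape_addresses(url: str) -> set[str]:
    """Return the set of name-based email addresses found on one page."""
    with urllib.request.urlopen(url) as response:
        page = response.read().decode("utf-8", errors="ignore")
    return set(EMAIL_PATTERN.findall(page))

if __name__ == "__main__":
    # Hypothetical staff-contact page; any public "meet the team" page would do.
    found = scrape_addresses("https://example.com/staff-contacts")
    print(f"{len(found)} marketable addresses harvested")
```

The point is not the code but the scale: once such addresses are harvested, nothing in a “legitimate interests” lawful basis for Third Party marketing distinguishes this kind of indiscriminate collection from responsible list-building.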
Further details, which relate to the similar provisions in the previous DPDI Bill, can be found on my blog: DPDI No.2 Bill dumps all data subject consent requirements for Third Party marketing (Hawktalk; May 2023).
Action: Article 6, paragraph 11(a) (Clause 70) should be amended to exclude Third Party marketing and to limit the use of legitimate interests to exceptional circumstances where the established DPA1998 consent standard for such marketing is impractical.
Digital identity assurance principles ignored
As background to the Government’s Digital Verification Services proposals in Part 2 of DUAB, I refer to the “Nine Identity Assurance Principles” that were published in 2015 for inclusion in any future Governmental digital identity project. The Principles were produced by a specialist group of privacy experts established by the Cabinet Office (the Privacy and Consumer Advisory Group: PCAG), which included the author; the Principles are still available on the Government website (see references below).
These Principles were originally produced to avoid a repeat of the ID Card debacle a decade earlier; the Government asked a number of privacy experts (including the ICO) to debate and draft a set of objectives so that national ID could become acceptable on data protection grounds.
As a result, these Principles emerged to provide a benchmark for any digital identity scheme: they allow one to identify which Principle is not being considered and the consequences of that lack of consideration. In DUAB, there is no evidence that the Principles have been considered.
It is interesting to note that the current successor committee to PCAG (OLIPAG) was disbanded by DSIT officials in February 2025 as part of the merger of Cabinet Office responsibilities. There were many internal OLIPAG debates concerning One Login, the Government’s authentication and identity verification scheme for millions of users.
Further details can be found in: Government’s digital identity proposals ignore obvious privacy concerns (Hawktalk; February 2023).
Action: The ICO should be required, as part of the Annual Report, to include a section on how well Part 2 of DUAB is performing, including how well the Digital Verification Services (and One Login in particular) comply with the Nine Identity Assurance Principles. There should be explicit reference to these Principles in Part 2 of DUAB to reassure the public; they could, for instance, be referenced as a Code of Practice under the auspices of the ICO.
SOS! Too much power for SoS
Schedule 12 of DUAB places too much power in the hands of the SoS to influence regulatory priorities. In addition, the Information Commission should report to Parliament, not to a Government Department (DSIT). Two previous Select Committees recommended that the Information Commissioner become directly responsible to, and be funded by, Parliament in order to protect the independence of the role (see paragraph 10 of the Justice Committee’s Third Report, Session 2008-09, printed 3 February 2009).
Action: The question of whether the ICO should report to Parliament should be raised again, perhaps as part of a “should Schedule 14 of DUAB form part of the Bill” debate.
Strip the ICO of economic considerations
Data Protection law is integrally linked to the Human Rights regime via Article 8 ECHR (and to some extent Article 10 ECHR). The ICO is identified in the Economic Growth (Regulatory Functions) Order 2017 as having to apply economic considerations when deciding whether to enforce data protection law, yet the Equality and Human Rights Commission has no such considerations in respect of Articles 8 and 10.
The inclusion of economic considerations is likely to lead to inconsistent enforcement by the ICO of Articles 5, 6, 9 and 23 and Schedule 1, which are linked to A.8 ECHR (i.e. any provision in the UK_GDPR or DPA2018 that uses the word “necessary” or “proportionate”). The inclusion of economic factors thus makes the ICO an unreliable custodian of A.8 rights.
There has been no Parliamentary debate on the application of the Economic Growth Order to the ICO (see the middle of “Ministers want to pull the strings and rein-in the ICO’s independence”; Hawktalk, November 2021).
Action: Modify the list of regulators in the Economic Growth (Regulatory Functions) Order 2017 to exclude the ICO, and allow data subjects to complain directly to the Equality and Human Rights Commission, instead of the ICO, concerning breaches of Article 8 ECHR.
New Commission is not independent
All the voting members of the new Information Commission are appointed by the Secretary of State (SoS). This gives too much control to the SoS who can fashion a Commission to suit the Government’s objectives. There should be more independence in the appointment mechanism; some appointees, for example, could be from a privacy NGO tasked with looking after the interests of data subjects.
The same goes for Ministerial decisions about the content of Codes of Practice that Ministers draft and which impact on their Departmental responsibilities; for example, Codes under the Digital Economy Act 2017. Every Minister heads a Government Department which is also a large controller; this in turn means Ministers have a vested interest in the outcome of the processing of personal data. In other words, there is an inherent risk that a Code favours Departmental processing interests.
To maintain balance between opposing objectives, all Codes of Practice or Codes of Conduct specified in the DPA2018 (or parts of Codes) relating to data protection compliance should be independently approved by the new Information Commission. Having members of the Commission tasked with looking after the interests of data subjects also reduces the risk of inherent bias.
For further detail, see my comments on the same provisions that appeared in the DPDI Bill: “Cronyism at the Information Commission can undermine its regulatory-independence” (Hawktalk; October 2023).
Action: at the very least, ensure that one Commission appointee has the specific responsibility of protecting the interests of data subjects. This idea originates from the DPA1984 when one member of the Tribunal looked after the interests of data subjects and one the interests of controllers.
Do we need a new Offence?
The current Commissioner is reluctant to use his fining powers. This creates a problem with respect to enforcement: if data protection law is not enforced, controllers are encouraged to ignore its provisions. The ICO is clearly of the view that fines do not work (e.g. in the public sector) and, if that view is correct, an alternative enforcement mechanism is needed.
In short, the Commissioner has not followed what the law, as enacted by Parliament, requires him to do. Parliament has not been informed that there is a problem in the law and the ICO has taken it upon himself to decide that not fining public bodies is the correct public policy.
Ministers are content to let this sleeping dog lie. This is because they are politically responsible for their Department’s processing as a large controller; the ICO’s policy means they face reduced risks from non-compliance.
My own view is that it is not the role of the ICO to determine important public policy concerning the privacy of 60 million UK data subjects in such an arbitrary way.
Parliament should thus consider whether, as an alternative, the deliberate misuse of personal data contrary to the data protection rules should be a criminal offence (subject to the Proceeds of Crime Act 2002, so that large multi-million-pound profits from unlawful processing of personal data can be recovered). Such recovery could provide an alternative to the current fining arrangements, which can involve fines of over £1 million.
This can be achieved quickly by reverting to the position under the DPA1998 and DPA1984, where non-compliance with an Enforcement Notice was an offence committed by the controller; the DPA2018 instead makes such a breach subject to a fine (which is unlikely to be actioned by the current ICO). The trigger would be the ICO serving an Enforcement Notice and the controller not complying with its terms.
A controller-related offence creates some equity. For instance, employees who deliberately set out to flout the controller’s data protection procedures commit an offence (see section 170 DPA2018); controllers, by contrast, do not commit an offence if they deliberately set out to flout the data protection rules (e.g. the Principles and Rights).
Action: Require the Government to hold a swift public consultation on the best way for the ICO to enforce the data protection regime.
Allow NGOs to pick up the slack
As the ICO is unwilling to use his enforcement powers, there is a serious risk that controllers will not take their data protection responsibilities seriously. NGOs should be freed to pick up some of the slack when, as usually happens, the ICO decides not to act (e.g. on complaints).
This can be achieved using powers under Article 80(2) of the UK_GDPR, whereby “[the Secretary of State] may provide that any body, organisation or association [e.g. a privacy NGO] …, independently of a data subject's mandate, has the right to lodge a complaint with the Commissioner and to exercise the rights referred to in Articles 78 and 79 if it considers that the rights of a data subject under this Regulation have been infringed as a result of the processing” (my emphasis).
Action: To compensate for ICO inaction, the Committee can require the Government to allow Privacy NGOs to take independent action to protect data subjects.
Data subject appeal to Tribunal needed
There should be a mechanism whereby data subjects can use the Tribunal system when there is a failure of the Commissioner to act on a complaint. This is especially the case when the ICO, as a matter of policy, has decided to minimise enforcement of the data protection regime.
There is ample evidence that the ICO has made mistakes in his decision-making process and has failed to take action; in some cases he has written to data subjects telling them to apply to the Courts for relief. The Tribunal cannot review the ICO’s decisions not to act, so the only recourse for the data subject is Judicial Review of the ICO’s decision. In relation to rights, data subjects can apply to the Courts for a Compliance Order. Both are costly processes for all concerned.
Under the FOI regime, an ICO decision not to require the release of requested information can be challenged at the Tribunal. Under data protection, the ICO’s failure to require a controller to provide personal data in response to a subject access request cannot be challenged, yet the controller can appeal to the Tribunal should the ICO order disclosure of personal data against the controller’s wishes.
Additionally, the ICO prefers to enforce the UK_GDPR via “reprimands” of controllers. There is no appeal against a reprimand apart from Judicial Review, and to some extent the lack of an appeal is unfair to controllers, who are often faced with the attendant bad publicity.
The Tribunal appeal system is unfair to data subjects and possibly controllers.
Action: there should be a data subject appeal to the Tribunal when the ICO fails to act. This would have to be restricted to reduce the risk of meritless appeals from disgruntled data subjects; the data subject’s grounds of appeal would have to, for example, present a data protection issue of public interest that needs resolution. If the appeal succeeded, the Tribunal could require the ICO to review its failure to act.
Special category personal data at risk
The Ministerial powers to add special categories of personal data to the list in A.9(1) UK_GDPR might actually permit (rather than prohibit) the processing of such special category data (i.e. the provision might not be the safeguard touted by the Government). The provision could allow the SoS to introduce sub-classifications of existing special categories of personal data linked to a particular context or purpose (e.g. on the lines of the existing “biometric data for the purpose of uniquely identifying a natural person”; see A.9(1) UK_GDPR).
For example, a class of special category personal data such as “cancer records in the context of training AI algorithms” would make it easier to process such personal data in accordance with one of the conditions that lifts the prohibition. Thus, far from prohibiting the processing of new special categories of personal data, the power to add context to existing special categories could expand the processing of those categories without the explicit consent of the data subject (the option in A.9(2)(a)).
Action: this point should be explored when the Committee considers whether the relevant clause should stand as part of the Bill.
References
Justice Committee Third Report: The work of the Information Commissioner (Session 2008-09) https://publications.parliament.uk/pa/cm200809/cmselect/cmjust/146/14602.htm
The Nine Identity Assurance Principles: PCAG, “Identity assurance principles for building identity services in government” (available on the Government website).
My evidence to the House of Commons Bill Committee (see the bottom of the Part 1 blog at https://amberhawk.typepad.com/amberhawk/2025/02/data-bills-problems-exposed-as-government-rush-duab-through-parliament-part-1.html).
Spring Data Protection Courses
Amberhawk is holding its first session on the changes to the UK’s data protection regime arising from the Data (Use and Access) Bill, by Zoom, on Tuesday March 25 (10.00am-4.00pm; £275+VAT).
The following courses, which follow the BCS Practitioner or Foundation syllabi on the DPA2018/UK_GDPR, can be attended in person, via Zoom, or as a mixture (i.e. part Zoom, part attendance, just in case “stuff happens on the day”).
- Data Protection FOUNDATION Course: London on March 11-13 (Tuesday to Thursday, 3 days: 9.45am to 5.00pm).
- Data Protection PRACTITIONER Course: London on March 17-21 (Every day this week: 9.30am to 5.30pm).
More details on the Amberhawk website: www.amberhawk.com or email [email protected].