I am asking readers to sign my Petition on the Parliamentary web-site (see end of this blog); most of the blog's text explains why you should sign.
In summary, the Petition states that the Government would be negligent if it failed to draft clauses for the Data Protection and Digital Information Bill (“DPDI Bill”) to protect data subjects from the harmful impact of Artificial Intelligence (AI).
I have suggested that these AI clauses should be aligned with the EU Data/AI Acts (both of them apply; see references) and added to the DPDI Bill at Report stage in the House of Lords. This will preserve the ability of the UK’s AI businesses to develop services that meet the requirements of the only available, recognised, international legal standard that applies to AI.
If the Government misses this legislative boat, any AI legislation in the UK will be delayed until after the next General Election (and that assumes an incoming Government sees AI legislation as a major priority).
This, in turn, means the earliest that UK legislation is likely to be enacted is 2025, coming into force by 2027. By contrast, the EU Data Act becomes law from September 2025, with the AI Act soon after.
The current position of the UK Government is for an “AI Safety Institute” to make recommendations to Government. January’s official announcement of the Institute describes its functions as follows.
“The Institute is not a regulator and will not determine government regulation. It will collaborate with existing organisations within government, academia, civil society, and the private sector to avoid duplication, ensuring that activity is both informing and complementing the UK’s regulatory approach…”
I am sure such collaboration and academic debate will be very worthy. Sadly, however, the urgent requirement is for legislation governing the risks arising from currently unregulated AI, not for a “talking shop”.
Diminished protection from AI abuse
The EU Data/AI Acts maintain the EU_GDPR (or the current UK_GDPR) as setting the standard of privacy protection for data subjects.
This is unlike the DPDI Bill which weakens that privacy protection in the following areas relevant to AI processing: personal data, lawful processing, scientific research, automated decision taking and the Information Commission.
The problems associated with each of these areas are summarised below with reference to longer blog items.
Definition of “personal data”
The DPDI Bill uses a definition of “personal data” that does not meet the standards set by the DPA1984, the Council of Europe Convention No 108 or the GDPR itself. In other words, what is personal data in France might not be personal data in the UK.
Such a reduction in the reach of “what is personal data?” has the potential to have a significant impact on how the Principles, obligations and rights of the UK_GDPR apply in practice. How can one assess, for instance, whether an AI algorithm processes personal data “fairly”, if the meaning of personal data in the UK differs from that used in the rest of Europe?
The UK has been here before. The defective Durant decision in the Court of Appeal in 2004 concerning “what is personal data?” caused a decade of confusion in the UK; just Google “Durant decision data protection” to find that out.
This judgment caused the European Commission to start infraction proceedings against the UK, and the issue is still so sensitive that information associated with these proceedings is still not accessible by FOI (I tried 2 years ago).
This continuing refusal to release information about a 20 year-old historical event helps explain why the risks to the Adequacy Agreement between the EU and UK, arising out of the DPDI Bill’s inadequate definition of personal data, are very real.
A complete description showing why the UK’s definition of personal data does not meet European standards (or the 40-year-old standard set by the DPA1984) can be found in the Hawktalk blogs dated 4 Aug 2022 and 13 Apr 2023.
Lawful and compatible processing
The DPDI gives the Secretary of State powers to:
- define certain AI processing as lawful by changing Annex 1 of DPDI;
- ensure that such AI training and algorithm development is compatible with the original purpose of the processing by changing Annex 2 of DPDI; and
- provide powers to introduce exemptions based on economic grounds to protect AI training and development from interference by the application of data subject rights (e.g. the right to be informed: see powers in A.23(1)(e) of the UK_GDPR).
These powers are extremely broad. They can be used to ensure that any Minister’s pet processing project, in future, is defined as lawful, compatible with the purpose of obtaining and exempt from transparency obligations. They can also be used to ensure that AI economic priorities prevail over measures that protect privacy.
I should add that the House of Lords Committee on Delegated Powers (HL Paper 60; 14 February 2024) has called for these powers to be removed from the Bill because they are so general that they are “capable of being exercised more broadly to give effect to changes in policy”.
Details of these data sharing powers are in my blogs dated 18 and 31 August 2022.
Secrecy and “scientific research”
The DPDI Bill defines scientific research to include AI development and testing. Additionally, there is a new exemption from transparency if certain (dodgy) “appropriate safeguards” apply, and any further processing of personal data for such scientific research is deemed to be compatible with the purpose of obtaining, again if these safeguards apply.
For example, one safeguard proffered by the DPDI Bill is that the processing is unlikely to cause substantial damage or substantial distress to any data subject. This merely means that any moderate damage or moderate distress to data subjects (i.e. anything short of substantial), is acceptable as far as this Government is concerned.
The DPDI Bill anticipates that much “scientific research” (i.e. AI development) will be made lawful in terms of the “legitimate interests” of the controller or a Third Party. This means that large databases can be lawfully used by controllers (or disclosed to Third Parties) for AI development purposes.
Additionally, if the appropriate safeguards apply, the exemption from transparency means that existing data subjects described in the database are not informed about this further AI use or Third Party disclosure.
Such secrecy also applies to any onward data sharing of databases to any Third Party AI development, subject to the “appropriate safeguards” applying.
The Government says this liberal exchange of personal data is an example of it allowing personal data to become the oil that fuels economic dynamism. Others may view it differently: the Government is transporting this fuel in oil-tankers such as the Torrey Canyon or the Exxon Valdez.
Details of the risks from data sharing for scientific research are in my Hawktalk blog dated 31 Oct 2023.
AI automated decision taking
The DPDI Bill reverses A.22 and the right not to be subject to automated decision taking or profiling. Instead of permitting automated decisions only in a few exceptional cases (e.g. where authorised by law), the DPDI permits automated decision taking in general, subject to a few exceptions (e.g. those involving special category personal data).
In other words, the DPDI Bill opens the door to untrammelled automated decision taking using AI techniques on UK’s data subjects.
There are safeguards, but these all apply after the automated decision has been taken and its significant effect on the data subject has already occurred. There is nothing in the DPDI Bill that provides safeguards before that automated decision is taken.
I think this position tells you something about the Government’s priorities here.
Independent Regulator
The DPDI Bill reduces the independence of the Regulator (e.g. when enforcing AI best practice).
The Secretary of State appoints all the voting non-executive members to the new Information Commission and sets the Commission’s strategic direction which includes economic, law enforcement and national security priorities.
Note that all these priorities set by the SoS are likely to apply to the Government’s future exploitation of AI techniques.
For instance, the DWP is seeking powers to demand bulk personal databases from banks for its own AI purposes, to identify those associated with any erroneous benefit claim or pension payment – which of course means processing personal data relating to millions of pensioners, claimants and those related to them.
The DPDI Bill therefore reduces confidence in the independence of the Regulator when enforcing the data protection rules in the context of AI. Details are in my Hawktalk blog dated 19 Oct 2023.
Common theme
The common theme in all the above is that the DPDI Bill allows AI development practices to proceed in the presence of deliberately weakened legislative protection for data subjects.
Given that the large USA high-tech companies are major players in the AI industry, and given that these companies do not have a proud tradition of privacy protection, the Government’s policy amounts to “a wing and a prayer” that its “wild west” approach to the free flow of personal data for AI purposes will not result in significant misuse.
In short, the Government has knowingly and recklessly abandoned the Precautionary Principle in the governance of AI.
That is why I have tabled this Petition; please consider signing it (and pass it to your colleagues to consider signing). The first target is 10,000 signatures.
The Petition
The text is as follows:
Amend the DPDI Bill to protect us from harmful Artificial Intelligence (AI)
The Data Protection and Digital Information Bill (now in the Lords) is a vehicle that allows the Government to legislate to protect against misuse of AI. The Bill should be amended so that it mirrors EU's AI/Data Act, which we believe provides better data protection with regards to AI.
The Prime Minister has said he wants “guard rails” to protect society from the misuse of AI.
We believe there is no excuse for Government inaction, and that failure to implement protective clauses on the lines of the EU's AI Act may risk damaging AI innovation and development in the UK.
The Petition is on: https://petition.parliament.uk/petitions/652982.
Forthcoming Data Protection Courses
The following BCS Practitioner or Foundation courses can be attended in person, or via Zoom, or as a mixture (i.e. part Zoom, part attendance just in case “stuff happens on the day”).
- Data Protection PRACTITIONER Course is in London or Zoom on Monday, 18 March to Friday, 22 March (5 days: 9.30am to 5.30pm).
- Data Protection FOUNDATION Course is in London or Zoom on Tuesday, 9 April to Thursday, 11 April (3 days: 9.45am to 5.00pm).
- All-day Zoom workshop (10.00am to 4.30pm) on the DATA PROTECTION AND DIGITAL INFORMATION BILL held on Thursday, 9 May. Email [email protected] for the workshop agenda or to reserve a place on this session.
Finally, we have launched a specific qualification for data protection specialists in the Education sector.
Further details of all the above are on the Amberhawk website (www.amberhawk.com).
References
Regulation 2023/2854 (Data Act) on “Harmonised rules on fair access to and use of data” (published in the OJ on 13 December 2023). See also my Hawktalk blog dated 5 Jan 2024.
Details of AI Act: https://www.europarl.europa.eu/news/en/press-room/20231206IPR15699/artificial-intelligence-act-deal-on-comprehensive-rules-for-trustworthy-ai