Text Updated 6 March 2024
Most people will agree that the promised “Brexit benefits” have yet to manifest themselves in physical form. This is especially the case with the Data Protection and Digital Information (“DPDI”) Bill, which for three years has been touted by Ministers as the pre-eminent Brexit Bonus for Britain.
The Bill, it is claimed, combines a high level of data protection for data subjects with easier compliance for controllers and the wider exploitation of personal data: such data being the new oil in a free-running, modern, data-driven UK economy.
Readers of this blog will know I think this claim to be pure fantasy; the true position is that the Bill lowers the level of data protection for data subjects. The Bill only oils the wheels of business/public sector processing to the extent that it removes obligations and reduces rights.
However, the recent publication, just before Xmas, of the European Union’s Data Act and agreement on the EU's AI Act add another dimension. The former Act's text is in the public domain; the latter's text has yet to appear.
The provisions of both imply that the DPDI Bill is likely to damage UK controllers who intend to develop or sell Artificial Intelligence (AI) expertise, products or consultancies into Europe. Such damage negates the reasons given by Ministers as to why the Bill was needed.
Put simply, if AI controllers based in the UK ignore the EU's Data Act (and adhere to the DPDI Bill) they may experience difficulty trading with partners in the European Union. If they comply with the Data Act, then implicitly they are accepting GDPR standards of data protection for the UK (and not the lower standards of the DPDI Bill). The combination of adhering to both the Data Act and the DPDI Bill (once enacted) is not a viable option.
This blog explains why this is the case.
Legislative catch-up needed?
Whilst the Government has been gathering AI plaudits by hobnobbing with the great and the good at Bletchley Park conferences on AI, those “pesky Europeans” have stolen a legislative march. The EU will have comprehensive legislation in place in each Member State governing AI from September 2025; by contrast the UK has yet to draft a single clause.
This creates a problem for the UK Government; it has three options in a game of “regulatory catch up”. It can:
- Continue to believe in Brexit (a “do nothing” option). As Government believes its own rhetoric (e.g. that the DPDI Bill’s “high standard” of data protection does not jeopardise the Adequacy Agreement with the EU), it follows that nothing changes with the EU's Data Act or AI Act. The UK can thus ignore the Data Act and take “control of its laws” to its own timetable (e.g. introduce AI legislation, only if needed, after it has won the next General Election).
- Prepare amendments to the DPDI Bill for its House of Lords’ Report stage in April/May. This involves Government having a set of private discussions with its friends (as it did with the DPDI Bill) on how to respond to the Data and AI Acts. The DPDI Bill provides a vehicle for any AI amendment, which can then be pushed through Parliament with minimal scrutiny at Report stage. (In much the same way, 125 pages of amendments were pushed through Report stage in the House of Commons in December.)
- Halt the DPDI Bill for four months. This allows a public consultation on AI legislative options to take place, allows time for amendments to be prepared based on that consultation, and allows time for them to be scrutinised properly by Parliament. If Government acts quickly, completion of the DPDI Bill is still possible before the summer recess in July, so the delay is unlikely to interfere with the Prime Minister’s suggested timetable of a General Election from the Autumn.
The Data Act (Regulation 2023/2854)
In the EU, AI processing will be subject to both the Data Act and the AI Act. This part of the blog relates to the Data Act, as its text is available (unlike that of the AI Act at the time of writing).
The Recitals to the Data Act (i.e. Regulation 2023/2854 – see references) spell out its purpose. It is to provide a Europe-wide standard covering “The proliferation of products connected to the internet” which “has increased the volume and potential value of data for consumers, businesses and society”.
The objective is to encourage reuse and interoperability of the data from different domains in order to “increase competitiveness and innovation and ensure sustainable economic growth”. However, “barriers to data sharing prevent an optimal allocation of data for the benefit of society”.
These barriers include “a lack of incentives for data holders to enter voluntarily into data sharing agreements, uncertainty about rights and obligations in relation to data, the costs of contracting and implementing technical interfaces, the high level of fragmentation of information in data silos, poor metadata management, the absence of standards for semantic and technical interoperability, bottlenecks impeding data access, a lack of common data sharing practices and the abuse of contractual imbalances with regard to data access and use”.
The Act therefore “responds to the needs of the digital economy” by “removing barriers to a well-functioning internal market for data” by establishing “a harmonised framework specifying who is entitled to use product data or related service data, under which conditions and on what basis”. Clearly, this includes processing for AI purposes.
All the above statements could have been uttered by any Brexit loving Minister concerning the reasons for introducing the DPDI Bill.
Protecting data subjects
Divergences arise when it comes to protecting data subjects in the forthcoming AI revolution. Whereas the DPDI Bill essentially reduces data protection standards to ease AI processing, the Data Act stresses that the fundamental rights and obligations that protect personal data are safeguarded by the GDPR (and by PECR). There is no reason to believe the AI Act is different.
These two “legislative acts provide the basis for sustainable and responsible data processing, including where datasets include a mix of personal and non-personal data”. Note the implication that non-compliance with these two legislative acts (e.g. by exempting GDPR provisions via the DPDI Bill) is unsustainable and irresponsible.
The Data Act therefore “is without prejudice” to the GDPR and PECR and “no provision [of the Data Act] should be applied or interpreted in such a way as to diminish or limit the right to the protection of personal data or the right to privacy and confidentiality of communications” (my emphasis: the confidentiality of communications requirement is found in Article 5(3) of the ePrivacy Directive 2002/58/EC).
Any processing of personal data should comply with the GDPR “including the requirement of a valid legal basis for processing under Article 6”, and “where relevant, the conditions of Article 9” (A.9 deals with lifting the prohibition on processing special category personal data, such as health records).
The Data Act “does not constitute a legal basis for the collection or generation of personal data by the data holder” (my emphasis). In other words, any personal data processed has to have an A.6 lawful basis.
Where the user [of an AI system] is not the data subject, the Data Act “does not create a legal basis for providing access to personal data or for making personal data available to a third party and should not be understood as conferring any new right on the data holder to use personal data generated by the use of a connected product or related service” (my emphasis).
The Data Act “should not adversely affect the data protection rights of data subjects”. For instance, with respect to the right to object to certain disclosures of personal data, a disclosure request can be satisfied by “anonymising [the shared] personal data” (i.e. once anonymised, the right would not apply).
There is no change in the Data Act to the right not to be profiled or not to be subject to automated decision-taking in A.22 of the GDPR.
With respect to subject access, “where the readily available data contains personal data of several data subjects”, the software can be designed to transmit “only personal data relating to the user” (in this case, the user is the specific data subject making the request).
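The approach described above can be sketched in a few lines of code. This is a minimal illustration only: the record layout, field names and function name are my own assumptions, not taken from the Data Act or any statutory text.

```python
# Hypothetical sketch: building a subject-access response so that only
# records relating to the requesting data subject are transmitted, even
# though the readily available data covers several data subjects.

def extract_subject_data(records, requesting_user_id):
    """Return only the records that relate to the requesting data subject."""
    return [r for r in records if r.get("user_id") == requesting_user_id]

# Illustrative data from a connected product used by several people.
readily_available = [
    {"user_id": "alice", "reading": 21.5},
    {"user_id": "bob", "reading": 19.0},
    {"user_id": "alice", "reading": 22.1},
]

# Only Alice's records are transmitted; Bob's are filtered out.
print(extract_subject_data(readily_available, "alice"))
```

The design choice is that the filtering happens in the software before transmission, so other data subjects' personal data never leaves the data holder.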
In addition “the principles of data minimisation and data protection by design and by default are essential when processing involves significant risks to the fundamental rights of individuals”.
Data sharing falling within scope of the Data Act should include “pseudonymisation and encryption”, but also the application of “technology that permits algorithms to be brought to the data … without the transmission between parties or unnecessary copying of the raw or structured data themselves”.
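One of the safeguards mentioned above, pseudonymisation, can be illustrated with a short sketch. This is one common technique (a keyed hash), not a method prescribed by the Data Act; the key name and record fields are illustrative assumptions.

```python
# Hypothetical sketch of pseudonymisation before data sharing: a keyed
# hash (HMAC) replaces the direct identifier. The secret key, held
# separately by the data holder, is the additional information that
# would be needed to link the pseudonym back to the individual.
import hashlib
import hmac

SECRET_KEY = b"held-separately-by-the-data-holder"  # illustrative only

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

record = {"user_id": "alice", "reading": 21.5}
shared = {**record, "user_id": pseudonymise(record["user_id"])}
print(shared)  # same reading is shared, but no direct identifier is transmitted
```

Because the same identifier always maps to the same pseudonym, the recipient can still link records belonging to one individual without ever learning who that individual is.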
What one discovers is that the DPDI Bill has provisions that negate or vary all the above Data Act strictures; one assumes the same is true of the AI Act.
A contrast in protection
In summary, under the EU’s Data Act, processing for AI purposes can proceed, subject to the data protection standards of the GDPR (and PECR). In the UK, processing for AI purposes can proceed subject to the weakened standards of the DPDI Bill. (Further detail in the Hawktalk blog dated 31 Oct 2023.)
There are six points of divergence between the Data Act and the DPDI Bill worthy of description in this blog; the same is likely to apply to the EU's AI Act.
First, the definition of “personal data”. The Data Act defines “personal data” to be the GDPR’s definition (in Article 4(1)) and defines ‘non-personal data’ to mean “data other than personal data”. By contrast, the DPDI Bill redefines “personal data” to be at a standard demonstrably lower than the UK’s Data Protection Act 1984 and the 1981 version of the Council of Europe Convention No 108 (detail is in the Hawktalk blog dated 13 Apr 2023).
The DPDI Bill is designed to reduce the “data” that can be classified as “personal data”. It means that what the Data Act considers as “personal data” could well be treated by the DPDI Bill as “data” (i.e. free of any data protection right or obligation).
It then follows that the Bill’s reduction in the scope of “personal data” means the UK implementation of AI rules risks being considered defective when compared with similar Data Act requirements (which rely on the GDPR definition of “personal data”).
It is worth reminding readers that the DPA1998 was deemed to be a defective implementation of Directive 95/46/EC because of the Durant decision in the Court of Appeal, which weakened the scope of the definition of personal data. The reasons for the EU’s infraction proceedings against the UK over this definition are still being withheld under FOI (latest attempt in 2022).
Second, the new lawful basis for sharing personal data with the public sector (Recognised Legitimate Interest). This is contrary to the Data Act, which limits the list of lawful bases to those in Article 6 of the GDPR.
So if a UK public sector controller were to rely on the new Recognised Legitimate Interest lawful basis to ask for disclosure of personal data to support its AI work, then it would be processing personal data below the standards specified in the Data Act.
Not only that, the data subject’s right of erasure with respect to such disclosed personal data would not apply to any public authority that obtained the personal data via a Recognised Legitimate Interest disclosure. In addition, the right to object would be difficult to apply. This diminishing of data subject rights is also contrary to the Data Act standards, which maintain all GDPR data subject rights.
Third, exemptions that cover AI algorithm training and development. The DPDI Bill provides a new exemption from transparency and incompatibility requirements for AI training and development by controllers and related Third Parties. These exemptions are subject to certain safeguards (which are of dubious standard) and apply irrespective of the nature of the AI purpose.
Indeed, this new exemption from the right to be informed raises the prospect of secret transfers of personal data to Third Parties in third countries for their AI purposes. (Detail in the Hawktalk blog of 31 Oct 2023).
Finally, the new exemption from the right to be informed in the DPDI Bill for AI training and development arises because of the Bill’s expanded definition of “scientific research”; this exemption is not replicated in the GDPR. This is also contrary to the Data Act which maintains all GDPR data subject rights.
Fourth, the reversal of the right not to be subject to automated decision-taking (A.22). The Data Act maintains the primacy of A.22 of the GDPR, which, in general, prohibits the data subject from being subjected to automated AI decisions or profiling.
Instead, the GDPR defines when a controller can take such automated decisions (e.g. decisions necessary for a contract or likely contract with the data subject; decisions authorised by law or by the consent of the data subject).
By contrast, the DPDI Bill allows automated AI decisions and profiling to be taken in general, and instead defines the circumstances when a controller cannot make such decisions (e.g. such a decision is permitted where the processing is justified in terms of a Recognised Legitimate Interest – the new A.6 lawful basis for the UK).
This is a major change in protection for data subjects because in the UK decisions about individuals can be expected to be taken by AI as a matter of routine. Yes, data subjects in the UK might be able to make representations concerning the automated decision but I suspect in practice this is of little value. This is because managers will very rarely have the courage to overturn a decision of a “tested” AI Algorithm.
Fifth, the role of Data Protection by Design and by Default. When testing AI training or AI algorithms, the Data Act states that “the principles of data minimisation and data protection by design and by default are essential”. By contrast, the DPDI Bill removes this “essential” element; instead it merely states that the “appropriate safeguard” is to “show respect” for these elements.
Sixth, reducing the independence of the Information Commissioner weakens protection for the data subject. A data protection regulator that has other priorities (e.g. economic objectives) grafted on to them cannot fully protect the data subject’s interests when these priorities conflict with data protection objectives.
In addition, the Secretary of State (SoS) can effectively control the replacement Commission when the Office of Information Commissioner is abolished. Detail of how the SoS chooses all the voting, non-executive members of the Commission can be found in the Hawktalk blog dated 19 Oct 2023.
Concluding commentary
The UK has a population of around 67 million data subjects; the EU a population of around 450 million. Suppose you are looking at using AI to offer services to the EU and UK populations, or at selling AI expertise/products into the EU and UK. Now answer the following questions.
- Whose AI training regime presents the greater risk to your organisation? Is it one designed under UK data laws (if they appear) or under the EU Data/AI Acts?
- Whose AI products and services will your customer, the data subject, trust more? Is it one designed under UK data laws or under the EU Data/AI Acts?
- Whose AI products and services expose your organisation to the greater risk if your new emergent AI technology goes horribly wrong? Is it one designed under UK data laws or under the EU Data/AI Acts?
If your answer is UK data laws, then clearly European standards are chucked out of the window, with obvious consequences for the risk to trade with Europe. If your answer is the EU Data/AI Acts, then implicitly you are bringing GDPR standards of data protection into the UK, and not the DPDI Bill’s reduced standard.
In short, it is the answers to questions like these which explain the headline to this blog: why the combination of the Data/AI Acts and the DPDI Bill strangles the UK’s emerging AI industry.
Of course, if you still believe in the benefits of Brexit, the previous sentence is obviously false.
Data Protection Courses in early 2024
Our well received, all-day Zoom workshop (10.00-4.30) on the Data Protection and Digital Information Bill will next be held on Thursday 29 February 2024. The fee will be £250+VAT. Email [email protected] for workshop agenda or to reserve a place on this session.
The following BCS Practitioner or Foundation courses can be attended in person, or via Zoom, or as a mixture (i.e. part Zoom, part attendance just in case “stuff happens on the day”).
- Data Protection PRACTITIONER Course is in London on Monday, 18 March 2024 to Friday, 22 March 2024 (5 days: 9.30am to 5.30pm).
- Data Protection FOUNDATION Course is in London on Tuesday, 9 April 2024 to Thursday, 11 April 2024 (3 days: 9.45am to 5.00pm).
References
Regulation 2023/2854 (Data Act) on “Harmonised rules on fair access to and use of data” (published in the OJ on 13 December 2023)
Details of the AI Act (text not available): https://www.europarl.europa.eu/news/en/press-room/20231206IPR15699/artificial-intelligence-act-deal-on-comprehensive-rules-for-trustworthy-ai
You forget one other consideration when designing AI products: should I design only in English, giving me access to the UK and possibly most states of the US and the anglosphere, or do I take on the far greater challenge of developing a multilingual AI product which can stand up to stringent EU legislation? And remember, for all the backslapping of the Eurocrats over having finally designed business friendly legislation (yeah right), it will be implemented by the same courts that time and again ruled in favour of Mr Schrems.
Posted by: JimMcNeill | 08/01/2024 at 08:33 AM