Last week, the Prime Minister was quoted on the need to ensure Artificial Intelligence (AI) is “introduced safely and securely with guard-rails in place”. Strange, then, to find that he appears to be unaware that several of these urgently needed guard-rails are being dismantled by the DPDI No.2 Bill (the “Bill”).
On June 6th, I delivered a presentation to the Data Protection Forum, where there was lively discussion of several issues in the Bill that weaken data protection.
I have decided to repeat that lecture, this time as a video so readers can follow the detail of those discussions. The video (if interested see URL at the end) lasts 51 minutes (171 MB).
For those who do not have the time to hear my dulcet tones, the rest of the blog summarises the points I was making.
Bogus savings for MPs
First (slide 4), the savings figures produced by the Government are suspect; indeed, they are “put them on the side of a Brexit Bus” dishonest. The Minister, for example, boasted in the Commons that £90 million per year can be used by SMEs for “higher investment, faster growth and better jobs”.
Well, if you do the maths: there are about 1 million SMEs registered with the ICO, so £90 million a year works out at £1.73 per SME per week. I could be wrong, but I don’t think one obtains much “higher investment, faster growth and better jobs” for £1.73 per week per SME.
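For anyone who wants to check the arithmetic, here is a minimal sketch (assuming the Government’s £90 million annual figure and roughly 1 million SMEs on the ICO register, as quoted above):

```python
# Back-of-the-envelope check of the Minister's £90 million claim.
# Assumed inputs: the Government's £90m annual saving and ~1 million
# SMEs on the ICO register, as quoted in this blog.

annual_saving_gbp = 90_000_000   # £ per year (Government figure)
sme_count = 1_000_000            # SMEs registered with the ICO
weeks_per_year = 52

per_sme_per_year = annual_saving_gbp / sme_count       # £90.00
per_sme_per_week = per_sme_per_year / weeks_per_year   # ~£1.73

print(f"Saving per SME per year: £{per_sme_per_year:.2f}")
print(f"Saving per SME per week: £{per_sme_per_week:.2f}")
```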
No Brexit Dividend
The next point is that, contrary to Government claims (mindlessly repeated by others, I hasten to add), there is no “Brexit dividend” arising from the UK departing from GDPR rules.
In slides 5 to 10, I show that the GDPR does not put barriers in the way of innovation and proof-of-concept processing, and that this mechanism is open to all of the UK’s European competitors. If they follow the recipe I outline, they are unlikely to be hindered by the GDPR until they are satisfied that the proof of concept works and an implementation phase begins.
Operational systems exposed
Large differences between the Bill and the unamended UK GDPR arise when one moves from “proof of concept” processing of personal data to the implementation phase (this comparison is made in slides 11 and 12).
Here I show that a high-risk implementation (e.g. of AI) might now not get a DPIA, or expert DP advice from a DPO, or even be reported to the ICO if there is a residual high risk to data subjects that cannot be mitigated. In other words, the Government’s changes mean that high-risk AI processing could be exposed to a lack of data protection input (especially if that processing is undertaken by a start-up company).
Not only that: the Bill gives Ministers powers to determine lawful processing, compatible processing, the ICO’s statutory priorities, the content of Codes of Practice and the right not to be subject to automated decision-taking, and to ensure the ICO has to consider non-DP factors prior to enforcement. Collectively, these powers weaken the hands on the privacy protection tiller.
There is talk of a 6-month moratorium on AI and conferences in the Autumn to decide what to do. Perhaps this Bill should be held in limbo for that time, because if these discussions conclude that legal protection is needed by society, this Bill would be well placed to put such protections quickly into effect.
Defective definition
In slides 13 to 16, I return to the definition of “personal data” as it risks weakening protection. I present my analysis which shows that Facial Recognition Cameras, as used by South Wales Police, might not be processing personal data according to the revised definitions.
I also show that the definition of “personal data” falls below the standard set forty years ago by the DPA 1984.
Datashare? Rhymes with nightmare
Finally, I touch on the expansive voluntary data sharing regime (slides 17-22) and explain how Ministerial powers enable Ministers to make their pet projects process personal data lawfully (by adding a lawful basis to Annex 1 of the Bill) and also to ensure they are compatible with the processing (by adding to Annex 2).
I show that the balancing tests between the interests of data subjects and the interests of the controller, which protect the public from unlawful disclosure, have been abandoned. These protections arise because there is either a “public interest” test associated with the disclosure to a public body, or the disclosing controller performs a “legitimate interests” test prior to disclosure to that public body.
Either way, the Bill’s legislative changes provide a prime example of how the Bill eases the burden of data sharing for controllers at the expense of privacy protection for data subjects.
I also note that there is little the data subject can do if the data sharing goes pear-shaped. The underlying assumption is that data sharing is a “good thing”; no thought has been given as to what happens if that assumption is wrong.
Finally, I pose a question: “what is the point of Parliament providing HMRC, for example, with statutory powers to demand personal data, subject to detailed statutory safeguards, if both the statutory requirements and safeguards can be avoided by using the voluntary approach towards data sharing pursued in this Bill?”.
Further weakening
Other weaknesses are mentioned in slide 3. I had no time to develop them in my talk, so I list them here:
- Third party marketing is a “legitimate interest”; this overturns four decades of “third party marketing requires consent”.
- Policy towards the ECHR jeopardises the Adequacy Agreement. The Illegal Migration Bill provides an example of powers that allow Ministers to ignore Strasbourg Human Rights rulings; the Agreement states that the UK Government will follow these rulings.
- The right not to be subject to automated decision/profiling (likely to be important for AI) is diminished in importance as Ministerial powers can dictate when such decisions/profiling can occur.
- The risks from onward transfers remain (e.g. France to UK to Gibraltar). Gibraltar is adequate as far as the UK is concerned; it is not adequate as far as the European Commission is concerned. I am expecting many large controllers will have to deal with this situation as the Bill’s transfer arrangements are very generous.
As an aside, I am expecting Rwanda to be determined as offering an adequate level of protection. After all, if the UK is dispatching illegal immigrants to that country, it would be embarrassing if Rwanda failed the tests presented by the ICO in his “Transfer Risk Assessment” documentation.
I wonder if DSIT, Home Office, FCO or any one of the ICO’s thousand staff have done this TRA?
Concluding comment
The Government truncated debate at the Committee stage and pushed its legislative programme through in six sittings in less than 10 days in May; such is the disdain held for privacy protection. Even then, the debates have been lamentable and it appears that there has been little input from NGOs working in the privacy space.
I suspect that either the Opposition does not understand the nuances of the situation or that it supports the Government’s motives behind the Bill. I have not worked out what their position is, yet.
Hopefully, the video will help redress that shortfall for the House of Lords Debates. The Video (51 mins) is accessible at: https://www.dropbox.com/s/9s424y1r13v9dv9/DP%20forum%202023%20june%20%20v2%20lowq.mp4?dl=0
Data Protection Courses (Summer 2023)
An all-day Zoom workshop (10.00am-4.30pm) on the Data Protection and Digital Information No 2 Bill will be held on Thursday 13 July 2023; it will hopefully include changes made during the Committee stage of the Bill. The fee will be £250+VAT. Email [email protected] for the workshop agenda or to reserve a place on this session.
The following BCS Practitioner or Foundation courses can be attended in person, or via Zoom, or as a mixture (i.e. part Zoom, part attendance just in case “stuff happens on the day”).
- The next Data Protection FOUNDATION Course is on September 19-21 2023 (Tuesday to Thursday, 3 days: 9.45am to 5.00pm) or
- The next Data Protection PRACTITIONER Course is in London on Monday, 24 July 2023 to Friday, 28 July 2023 (5 days: 9.30am to 5.30pm).
Full details are on the new Amberhawk website (www.amberhawk.com) or can be obtained by emailing [email protected].