The following is a summary list of nineteen Data Use and Access Bill (DUAB) provisions which act to the detriment of data subjects. Because of its length, I have split these issues across two blogs; Part 2 of this set will be published on Thursday.
All these issues have been raised in my evidence to the Public Bill Committee which is considering DUAB; you can access this evidence via the link in the references below. The Committee stages start on Tuesday March 4, and are expected to end on Tuesday March 18 at 5pm.
I suspect the rush is because economic issues are overriding any privacy concerns, some of which I outline in the two blogs.
Threat to Adequacy Agreement?
This is an important question, given that renewal of the Agreement is supposed to be in the summer.
There is no “killer issue” (like the revised definition of “personal data” as was proposed in the previous two DPDI Bills of the last Government or degradation of Article 8 ECHR). However, in combination, I think there might be a collective problem arising from:
- The exemption from all the Principles for AI training when using an existing personal dataset (new A.8A(3)(c)). If that AI training processing falls within the definition of “scientific research”, the processing can additionally be exempt from the right to be informed and the right to object, and unrestricted transfers between Third Parties can occur.
- The exemption from Article 6 via Recognised Legitimate Interests, which provides an expansive gateway for voluntary disclosure to any public body for its public tasks. Other purposes listed in Annex 1 are already catered for via the existing legitimate interests lawful basis.
- The exemption from “incompatibility of further processing” requirements (Article 6(4)) for purposes listed in Annex 2 negates the need for exemptions in Schedule 2, DPA2018.
- The undermining of statutory disclosure gateways that public bodies like HMRC and the security services use; DUAB provides an alternative voluntary gateway.
- The power of Ministers to designate a controller in any country as offering an adequate level of protection when that country itself is not adequate.
- The limits placed on the right not to be subject to an automated decision.
- The lack of enforcement by the ICO, who is fettered by economic considerations.
- Non-compliance with Strasbourg jurisprudence in Gaskin v UK.
- The fact that exemptions are not being introduced via the mechanism in Article 23 which requires exemptions to be “necessary”, “proportionate” and in the “general public interest”. The implication is that the new exemptions might fail these three tests.
Looking at these two blogs, I have come to one conclusion. Either I inhabit a different planet from the Information Commissioner, or the Commissioner is asleep at the data protection wheel. I will leave readers to decide which option they prefer.
What follows is a list of issues and possible Parliamentary action MPs could take. In this latter regard I am not expecting any change, as the official opposition supported these measures in their DPDI efforts.
Lawful basis that undermines Parliament
DUAB provides Ministerial powers that deem certain processing to always be lawful (via Recognised Legitimate Interests) and compatible with the purpose for which the data were obtained (via the purposes listed in Annexes 1 and 2 as introduced into the UK_GDPR). These last two powers were deemed to be excessive and unnecessary by the House of Lords Delegated Powers and Regulatory Reform Committee (9th Report of Session 2024–25, HL Paper 49: 28 November 2024).
In summary, Paragraph 1 of Annex 1 provides for unrestricted voluntary data sharing to any public body for any of its public tasks; the remaining purposes listed in Annexes 1 and 2 are already covered by Article 6(1)(f), or the exemptions in Schedules 2 to 5 of DPA2018, or a combination of both. Further detail can be found in “DUAB makes function creep in the public sector inevitable and lawful” (Hawktalk; December 2024).
These provisions also undermine Parliamentary approval of statutory data sharing powers. Paragraph 1 of Annex 1 provides an alternative pathway to the use of powers to demand personal data which Parliament has already approved.
Many public bodies (e.g. HMRC; DWP) can either demand personal data via their many powers, or ask for the same personal data under the new voluntary pathway DUAB provides. The latter approach, if successful, negates the requirement for any statutory protection that Parliament has imposed for such mandatory disclosures. (See later in relation to a similar problem with national security agencies).
Action: The provisions in Clauses 70 and 71 and both Annexes 1 and 2 should be removed, especially paragraph 1 of Annex 1 and Annex 2. DUAB should specify that public bodies should use their statutory powers as Parliament intended before inviting voluntary disclosure.
AI: Precautionary Principle ignored
DUAB provides Ministerial powers that negate the right not to be subject to automated decisions; it significantly degrades the protection afforded to data subjects when AI uses ordinary personal data to make automated decisions about them (new A.22D). Having a human review of an automated decision is not a safeguard, as the human reviewer will usually confirm the automated decision; it does not remove any central unfairness in the decision.
Informing data subjects about the automated decision is not a safeguard if the data subject has no choice but to submit to such a decision.
The time to legislate for removal of the A.22 right is when one has actual experience of AI automated decision making, not before such decision making occurs; here the Government has abandoned the Precautionary Principle. As explained below, it could take up to 6 months to contest an automated decision with the ICO.
Action: All Article 22 modifications should be subject to a commencement provision when there has been actual and substantial experience of automated decision making using AI. The ICO can update Parliament on this issue prior to commencement.
Designating controllers as adequate
DUAB provides Ministerial powers to specify high tech firms outside the UK (i.e. established in a Third Country) as offering an adequate level of data protection even though the Third Country itself is not deemed to be adequate (new Article 45A(4)(c)(ii)). Ministers can, via negative resolution, specify for example, “Palantir in the USA is adequate to process NHS records” whilst the USA itself is not adequate. These powers are excessive, especially as these kinds of transfer can be subject to the ICO’s standard contract clauses instead.
The expansive powers in Schedule 7 that replace the transfer provisions in the UK_GDPR create a risk to the Adequacy Decision with the European Commission because the UK and Commission can significantly diverge on their assessment of the level of data protection in a Third Country.
Action: all Schedule 7 transfer arrangements should be removed. At the very least, the negative resolution procedure should be replaced by the affirmative procedure.
National security evades DPA2018
The national security agencies (MI5, MI6 & GCHQ) are protected from the DPA2018 because the ICO cannot use his enforcement powers (see section 110, DPA2018); this leaves these agencies outside the scope of enforceable data protection rules.
Instead of no enforcement, responsibility for enforcing data protection obligations in relation to national security processing could be transferred to the Investigatory Powers Commissioner. This would help reassure the public that, as government expands the meaning of national security (as it has indicated in the context of illegal immigration), Part 4 of the DPA2018 and its enforcement provisions apply to such processing (rather than not applying, which is the current situation).
Action: instead of removing the ICO from any enforcement of national security processing of personal data, a simple modification to Section 110 of DPA2018 should transfer responsibility for data protection enforcement to the Investigatory Powers Commissioner (who has the requisite security clearance to investigate complaints).
ICO should not designate
Because of the wide range of the Section 110 exemption (discussed immediately above), it is rather pointless for the ICO to have any responsibility for looking at any designation notice (Clauses 82A to 82E). The impact of designation is to transfer personal databases from police control, which can be enforced by the ICO (Part 3 of the DPA2018), to the control of national security agencies (Part 4 of the DPA2018), which is not enforced by the ICO.
Action: To be consistent, responsibility concerning consultation/supervision of the designation process should also be transferred from the ICO to the Investigatory Powers Commissioner.
Back door bulk data collection?
The Investigatory Powers Commissioner should be tasked with inquiring whether the exemption in Sections 26 to 28 of DPA2018 (this applies to the voluntary arrangements for disclosure of personal data, from a controller subject to the UK_GDPR, to the national security agencies) overlap with the bulk data sharing provisions in the Investigatory Powers Act 2016. This is to reassure the public that there is no “back door” method of bulk dataset collection. In DUAB, for instance, an ANPR database held by the police can be designated for transfer to the national security agencies (see also paras 4 and 5 above for other problems).
With bulk data collection by these national security agencies, there is now a statutory route (Investigatory Powers Act) and a voluntary route (DUAB and DPA2018) where one route is subject to statutory protection and the other is not (exactly like paragraph 1 above).
It is noteworthy that these agencies have a track record of using powers enacted decades earlier for their personal data collections. See “Section 94 of the Telecommunications Act 1984: a warning from history” (Hawktalk ; November 2015). That is why reassurance is required.
Action: the Investigatory Powers Commissioner should be required to include in his Annual Report a section on how the national security agencies comply with the provisions in Part 4 (DPA2018).
Six month delay for complaints
When the data subject complains to the ICO, the ICO expects the complainant to have exhausted the controller’s complaint resolution procedures. DUAB formalises this step and introduces at least a 30-day delay before data subjects can complain to the ICO (new sections 164A(3) and 164A(4)). Note that the 30-day limit relates to acknowledging the complaint and not its resolution; resolving the complaint is associated with a further unspecified time.
Suppose an A.22 complainant writes to the controller about an automated processing decision. Up to thirty days after complaining, the data subject gets an acknowledgement and then, after an unspecified time, the outcome of the complaint. Only then can there be an approach to the ICO, who usually takes 4 to 6 months to form a view. Faced with this six-month delay, many data subjects won’t bother complaining to the ICO about the automated decision (or about anything else). I suspect that this will be the same for many complaints about controllers.
Action: Exclude A.22 automated decisions from the Clause 164A proposals and ensure that the 30-day limit relates to resolution of the complaint, not the fact that a complaint has been received.
Razzamatazz with RAS/AI training
The exemption for scientific research (RAS) purposes introduced by new A.13(5) to A.13(7) and A.14(4) to A.14(6) (right to be informed) does not follow the usual GDPR mechanism for implementing exceptions to rights specified in A.23. The definition of scientific research includes some AI development functionality, which is a concern.
Additionally, the “appropriate safeguards” which support this exemption are demonstrably unreliable (e.g. they undermine the Data Minimisation Principle). The changes also deem RAS processing to be always compatible, and disclosures to Third Parties for RAS purposes can be subject to an exemption from transparency (even when personal data are transferred to a Third Party outside the UK).
It does not matter what the nature of the research or AI purpose is. Full details can be found in “Data Bill legislates for expansive degradation of data subject protection” (Hawktalk; February 2025).
Action: Remove the exemption from A.13 and A.14; the Secretary of State should be told that he has powers to reintroduce it via A.23(1)(e).
Gaskin v UK ECHR ruling ignored
DUAB fails to correct the non-implementation of the ECHR Judgement in the case of Gaskin v UK (1989) 12 EHRR 36. I have included this because the Adequacy Agreement with the European Commission assumes that Strasbourg jurisprudence on the ECHR/data protection is followed by the UK, and this is an example where it isn’t. Such non-compliance is a red line for the Commission.
Details of the Gaskin case as it applies to data protection can be seen under the heading “Confidential References exemption (DPA1998)” (See the middle of the text in Hawktalk blog; December 2024).
Action: Change the exemption in Schedule 2, paragraph 24 of the DPA2018 so it only applies to the sender of the confidential reference (i.e. return to the position established in the DPA1998).
To be continued in Part 2 (Thursday)
References
Download my evidence to the House of Commons Public Bill Committee: “Problems with DUAB – Committee evidence”.
Spring Data Protection Courses
Amberhawk is holding our first session on the changes to the UK’s data protection regime arising from the Data (Use and Access) Bill, by Zoom, on Tuesday March 25: (10.00am-4.30pm; £275+VAT).
The following courses follow the BCS Practitioner or Foundation syllabi on the DPA2018/UK_GDPR and can be attended in person, or via Zoom, or as a mixture (i.e. part Zoom, part attendance just in case “stuff happens on the day”).
- Data Protection FOUNDATION Course: London on March 11-13 (Tuesday to Thursday, 3 days: 9.45am to 5.00pm).
- Data Protection PRACTITIONER Course: London on March 17-21 (Every day this week: 5 days: 9.30am to 5.30pm).
More details on the Amberhawk website: www.amberhawk.com or email info@amberhawk.com.