The Prime Minister and the Chancellor of the Exchequer have gone "all in" with Artificial Intelligence (AI) in the expectation that it generates economic growth. They have told the main UK Regulators, including the ICO, to ease off on general enforcement if such enforcement creates a serious risk to that growth.
To ensure Regulators fall obediently in line, the chair of the UK's competition Regulator was removed by the government in late January, to be replaced by someone more “amenable” to the growth agenda. As it happens, our own ICO is ahead of the pack; not enforcing the UK_GDPR on economic grounds has been established policy for two years now.
To illustrate the degradation of the data subject position from the UK_GDPR, the blog explores the flexible provisions for “RAS processing”. This is defined broadly in the Data (Use and Access) Bill (DUAB) to include the scientific R&D purpose and some AI training or AI product development in the public and private sectors.
To assist controllers, DUAB provides new exceptions from data subject rights, most notably the right to be informed about the processing, the right to object to the processing, and the right not to be subject to automated decisions. It also permits personal databases to be exchanged with Third Parties in secret for their RAS purposes.
Also relevant to RAS processing are the general provisions that reduce the protection for data subjects with respect to transfers outside the UK, compatibility, transparency, lawfulness of processing, data minimisation and governmental control over the ICO’s independence.
With the UK aligning itself with the AI policies of a quixotic Trump regime (as opposed to those of the European Union), I would not be surprised if murmurings about whether to renew the UK Adequacy Agreement begin to surface.
Listing the concerns
The following long list summarises the impact of DUAB provisions, some of which relate specifically to RAS processing while others are of general application. The provisions, all of which act to the detriment of data subjects, exist to:
- define scientific R&D as a subset of RAS processing that includes some AI testing and development, in order to ensure that the relaxation in UK_GDPR standards applies to the private sector for its scientific R&D processing;
- allow further processing of already collected personal datasets for private sector scientific R&D purposes, using the “legitimate interests” lawful basis. This lawful basis would cover not only the controller’s R&D but also disclosure of personal data to any Third Party for its R&D (and any onward disclosure to a chain of other Third Parties for their R&D);
- allow such legitimate interests processing to occur without the knowledge or approval of the data subject, subject to some weak “appropriate safeguards”. In other words, any further use by a controller (or by any Third Party) for scientific R&D purposes might not be transparent to data subjects;
- skew the legitimate interests balancing test for RAS processing in favour of controllers and Third Parties through the use of these appropriate safeguards;
- make it difficult for data subjects to exercise the right to object or the right to erasure in respect of the scientific R&D purpose. This arises because data subjects might not know who has been given their personal data and therefore do not know the identity of the controllers whom they have to contact concerning their rights;
- ensure that any further scientific R&D processing (i.e. after personal data have been collected by the controller) is always compatible with the original purpose of obtaining the personal data, even if it involves disclosure to Third Parties for their scientific R&D purposes. The nature of the scientific R&D purpose is irrelevant;
- make no provision for data subjects to retrieve their position, prior to the use or disclosure of their personal data for RAS processing;
- ensure that all AI processing of personal data, when testing the application of the data protection principles (e.g. fairness), is additionally exempt from incompatibility considerations. As with the scientific R&D purpose, it does not matter what that AI purpose is (e.g. from saving babies to the opposite);
- allow for data subject consent to be replaced by legitimate interests as the lawful basis of choice for scientific R&D and much AI training and development;
- weaken the application of the first three Data Protection Principles (transparency and lawfulness; purpose limitation; data minimisation) in the context of scientific R&D;
- provide Ministerial powers that can add descriptions of processing of special category personal data to the list in A.9(1) UK_GDPR in order to permit the processing of such special data in general;
- provide Ministerial powers that can make the processing for scientific R&D a Recognised Legitimate Interest (as was proposed in the previous DPDI Bills);
- provide Ministerial powers that deem certain processing to always be compatible with the purpose of obtaining (these last two powers were deemed to be excessive and unnecessary by a House of Lords Committee);
- provide Ministerial powers that can reduce further the current DUAB “appropriate safeguards” associated with the scientific R&D purpose;
- reduce the scope of the A.22 right not to be profiled so that it only applies to special category of personal data;
- provide Ministerial powers that can reduce the protection afforded to data subjects when AI uses personal data to make automated decisions about them;
- provide Ministerial powers to specify high tech firms outside the UK (i.e. established in a Third Country) as offering an adequate level of data protection even though the Third Country itself is not deemed to be adequate;
- introduce a 45-day delay for data subjects wanting to complain to the ICO, as they first have to approach the controller with their complaint (assuming they know who that controller is); and finally (!!)
- implement exceptions from rights without using the usual GDPR mechanisms specified in A.23; this could raise future problems associated with the European Commission’s adequacy determination or with judicial review of DUAB provisions.
The reasons for all the above are explained in the following blog text (that is why it’s long).
Lack of trust
Two quick questions for you:
- Should data subjects know which Third Party is using their personal data for scientific R&D?
- Should data subjects know whether their personal data are transferred by a controller to Third Parties outside the UK (e.g. to the USA) for their scientific R&D purposes?
With “legitimate interests” as the lawful basis for scientific R&D, DUAB’s answer to the above is “NO”.
The position outlined above is likely to undermine “trust” in other scientific R&D (and AI training) endeavours which do not rely on the lawful basis of legitimate interests. For instance, some R&D has to rely on data subject consent (e.g. scientific R&D that uses special category of personal data or confidential personal data); in these circumstances, there will be no change via DUAB and controllers will not be able to rely on legitimate interests as described below.
I think data subjects are increasingly likely to refuse consent for any R&D if, for example, they realise their ordinary personal data have been despatched to secret destinations for unrestricted processing for any scientific R&D/AI project, irrespective of the purpose or nature of that project (as permitted by DUAB).
No thought has been given as to how the data subject can retrieve the position if things go wrong (e.g. get back to the position prior to the further use or disclosure of personal data for scientific R&D).
Finally, the ICO’s light touch approach to enforcement is likely to mean that eventually, data subjects will lose faith that their privacy can be protected. The ICO has already lost the confidence of many NGOs and privacy academics working in the data protection space.
What is “scientific research”?
“Scientific research” is defined to be “any research that can reasonably be described as scientific, whether publicly or privately funded….” where this includes “processing for the purposes of technological development or demonstration, fundamental research or applied research, so far as those activities can reasonably be described as scientific…”.
DUAB includes scientific R&D as part of its definition of “RAS purposes” which extends to statistical and historical research. However, it can be seen that some private AI research or AI training can fall within the meaning of “scientific research”.
It is the extension into the private sphere that is likely to cause problems with A.23 and adequacy (see later).
Note added 28 Feb 2025. This text ignores the House of Lords amendment of the scientific research definition to include a "public interest test"; this requirement is being removed at the Committee stage of the DUAB in the Commons, restoring the definition used in the blog. Clearly a definition that allows for "privately funded" research in the "public interest" is likely to be contradictory, as such research is for a private interest.
Special category of personal data
In this regard, the Secretary of State (SoS) is seeking powers to include “additional description of processing of [special category of] personal data which is subject to the prohibition in A.9(1)”.
This is touted as a protective measure but it is not; it could allow the SoS to introduce sub-classification of existing special category of personal data linked to a particular context or purpose (e.g. on the lines used in the existing “biometric data for the purpose of uniquely identifying a natural person”; see A.9(1)).
For example, a class of special category of personal data such as “cancer records in the context of training AI algorithms” would make it easier to process such personal data in accordance with one of the conditions that lifts the prohibition.
Thus, far from prohibiting the processing of special category of personal data, the power to add context to existing special categories of personal data could expand the processing of these special categories without the explicit consent of the data subject (the option in A.9(2)(a)).
The “appropriate safeguards”
The appropriate safeguards associated with scientific R&D are wholly unconvincing.
However, before exploring this statement, a general point about the proposed new A.84B(1). This suggests that, on collection of personal data, a controller should convert the personal data into anonymised or pseudonymised form. That’s fine; but if this step is not feasible, the processing for scientific R&D can be fulfilled by using the unamended personal data [A.84B(1)(c)].
In other words, if a controller decides it can’t anonymise or pseudonymise, the unconverted personal data can still be used for scientific R&D; this undermines the point of the provision.
Remember also that when personal data are pseudonymised, the identity details of data subjects can easily be obtained (and further disclosed for Third Party scientific R&D processing, as required).
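The point that pseudonymisation is trivially reversible by whoever holds the key can be illustrated with a short sketch (illustrative Python only; the record fields and function names are invented for this example and come from neither the Bill nor the UK_GDPR):

```python
# Sketch: a pseudonym is just a lookup key. Whoever holds the key table
# (the controller, or any Third Party given it) can restore identities.
import uuid

def pseudonymise(records):
    """Replace the direct identifier with a random token; keep a key table."""
    key_table = {}        # token -> original identity, retained by the controller
    pseudonymised = []
    for record in records:
        token = uuid.uuid4().hex
        key_table[token] = record["name"]
        pseudonymised.append({"id": token, "condition": record["condition"]})
    return pseudonymised, key_table

def re_identify(pseudonymised, key_table):
    """Anyone holding the key table can recover the identities in full."""
    return [{"name": key_table[r["id"]], "condition": r["condition"]}
            for r in pseudonymised]

patients = [{"name": "Alice", "condition": "asthma"}]
pseudo, keys = pseudonymise(patients)
assert re_identify(pseudo, keys) == patients  # re-identification is trivial
```

This is why pseudonymised data remain personal data: the “protection” lasts only as long as the key table is withheld from the recipient.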
Note that the “collection of the personal data” is qualified by “whether from the data subject or otherwise”. As will be seen below, this covers the circumstances when any Third Party obtains personal data from any controller via the legitimate interests lawful basis.
All the appropriate safeguards apply [A.84B(2)] and the detail of these safeguards is found in the new A.84C. They can be summarised as follows:
- No decision or action is taken about a particular data subject unless the R&D is medical research approved by a medical ethics committee (of which several are identified). This is not a new safeguard; it is already in the UK_GDPR (as it was in the DPA1998 in S.33(1)).
- The processing for R&D purposes should not be “likely to cause substantial damage or substantial distress to a data subject”. This “safeguard” means that moderate damage/distress to any data subject (i.e. anything short of substantial damage or substantial distress) is acceptable to Government.
- Show “respect” for the Data Minimisation Principle. This touted “safeguard” is likely to reduce standards. Normally, a controller “shall apply” the Data Minimisation Principle (the A.5 requirement) rather than “respect” this Principle. This “safeguard” (through the use of the word “respect”) has the effect of lowering the standard set by the Data Minimisation Principle itself.
In any event, the S.o.S is given powers in DUAB to vary the appropriate safeguards (and you should be able to guess what that means).
“Consent” lawful basis
There is a more flexible definition of consent for scientific R&D which I am not concerned about, because the data subject can still withdraw consent (in theory) in such a way that the right of erasure applies (see A.17(1)(b)).
For instance, if the further processing for scientific R&D relies on data subject consent as the lawful basis, then obviously the data subject will be given transparency details about such further processing in order to provide consent. Properly formed consent also makes any further processing compatible with the purpose of obtaining; additionally evidence of consent has to be retained.
Also, if the data subject withdraws consent, it follows that the right to erasure is likely to apply; this extends (in theory) to other Third Parties relying on that consent. Withdrawing consent also engages an objection to the processing and, of course, it has to be as easy to withdraw consent as to give it.
Dealing with data subject consent is seen as “a right faff” by many controllers. That is why they will welcome the move from data subject consent to legitimate interests (as explained below) to justify their scientific R&D processing (especially as they don’t have to involve the data subject in that processing).
Controller’s exemption for Scientific R&D
Suppose a controller has collected several ordinary-content personal datasets and wants to further use them for scientific R&D purposes (this includes some AI R&D remember). Obviously an A.6 lawful basis is required for such further processing, the processing has to be transparent to data subjects and not incompatible with the purpose of obtaining.
These latter two requirements (transparency, compatibility) are negated by the exceptions in DUAB (see new Articles 8A(3), 13(1)(5) to (7)). For instance, any further processing purpose for scientific R&D is deemed to be compatible with the purpose of obtaining so long as the (dodgy) appropriate safeguards apply (see above).
Additionally, the exception added to A.13 states that there is no need to inform existing data subjects of the further purpose if the further processing of personal data is…
- “(i) for (and only for) the purposes of scientific….research…, and
- (ii) in accordance with A.84B, and [Remember A.84B(2) contains a requirement to apply all the appropriate safeguards in A.84C to the processing]
- (b) providing the transparency information is impossible or would involve disproportionate effort”.
“Disproportionate effort” depends on “among other things, the number of data subjects, the age of the personal data and any appropriate safeguards applied to the processing”.
Curiously, the definition of disproportionate effort that helps justify the use of the exception does not require all the appropriate safeguards to apply, unlike the requirement in A.84B(2). I suspect this wording is to make it easier to claim disproportionate effort and not trouble data subjects with a Privacy Notice.
“Legitimate interests” exposed
Any controller who takes advantage of this exception “has to take appropriate measures to protect the data subjects’ rights, freedoms and legitimate interests” and make information available to the public (e.g. a statement on the controller’s website will suffice). This general information replaces the detailed Privacy Notice given to each data subject.
The reference to “the data subjects’ rights, freedoms and legitimate interests” only makes sense if the relevant lawful basis, used by the controller to justify the scientific R&D processing, is legitimate interests (i.e. A.6(1)(f)).
If this lawful basis is used and the appropriate safeguards apply, the controller can additionally argue that (as well as the right to be informed) the right to object and the right to erasure are also significantly nullified (if not extinguished; see revised A.17(3)(d) and A.21(6)).
The argument for this nullification is that if the right to be informed is expressly negated by statute via the use of the “appropriate safeguards”, then it follows that these safeguards sufficiently protect the interests of data subjects to a degree that the right to erasure or object can also be nullified.
In this way, legitimate interests will develop, over time, as the preferred lawful basis as it effectively negates two data subjects’ rights.
Not an Article 23 exception
Note that this exception is being introduced directly into A.13/A.14 and not by the normal UK_GDPR procedures for new restrictions (or exemptions) found in A.23.
This rather nerdy point might be a factor in any Adequacy decision or in any future Judicial Review, as A.23 has specific protections for data subjects whenever new restrictions/exemptions from rights are introduced. For instance, A.23(1)(e) requires exemptions/restrictions to be “proportionate”, “necessary” and to protect an “important objective of general public interest”.
In practice, I think the inclusion of “private interests” in the definition of scientific research means the exception cannot be introduced by A.23(1)(e) which assumes exemptions are in the public interest, and not a private interest.
Note the implication that the exception in A.13 is also neither “proportionate” nor “necessary”.
Third Party’s exemption for Scientific R&D
The intended use of legitimate interests as the lawful basis permits disclosures of personal data by a controller to any Third Party for its scientific R&D purposes. From the disclosing controller’s perspective, such a disclosure would be in the legitimate interest of a Third Party.
As seen above, such a disclosure can be exempt from the A.13 right to be informed (and the right to object), assuming those [dodgy] appropriate safeguards apply. In other words, a controller can disclose existing databases of personal data to a Third Party in secret (as specified above) for its scientific R&D purpose, so long as these appropriate safeguards apply.
Remember that this further disclosure for a Third Party’s Scientific R&D is deemed to be compatible (New Article 8A(3)(b)) if the appropriate safeguards apply.
Any Third Party which receives the personal data from a controller is subject to the A.14 right to be informed. As we will see, it has a similar exception that ensures secrecy, so long as those appropriate safeguards apply.
In this case, the exception for a Third Party from providing transparency information applies if it “would involve disproportionate effort” or “is likely to render impossible or seriously impair the achievement of the objectives of the processing for which the personal data are intended”.
“Disproportionate effort” again is defined exactly as with A.13 and depends on “among other things, the number of data subjects, the age of the personal data and any appropriate safeguards applied to the processing”.
There is no indication of what triggers “serious impairment”, but the argument that “we have no need to tell data subjects because it will seriously impair the research” is a distinct possibility.
In the case of a Third Party relying on “serious impairment” or on “disproportionate effort”, that Third Party “must take appropriate measures to protect the data subject’s rights, freedoms and legitimate interests, including by making the information available publicly” (new A.14(7)).
This means the Third Party has, for example, to make public statements on its website. How can data subjects know to look at the Third Party website if they don’t know who the Third Party is? Answers on a postcard to the fairies, I suppose.
Note the reference in new A.14(7) to “data subject’s rights, freedoms and legitimate interests” confirms that legitimate interests in A.6(1)(f) is the lawful basis used to substantiate this processing by a Third Party.
Finally, that Third Party could become a controller processing pseudonymised personal data if it follows the provisions in new A.84 (as described above). It thus becomes possible for that controller to disclose the raw personal data to another Third Party, who becomes a controller who can then disclose to another Third Party.
All these disclosures are compatible and secret, so long as some dodgy appropriate safeguards apply and general text describing the scientific R&D processing is placed on a website.
In other words, personal data can be swapped around a host of Third Parties for their scientific R&D processing, and some of those Third Parties could be in the USA.
Guess what can go wrong? A Third Party in Trump-land collects these databases in order to re-identify data subjects. All the data subject can do is ask: “Please Mr. Trump, can I have my personal data back?”.
Transfers to Trump-land?
New Article 45A (in Schedule 7 of DUAB) permits the Secretary of State (S.o.S) to approve the transfer of personal data outside the UK via negative resolution procedures; this means that Parliament is generally excluded from considering the matter.
However, it is a mistake to see this power as being limited to defining a particular Third Country as adequate; the power can approve transfers to “a sector or geographic area within a third country” (e.g. Silicon Valley) or a specific “controller or processor” (e.g. Palantir) as being adequate (detail in Schedule 7, paragraph 4(c) of DUAB).
It can be seen that the power to designate any controller on the planet involved in scientific R&D as offering an adequate level of data protection, because some [dodgy] “appropriate safeguards” apply, adds to the concern that these DUAB provisions are not safe.
Additionally, if the exception in A.13/A.14 applies, then transfers to Third Countries will not be transparent to data subjects in a Privacy Notice.
Concluding commentary
The long list of concerns at the beginning of this blog shows how the government is using DUAB to sacrifice the protection of data subjects in order to pursue its economic agenda.
When you add in a laissez-faire regulator who does not use his powers, publicly supports the government’s economic agenda, and does not enforce the UK_GDPR to protect data subjects’ interests, one can see that the problems begin to mount.
This week, the UK decided (Tuesday 11 Feb) not to sign the artificial intelligence (AI) communique at a global AI summit in Paris. Signatories to the communique pledged an "open", "inclusive" and "ethical" approach to the technology's development.
The UK instead, afflicted by “economic AI FOMO” disease, preferred to cling onto President Trump’s coat-tails.
The fact that the UK decided to be out of step with the European Union will not go unnoticed by them, nor will the fact that the USA high tech industry has the ear of UK ministers. Issues such as Ministers seeking untrammelled powers over personal data (e.g. to make processing lawful, compatible, or that could deem Google or Facebook to be adequate) now appear to carry significant risk.
If the European Commission wanted to “do something”, one recourse would be to explore whether the UK’s unregulated data protection law, as modified by DUAB, would be inadequate in terms of the GDPR. After all, the UK’s Adequacy Agreement is due for renewal before 27 June 2025.
If the European Commission wants to take such a hard line, all it has to do in justification is look at the list of issues I have raised at the beginning of this blog.
Spring Data Protection Courses
Amberhawk is holding our first session on the changes to the UK’s data protection regime arising from the Data (Use and Access) Bill, by Zoom, on Tuesday March 25th 2025: (10.00am-4.30pm; £275+VAT).
The following BCS Practitioner or Foundation courses can be attended in person, or via Zoom, or as a mixture (i.e. part Zoom, part attendance just in case “stuff happens on the day”).
- Data Protection FOUNDATION Course: London on March 11-13 (Tuesday to Thursday, 3 days: 9.45am to 5.00pm).
- Data Protection PRACTITIONER Course: London on March 17 to March 21 (Monday to Friday: 9.30am to 5.30pm).
More details on the Amberhawk website: www.amberhawk.com or email info@amberhawk.com.