How embarrassing! The UK’s Equality and Human Rights Commission (EHRC) is joining a judicial review (JR) to ensure the Metropolitan Police’s current use of live facial recognition technology (FRT) complies with human rights law.
Ten days earlier, our laissez-faire ICO had issued the blandest of non-statements (13 August 2025) concerning police use of FRT: “When used by the police, FRT must be deployed in a way that respects people’s rights and freedoms, with appropriate safeguards in place”.
What these appropriate safeguards look like should become central to the legal arguments now expected in January (see references); that is why this JR will be very important.
To add to the embarrassment, in July, the European Commission published a rose-tinted (but completely erroneous) draft Adequacy Agreement (see last two blogs); this determined the UK’s data protection and human rights regime to be “adequate”.
Yet, within four weeks, the UK’s human rights regulator joined an action which, at its heart, claims that the largest UK police force is systematically infringing the human rights of the millions daily traversing the streets of London. So much for high quality analysis from the Commission!
In summary, the EHRC has concerns that the Metropolitan Police’s current use of FRT breaches Article 8 (right to respect for private life), Article 10 (freedom of expression, including the right to impart information) and Article 11 (freedom of assembly and association).
As an aside, it is important to realise that the Data (Use and Access) Act 2025 puts the ICO in an impossible position. He has to “promote public trust and confidence in the processing of personal data” whilst at the same time having regard “to the importance of the prevention, investigation, detection and prosecution of criminal offences” and “the need to safeguard public security…”.
In other words, when it comes to data protection issues associated with FRT, the ICO is fettered by having to address conflicting priorities. Put simply, the EHRC’s action has exposed a major flaw in the UK’s data protection enforcement regime.
Some FRT data protection issues
The Met’s DPIA (see references) states that there is no processing by its FRT which is subject to the UK_GDPR; all its FRT processing therefore falls under the law enforcement regime in Part 3 of the DPA2018. This means every data subject on the FRT watchlist has to be of interest to the police for one of the law enforcement purposes.
These law enforcement purposes are specified in S.31 of the DPA2018. They are: “the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including **the safeguarding against and the prevention of threats to public security**” (my emphasis).
It can be seen that the latter emboldened text is the most likely to create a chilling effect on the rights protected by the Articles identified by the EHRC above. For instance, could someone choose not to go on a demonstration because they think that their presence on a particular march would be recorded by the authorities via FRT, and subsequently used to their detriment?
The Watchlist
In data protection terms, any use of biometric data in connection with any of the law enforcement purposes is defined to be “sensitive processing”; such processing is subject to a “strict necessity” test (S.35(5) of the DPA2018). It follows that any FRT watchlist should only store the biometrics of data subjects whose personal data would pass this “strict necessity” threshold.
The DPIA states that such a “Watchlist is constructed using images of Sought Persons which are already held by the MPS” (para 3.4, table at row 1). Although there have been press reports that the Government intends to make the DVLA and Passport photographic biometric databases available for general FRT deployment by the police, such access does not appear to have happened at the Met (yet).
Despite this statement, another party to the JR (Privacy International) has just posted on its website that “The Government has secretly allowed police forces to search over 150 million UK passport and immigration database photos with ‘Orwellian’ facial recognition technology for 6 years”.
If indeed the police do eventually gain access to these immense biometric databases for FRT-related processing, it means everyone who has a Passport or Driving Licence has the potential to be on a Watchlist. So the construction of a Watchlist is all-important.
The DPIA associated with the Met’s use of FRT does not explain how data subjects qualify to be on the Watchlist as a “Sought Person”, nor does it describe the categories of Sought Persons. Indeed, the DPIA (para 3.8) suggests the Watchlist is downloaded just prior to deployment of the FRT cameras.
This could be a step that ensures that only an accurate and up-to-date Watchlist is uploaded. Alternatively, it could imply that the Watchlist deployed for, say, the Notting Hill Carnival could be a different Watchlist from that deployed at a “Stop The Genocide” demonstration.
Note that the longer the Watchlist, the greater the chance that a misidentification error will occur. One also suspects that the Met’s Watchlist will have entries from neighbouring police forces, but this is not confirmed in the DPIA.
What is identified in paragraph 3.15 of the DPIA is that a Watchlist could have 10,000 photographs. Given the inevitable advance of technology, it is easy to imagine each FRT deployment involving a UK-wide Watchlist that could easily exceed 100,000 entries.
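To illustrate why the length of the Watchlist matters, here is a minimal sketch under a simple independence assumption (the per-comparison figure is my own, chosen purely so that the 10,000-entry case lines up with the DPIA’s quoted 1-in-6,000 rate; the Met publishes no such figure):

```python
# A toy model (not the Met's published statistics): assume each comparison
# against a watchlist entry has an independent false-match probability f.
# Then the chance that one scanned face raises at least one false alert is
# 1 - (1 - f)^N, which grows with the watchlist size N.
def false_alert_rate(f: float, watchlist_size: int) -> float:
    return 1 - (1 - f) ** watchlist_size

# Illustrative per-comparison rate, chosen only so that a 10,000-entry
# watchlist reproduces roughly the DPIA's 1-in-6,000 per-scan figure.
F = 1.67e-8

for n in (10_000, 100_000):
    rate = false_alert_rate(F, n)
    print(f"Watchlist of {n:>7,}: ~1 false alert per {1 / rate:,.0f} faces scanned")
```

On this (assumed) model, a tenfold longer Watchlist produces roughly ten times the false alerts per face scanned.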
Alerts
The DPIA states that if there is a match, an alert is sent to an officer viewing a screen, who then decides whether or not to intervene with the Sought Person. If there is no alert (i.e. no biometric match), the captured image appears blurred on any FRT display monitor and is then immediately deleted.
A decision not to intervene in the case of an alert could be because the match is false (i.e. it is recognised that the person caught on camera and subject to an alert has similar facial details to a Sought Person but is, on inspection, not the Sought Person).
But such non-intervention could also indicate an intelligence gathering exercise (e.g. this Sought Person was at this specific location at that specific time). In such cases, those around this Sought Person could also become known to the police, as they will be recorded in the normal CCTV footage collected at the date, time and location specified by the alert.
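Schematically, the alert flow described in the DPIA can be summarised as follows (my reading, with illustrative names; this is not the Met’s actual software):

```python
# Schematic of the DPIA's described alert flow (illustrative names only).
def process_capture(face_template: str, watchlist: set[str]) -> str:
    if face_template in watchlist:
        # A match sends an alert to an officer, who decides whether to
        # intervene. Non-intervention may mean the match was judged false
        # on inspection, or (the concern raised above) that the sighting
        # itself is the intelligence being gathered.
        return "alert: officer reviews and decides whether to intervene"
    # No biometric match: the image is blurred on the monitor and
    # immediately deleted.
    return "no match: image blurred and immediately deleted"
```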
Surveillance and intelligence
This prospect raises the suspicion that FRT deployment could become an intelligence gathering tool as well as a means of apprehending Sought Persons.
For instance, I wonder what a Home Secretary (in the next Reform Government?), who has implemented “an extreme hostile environment policy” towards immigration, would do with FRT? Remember any Reform Home Secretary, who follows in Theresa May’s footsteps, has overall responsibility for policing in England and Wales, including strategic oversight of the Metropolitan Police.
Indeed, the history of CCTV deployment by the police includes cases of mass surveillance for intelligence gathering purposes. For instance, “Project Champion” involved the use of CCTV cameras in parts of Birmingham in 2010 for intelligence purposes (see references for detail).
That is why the EHRC is correct to stress that the procedures surrounding FRT deployment are so important. Information about these procedures reassures the public that FRT does not stray into areas that have a chilling effect as described by the EHRC. From the police’s perspective, such information also makes the cost-benefit case for FRT deployment for law enforcement purposes.
The use of ANPR
There is an overlap between the data protection and human rights issues raised by the deployment of Automatic Number Plate Recognition (ANPR) and FRT. Hopefully, the Court’s conclusions and considerations can be applied to the use of ANPR cameras by the police.
The retention of personal data comprising vehicle registration plates captured by ANPR varies depending on whether a plate relates to a “vehicle of interest”. Did you know, for example, that details of all vehicle registration plates passing each ANPR camera are retained locally for 90 days and centrally for a year, and perhaps longer?
Although ANPR images neither involve sensitive processing nor contain biometrics, they still have to be retained in a way that is necessary (rather than strictly necessary) for a law enforcement purpose.
One assumes such long retention periods serve law enforcement purposes, but this is not at all clear from the DPIA associated with ANPR (see references). The current ICO has made no public statement specifically on the use of ANPR by the police.
In summary, the police’s ANPR policy was widely called “Denying criminals the use of the roads”. The current use of FRT can be seen as an extension of this policy: “Denying criminals the use of the pavement”.
The Judicial Review of FRT
In the 2020 case R (Bridges) v Chief Constable of South Wales Police (see references), the Court of Appeal found that the use of FRT at the time was unlawful, breaching Article 8 privacy rights and the public sector equality duty in the Equality Act 2010.
Since the Bridges ruling, FRT has been deployed more frequently, integrated into CCTV networks, and used with larger watchlists; thousands of faces are often scanned per deployment. The EHRC’s submission states that these developments mean the use of the technology poses a threat to human rights.
The proposed JR claim (see references) therefore raises issues of significant public importance. In its intervention, the EHRC has been given permission to provide submissions and evidence on the intrusive nature of FRT which focus on the way in which the technology has been used in recent years by the police.
The EHRC’s submission stresses that FRT can be intrusive, especially when used on a large scale, and warns that its use at protests could have a “chilling effect” on individuals’ rights under Articles 10 and 11. These rights, the EHRC claims, are fundamental to a democratic society.
Data also shows that the number of black men triggering an ‘alert’ is disproportionately high when compared with their share of London’s population.
Statistical point
Additionally, the EHRC notes that the accuracy of the technology is paramount and that even low error rates can translate into significant numbers of false identifications when large watchlists are used.
The EHRC’s point can be illustrated as follows. Suppose FRT operates at the DPIA’s error rate (paragraph 3.15) of one false identification in every six thousand faces scanned (roughly 0.017%). Suppose further that FRT is deployed to monitor the 72,000 football fans going to watch Barnsley, using a watchlist of the 10,000 entries mentioned in the DPIA.
One can expect around 12 false identifications (72,000 ÷ 6,000), each of which could lead to a false arrest. Scale this up by a factor of ten: if the watchlist is 100,000 and the population scanned is 720,000, the same error rate implies around 120 false identifications, and the larger watchlist would be expected to push the error rate higher still.
A UK-wide deployment of FRT on a daily basis (which is the Home Secretary’s objective) could involve a scale factor of a hundred or more, with the obvious consequence of thousands of false alerts and arrests.
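This arithmetic can be checked in a few lines (a sketch that assumes the DPIA’s rate applies per face scanned and stays constant; as noted above, a longer watchlist would be expected to push the rate higher):

```python
# Expected false identifications at the DPIA's quoted rate of one per
# 6,000 faces scanned (para 3.15), assumed constant across deployments.
ERROR_RATE = 1 / 6_000

for faces_scanned in (72_000, 720_000, 7_200_000):
    expected = faces_scanned * ERROR_RATE
    print(f"{faces_scanned:>9,} faces scanned -> ~{expected:,.0f} expected false identifications")
```

The output (12, 120 and 1,200 respectively) shows how quickly even a very low error rate accumulates at national scale.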
Concluding comments
John Kirkpatrick, Chief Executive of the EHRC, said in his press release:
“….There must be clear rules which guarantee that live facial recognition technology is used only where necessary, proportionate and constrained by appropriate safeguards. We believe that the Metropolitan Police’s current policy falls short of this standard. The Met, and other forces using this technology, need to ensure they deploy it in ways which are consistent with the law and with human rights.”
My own view is that the above statement from the head of the EHRC is precisely the kind of statement that should have been made by the ICO.
In addition, the ICO has stressed the need for FRT deployment “with appropriate safeguards in place”. If he joined the Judicial Review process as an interested party, he could get judicial approval for these much-vaunted safeguards (which nobody has seen) and have them expanded to cover police use of ANPR (which is badly needed).
Instead, the ICO sits on the fence whilst others determine whether or not current FRT processing by the Met police is “strictly necessary” for its law enforcement functions. The Home Secretary, for her part, has promised a Code of Practice which will contain an inevitable bias in favour of the deployment of FRT.
By “doing nowt”, the ICO risks putting his office into serious disrepute.
Autumn Data Protection Courses
Amberhawk is holding a workshop on the changes to the UK’s data protection regime arising from the Data (Use and Access) Act 2025, by Zoom, on Wednesday 24 September (10.00am-4.45pm; £275+VAT).
The following courses, which follow the BCS Practitioner or Foundation syllabi on the DPA2018/UK_GDPR, can be attended in person, via Zoom, or as a mixture (i.e. part Zoom, part attendance, just in case “stuff happens on the day”).
- Data Protection PRACTITIONER Course: London on September 15-19 (Monday to Friday, 5 days: 10.30am to 5.30pm) and on November 24-28.
- Data Protection FOUNDATION Course: London on October 7-9 (Tuesday to Thursday, 3 days: 10.00am to 5.00pm).
More details on the Amberhawk website: www.amberhawk.com or email info@amberhawk.com
References
EHRC press release: https://www.equalityhumanrights.com/met-polices-use-facial-recognition-tech-must-comply-human-rights-law-says-regulator
Published DPIAs covering the Met’s use of FRT: https://www.met.police.uk/police-forces/metropolitan-police/areas/about-us/about-the-met/facial-recognition-technology/
The JR case, R (Thompson and Carlo) v The Commissioner of Police of the Metropolis, is scheduled to be heard in January 2026. Press release: https://bigbrotherwatch.org.uk/press-releases/met-police-face-major-legal-challenge-over-use-of-live-facial-recognition-technology/
Edward Bridges and the Chief Constable of South Wales Police: Neutral Citation Number: [2020] EWCA Civ 1058
Blog covering the South Wales case: https://amberhawk.typepad.com/amberhawk/2023/04/facial-recognition-cctv-excluded-from-new-data-protection-law-by-definition-of-personal-data.html
Details of Project Champion (secret CCTV surveillance in Birmingham ethnic areas on the grounds of terrorism): https://www.bbc.co.uk/news/uk-england-birmingham-13331161