
Let’s face it: use of automated facial recognition technology by the police

The case of R (Bridges) v Chief Constable of South Wales Police & Information Commissioner [2019] EWHC 2341 (Admin); [2020] 1 WLR 672 is said to have been the first claim brought before a court anywhere on planet earth concerning the use by police of automated facial recognition (“AFR”) technology. There could be nothing wrong with posting scores of police officers with eidetic memories to look out for up to 800 wanted persons at public gatherings. So why not use a powerful computer, capable of matching 50 faces a second against a database of (under) 800 suspects, to do this job much more cheaply and instantaneously, flagging any matches to a human operator for final assessment? According to the Divisional Court in Bridges, this may, depending on the facts of each particular deployment, be lawful.

The facts

South Wales Police (“SWP”) are piloting the police use of AFR technology. The case concerned a deployment known as “AFR Locate”, which SWP use at large gatherings. When they do so, they use prominently marked vans, hand out leaflets and advertise on social media to alert the public to the use of AFR. Despite this, many who attend large-scale events are unaware that AFR Locate is in use.

Prior to the gathering, they upload and process a ‘watchlist’ of:

(a) persons wanted on warrants;

(b) individuals who are unlawfully at large;

(c) persons suspected of having committed crimes;

(d) missing persons;

(e) individuals whose presence at a particular event causes particular concern;

(f) persons simply of possible interest to SWP for intelligence purposes; and

(g) vulnerable persons.

Each face in the watchlist database (of about 400-800 people) is turned into numerical biometric data unique to that face – rather like the way phones with ‘Face ID’ work. This is a ‘biometric template’.
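To make the idea concrete, here is a minimal sketch in Python of how a watchlist might be reduced to biometric templates. The real AFR Locate algorithm is proprietary and not described in the judgment; the embedding function, the template size and all names below are assumptions for illustration only.

```python
import numpy as np

EMBEDDING_SIZE = 128  # assumed; real template sizes are vendor-specific


def embed_face(face_pixels: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for the vendor's face-embedding model.

    Reduces a cropped face image to a fixed-length vector of numbers
    (the 'biometric template'). Faked here with a deterministic random
    projection purely so the sketch runs end to end.
    """
    rng = np.random.default_rng(int(face_pixels.sum()) % (2**32))
    template = rng.standard_normal(EMBEDDING_SIZE)
    return template / np.linalg.norm(template)  # unit length, ready to compare


# Build the watchlist: one template per custody image (about 400-800 entries).
custody_images = {f"suspect_{i}": np.random.randint(0, 256, size=(112, 112))
                  for i in range(500)}
watchlist = {pid: embed_face(img) for pid, img in custody_images.items()}
```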

Images from live CCTV at the event are processed by AFR Locate to detect human faces (as many popular cameras do). It converts the image of each detected face into a biometric template and looks for a match with the templates on the watchlist. It can process up to 5 faces per frame and up to 10 frames per second, giving a scanning rate of up to 50 faces per second. In this way, as many as half a million faces (some, obviously, being the same people) can be scanned at a single event and compared with those on the watchlist.

If AFR Locate makes a match, it alerts its operator and displays the two images for the operator to compare. This human touch is an important filter: no action is taken unless the operator is satisfied of the need for it, and only then is a decision made whether to approach the potentially matched person. It does not matter what the computer says if the human says “no”.
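Continuing the sketch above (and reusing its hypothetical embed_face and watchlist), the scan-match-review flow described in the last two paragraphs might look like this. The throughput figures come from the judgment; the similarity threshold is an assumption, since the real operating point is not public.

```python
MATCH_THRESHOLD = 0.6        # assumed; the real threshold is not public
MAX_FACES_PER_FRAME = 5      # per the judgment
FRAMES_PER_SECOND = 10       # per the judgment
SCAN_RATE = MAX_FACES_PER_FRAME * FRAMES_PER_SECOND  # up to 50 faces/second


def scan_frame(frame_faces: list, watchlist: dict) -> list:
    """Compare each face detected in one CCTV frame against the watchlist,
    returning possible matches for a human operator to review."""
    alerts = []
    for face in frame_faces[:MAX_FACES_PER_FRAME]:
        probe = embed_face(face)
        for person_id, template in watchlist.items():
            similarity = float(probe @ template)  # dot product of unit vectors
            if similarity >= MATCH_THRESHOLD:
                alerts.append((person_id, face, similarity))
        del probe  # the biometric template is deleted at once, match or no match
    return alerts


def operator_review(alerts: list) -> list:
    """The human filter: no action is taken unless the operator is satisfied.
    In the field this is a side-by-side comparison by a trained officer;
    here we simply ask on the console."""
    confirmed = []
    for person_id, face, similarity in alerts:
        answer = input(f"Possible match {person_id} (score {similarity:.2f}) - intervene? [y/n] ")
        if answer.strip().lower() == "y":
            confirmed.append(person_id)
    return confirmed
```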

On the evidence before the Court, the system has been successful and reliable.

Importantly, of the hundreds of thousands of faces AFR Locate processes, those not resulting in a match are deleted immediately. Matched facial images are deleted within 24 hours. The biometric template is deleted immediately, match or no match. The CCTV feed is deleted after 31 days, as are the details of those matched.
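The retention rules are simple enough to state as data. A sketch using the periods set out above (the record-type names are my own):

```python
from datetime import timedelta

# Retention periods as described in the judgment; the mapping is illustrative.
RETENTION = {
    "unmatched_face_image": timedelta(0),         # deleted immediately
    "biometric_template":   timedelta(0),         # deleted immediately, match or no match
    "matched_face_image":   timedelta(hours=24),  # deleted within 24 hours
    "cctv_feed":            timedelta(days=31),
    "match_details":        timedelta(days=31),
}


def must_be_deleted(record_type: str, age: timedelta) -> bool:
    """True once a record of the given type has outlived its retention period."""
    return age >= RETENTION[record_type]
```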

The Claimant’s claim

The Claimant argued that the use of AFR Locate at events which he had attended was unlawful. He contended that it unjustifiably interfered with his right to respect for his private life under Article 8 ECHR. He also said that it constituted a breach of the Data Protection Act 1998 (“DPA 1998”) and the Data Protection Act 2018 (“DPA 2018”).

The Claimant also claimed that SWP had failed, contrary to s.35(5)(c) DPA 2018, to have in place an appropriate policy document satisfying s.42(2) – one explaining SWP’s procedures for securing compliance with the data protection principles and its policies regarding the retention and erasure of personal data processed, giving an indication of how long such personal data is likely to be retained. He further complained that the force had failed to have due regard to the need, pursuant to s.149(1) of the Equality Act 2010, (i) to eliminate discrimination, (ii) to advance equality of opportunity and (iii) to foster good relations between persons with different protected characteristics.

The Court’s conclusions

The Court dismissed the claim, holding that:-

1. Article 8 ECHR was triggered or engaged by the gathering of CCTV images, their storage and their sensitive processing (see s.35(3) DPA 2018);

2. The Article 8 rights of anyone whose face was scanned, or who was at risk of being scanned, were thereby interfered with;

3. Using CCTV cameras linked to AFR Locate was not an intrusive method of obtaining information and so amply fell within the common law powers of the police;

4. The combination of the DPA 2018, the Secretary of State’s Surveillance Camera Code and SWP’s own policy documents constituted a sufficiently foreseeable and accessible legal framework, such that the interference was “in accordance with the law”;

5. The interference caused by the use of AFR Locate was justified: (i) it served a legitimate aim (conceded by the Claimant), (ii) its use was rationally connected to that aim (conceded), (iii) a less intrusive measure could not have been used without unacceptably compromising the aim and (iv) a fair balance had been struck between the rights of the individual and the interests of the community. Its use had been widely publicised and it had led to arrests and the identification of persons of considerable interest to the police. It did not disproportionately interfere with anyone’s rights. The watchlist was clearly targeted, and the vast majority of those on it were wanted on warrants or for offences;

6. The processing of the Claimant’s image by AFR Locate amounted to processing of his personal data for the purposes of the DPA 1998, because the information recorded “individuated” him: it singled him out and distinguished him from all others;

7. The requirements of the first data protection principle were satisfied, as the Claimant’s personal data were processed lawfully and fairly (given that they satisfied the Article 8(2) ECHR test for justification) and the condition in §6 of Sch 2 DPA 1998 was met (the processing being necessary for the purposes of legitimate interests pursued by the force). There was therefore no breach of s.4(4) DPA 1998;

8. The use of AFR Locate involved the sensitive processing of the personal data of members of the public for the purposes of s.35 DPA 2018, whether or not they were on the watchlist, because it entailed the processing of biometric data for the purpose of uniquely identifying an individual;

9. The processing met the requirements of ss.35(5)(a) & (b) DPA 2018, because it was strictly necessary for law enforcement purposes and met the conditions in §1, Sch 8 DPA 2018 (the processing was necessary for the exercise of a function conferred on a person by a rule of law and was necessary for reasons of substantial public interest), and possibly other conditions which the Court did not rule out;

10. It was questionable whether SWP’s current policy document met the requirement of s.35(5)(c) that there be in place an appropriate policy document satisfying s.42(2);

11. …But the Court did not consider it necessary or desirable to decide that issue, leaving it for reconsideration by the force in the light of further guidance from the Information Commissioner;

12. The impact assessment prepared by SWP met the requirements of s.64 DPA 2018, containing a clear narrative that explained the proposed processing, recognising that personal data of members of the public would be processed and identifying the safeguards that were in place in terms of the duration for which any such data would be retained and the purpose for which it would be used; and

13. The equality impact assessment prepared by SWP demonstrated compliance with s.149(1) of the Equality Act 2010.

Implications of the case

This case is a key test of technology of great interest to policing (and, no doubt, intelligence agencies), both in the UK and worldwide. The issues raised by it (as well, perhaps, as by other cases now arising) may well ultimately be considered by the Supreme Court.

The temptation for overstretched police forces will be to seek to expand the use of AFR Locate and similar technologies. There are thought to be up to 6 million CCTV cameras in the UK. Many citizens would ask: why not put all of their feeds through AFR Locate to see whether terrorists, very serious or dangerous offenders, sexual offenders and missing persons can be located quickly and easily? The answer is that a mass surveillance exercise of that nature, however many dangerous criminals it captured (within reason), would be likely to be held disproportionate, both because less intrusive – and far better targeted – measures could be used without unacceptably compromising the law enforcement aim and because it would fail to strike a fair balance between the rights of the individual and the interests of the community. Would a more narrowly drawn exercise – putting every CCTV feed through AFR Locate, but limiting the watchlist to terrorists and the most dangerous offenders and limiting the period of AFR use – pass muster?

Major retailers may be tempted to link AFR technology to their CCTV feeds so that, as you enter (or perhaps even walk past) the store, you are identified, welcomed with a text message telling you about their latest offers, and met by a sales assistant who specialises in the goods you usually buy.

The Court of Appeal will hear the claimant’s appeal on 23 June 2020, for 3 days. The parties are likely to view the Court of Appeal as a mere staging post on the way to the Supreme Court. This is only the beginning.