
Let’s face it: use of automated facial recognition technology by the police

The case of R (Bridges) v Chief Constable of South Wales Police & Information Commissioner [2020] EWCA Civ 1058 (handed down on 11 August 2020) was an appeal from what is said to have been the first claim brought before a court anywhere on planet earth concerning the use by police of automated facial recognition (“AFR”) technology. There could be nothing wrong with posting scores of police officers with eidetic memories to look out for up to 800 wanted persons at public gatherings. So why not use a powerful computer, capable of matching 50 faces a second with a database of (under) 800 suspects, to do this job much more cheaply and instantaneously, flagging any matches to a human operator for final assessment? According to the Court of Appeal in Bridges, this system constitutes an interference with Article 8 rights which is not such as is in accordance with the law, but which (critically) would be proportionate if a sufficiently narrow local policy were framed.

The system in use in South Wales will require a more rigorous local policy to be put in place, reducing the breadth of discretion available to individual officers as to who may be targeted for the use of AFR and (connectedly) where it may be deployed, as well as sufficient impact assessments pursuant to the Equality Act 2010 and the DPA 2018, and a sufficient DPA 2018 policy document.

The facts

South Wales Police (“SWP”) have been piloting the police use of AFR technology. The case concerned “AFR Locate”, which works as follows. SWP deploy it at large gatherings. When they do so, they use prominently marked vans, hand out leaflets and advertise on social media to alert the public to the use of AFR. Despite this, many who attend large-scale events are unaware that AFR Locate is in use.

Prior to the gathering, they upload and process a ‘watchlist’ of:

(a) persons wanted on warrants;

(b) individuals who are unlawfully at large;

(c) persons suspected of having committed crimes;

(d) missing persons;

(e) individuals whose presence at a particular event causes particular concern;

(f) persons simply of possible interest to SWP for intelligence purposes; and

(g) vulnerable persons.

Each face in the watchlist database (of about 400–800 people) is turned into numerical biometric data unique to that face – rather like the way phones that use ‘Face ID’ do it. This numerical representation is a ‘biometric template’.
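To make the ‘biometric template’ idea concrete, here is a minimal Python sketch. It is illustrative only: the judgment does not describe the real system’s model, template size or scoring method, so `embed_face`, the 128-dimension template and the unit-length normalisation below are all assumptions, not SWP’s actual algorithm.

```python
import numpy as np

TEMPLATE_DIM = 128  # assumed template size; the real figure is not public

def embed_face(image: np.ndarray) -> np.ndarray:
    """Stand-in for a trained face-recognition model: maps a face image
    to a fixed-length numeric 'biometric template'. Here the pixels are
    simply projected onto a fixed random basis so the example runs."""
    rng = np.random.default_rng(seed=0)              # same fixed basis on every call
    basis = rng.standard_normal((TEMPLATE_DIM, image.size))
    template = basis @ image.ravel()
    return template / np.linalg.norm(template)       # unit length, ready for cosine scoring

# Build the watchlist templates once, before the event.
rng = np.random.default_rng(seed=1)
watchlist_images = {f"suspect_{i}": rng.random((64, 64)) for i in range(800)}
watchlist = {name: embed_face(img) for name, img in watchlist_images.items()}
```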

Images from live CCTV at the event are processed by AFR Locate to recognise human faces (as many popular cameras do). It converts images of these faces into biometric templates and looks for a match with the templates on the watchlist. It can process up to 5 faces per frame and up to 10 frames per second, giving a scanning rate of up to 50 faces per second. This way, as many as half a million faces (obviously, some being the same people) can be scanned at a single event and compared with those on the watchlist.
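The matching step can be sketched in the same vein. Only the figures of 5 faces per frame and 10 frames per second come from the judgment; the 0.6 threshold and the matrix-multiply cosine scoring are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(seed=2)
TEMPLATE_DIM = 128
MATCH_THRESHOLD = 0.6        # illustrative; the real operating point is not public

# Pretend watchlist: 800 unit-length templates (cf. the previous sketch).
watchlist = rng.standard_normal((800, TEMPLATE_DIM))
watchlist /= np.linalg.norm(watchlist, axis=1, keepdims=True)

def scan_frame(face_templates: np.ndarray) -> list[tuple[int, int, float]]:
    """Score every face in one video frame against the whole watchlist in
    a single matrix multiply; return (face, suspect, score) triples that
    clear the threshold for the operator to review."""
    scores = face_templates @ watchlist.T            # shape: (faces, 800)
    hits = np.argwhere(scores >= MATCH_THRESHOLD)
    return [(int(f), int(s), float(scores[f, s])) for f, s in hits]

# 10 frames per second x 5 faces per frame = up to 50 faces per second.
for frame_no in range(10):
    faces = rng.standard_normal((5, TEMPLATE_DIM))   # stand-ins for detected faces
    faces /= np.linalg.norm(faces, axis=1, keepdims=True)
    for face_idx, suspect_idx, score in scan_frame(faces):
        print(f"frame {frame_no}: face {face_idx} ~ suspect {suspect_idx} ({score:.2f})")
```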

If AFR Locate makes a match, it will alert its operator and display the images for the operator to compare. This human touch is an important filter: no action is taken unless the operator is satisfied of the need for it, and only then is a decision made as to whether to approach the potentially matched person. It does not matter what the computer says if the human says “no”.
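The operator’s decisive role amounts to a simple rule: the machine may only raise an alert, never act. A sketch of that rule (the function and its return values are illustrative, not drawn from SWP’s actual procedures):

```python
def respond_to_alert(machine_match: bool, operator_confirms: bool) -> str:
    """Human-in-the-loop filter: however confident the computer is,
    nothing happens without the operator's independent assessment."""
    if not machine_match:
        return "no alert raised"
    if not operator_confirms:
        return "no action taken"    # the human 'no' overrides the machine
    return "officers may decide whether to approach the person"
```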

The system has proved successful and reliable: its use has led to arrests and to the identification of persons of considerable interest to the police (see principle 5 below).

Importantly, of the hundreds of thousands of faces AFR Locate processes, those not resulting in a match are immediately deleted. Matched facial images are deleted within 24 hours. The biometric template is deleted immediately, match or no match. The CCTV feed is deleted after 31 days, as are the details of those matched.
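These retention rules are essentially declarative and could be written down as a small table of time limits. A sketch (the category names are illustrative, not SWP’s actual schema; the periods are those described in the judgment):

```python
from datetime import timedelta

# Retention periods as described in the judgment.
RETENTION = {
    "unmatched_face_image": timedelta(0),         # deleted immediately
    "biometric_template":   timedelta(0),         # deleted immediately, match or not
    "matched_face_image":   timedelta(hours=24),
    "cctv_feed":            timedelta(days=31),
    "match_record":         timedelta(days=31),   # details of those matched
}

def must_delete(category: str, age: timedelta) -> bool:
    """True once an item of the given category has outlived its limit."""
    return age >= RETENTION[category]
```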

The Claimant’s claim

The Claimant argued that the use of AFR Locate at events which he had attended was unlawful. He contended that it unjustifiably interfered with his Article 8 ECHR right to respect for his private life. He also said that it constituted a breach of the Data Protection Act 1998 and the Data Protection Act 2018.

In relation to documentation and impact assessments, the Claimant also claimed that SWP had failed (contrary to s.35(5)(c) DPA 2018) to have in place a document satisfying s.42(2) (explaining SWP’s procedures for securing compliance with the data protection principles and regarding the retention and erasure of personal data processed, giving an indication of how long such personal data is likely to be retained). He complained that the force had failed to have due regard to the need (pursuant to s.149(1) of the Equality Act 2010) (i) to eliminate discrimination etc.; (ii) to advance equality of opportunity; and (iii) to foster good relations between persons with different protected characteristics.

The claim having failed at first instance, the Claimant appealed and succeeded in part (see below).

The decision of the Court of Appeal, in summary

The Court of Appeal held that the interference with his Art 8 right was not such as was in accordance with the law and that the Divisional Court had erred in holding otherwise. This was essentially because the local SWP policy (which – in conjunction with the DPA 2018 and the Surveillance Camera Code of Practice – was capable of amounting to a sufficient legal framework) contained two impermissibly wide areas of discretion: (1) the selection of those on watchlists, especially the “persons where intelligence is required” category (the who), and (2) the locations where AFR may be deployed (the where). This meant, in turn, that the data protection impact assessment prepared by SWP did not meet the requirements of s.64 DPA 2018, since it proceeded on the basis that Article 8 was not infringed.

The Court also held that SWP had not done all that they reasonably could to fulfil the Public Sector Equality Duty pursuant to s.149(1) of the Equality Act 2010. They expressed the hope that, as AFR is a novel and controversial technology, all police forces that intend to use it in the future would wish to satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias.

SWP has decided not to appeal – no doubt considering that, the key finding on proportionality having been upheld, they should be able lawfully to use AFR Locate in the near future, once they have ensured that their local policy documents are sufficiently rigorous and have undertaken sufficient DPIA and PSED impact assessments.

The principles arising from the Court’s conclusions

Read with those parts of the judgment of the Divisional Court which remain undisturbed, the applicable principles appear to be these:-

    1. Article 8 ECHR was triggered or engaged by the gathering, storage and sensitive processing of CCTV images (see s.35(3) DPA 2018);
    2. The Article 8 rights of anyone whose face was scanned, or was at risk of being scanned, were infringed;
    3. Using CCTV cameras linked to AFR Locate was not an intrusive method of obtaining information and so fell amply within the common law powers of the police – although it is not simply akin to taking a photograph, given the vast amount of processing taking place at the back end;
    4. The combination of the DPA 2018, the Secretary of State’s Surveillance Camera Code and SWP’s policy documents did not constitute a sufficiently foreseeable and accessible legal framework, so that the interference was not such as was in accordance with the law. This may be rectified by putting in place a more rigorous local SWP policy (or a national policy) which reduces the breadth of discretion available to individual officers as to who may be targeted for the use of AFR and (connectedly) where it may be deployed.
    5. Proportionality: the interference caused by the use of AFR Locate was justified because (i) it was used for a legitimate aim (conceded by the Claimant), (ii) that use was rationally connected to that aim (conceded), (iii) a less intrusive measure could not have been used without unacceptably compromising the aim and (iv) a fair balance had been struck between the rights of the individual and the interests of the community. Its use had been widely publicised and it had led to arrests and the identification of persons of considerable interest to the police. It did not disproportionately interfere with anyone’s rights. The watchlist was clearly targeted and the vast majority on it were wanted on warrants or for offences. This was a critical finding by the Divisional Court which was upheld by the Court of Appeal. It means that AFR Locate may lawfully be used in future, so long as the police first put in place an appropriate local or national policy, undertake adequate data protection and equality impact assessments and adopt a more detailed policy document which complies with s.42(2) DPA 2018, which is now in force (but was not at the time of the matters complained of by Mr. Bridges);
    6. The processing of the Claimant’s image by AFR Locate amounted to processing of his personal data for the purposes of the DPA 1998 because the information recorded “individuated” him: singling him out and distinguishing him from all others;
    7. The requirements of the first data protection principle were satisfied as the Claimant’s personal data were processed lawfully and fairly (given that they satisfied the Article 8.2 ECHR test for justification) and the condition in §6 of Sch 2 DPA 1998 was met (the processing being necessary for the purposes of legitimate interests pursued by the force). There was therefore no breach of s.4(4) DPA 1998;
    8. The use of AFR Locate involved the sensitive processing of the personal data of members of the public for the purposes of s.35 DPA 2018, whether or not they were on the watch list, because it entailed the processing of biometric data for the purpose of uniquely identifying an individual;
    9. The processing met the requirements of ss.35(5)(a) & (b) DPA 2018, because it was strictly necessary for law enforcement purposes and met the conditions in §1, Sch 8 DPA 2018 (the processing was necessary for the exercise of a function conferred on a person by a rule of law and was necessary for reasons of substantial public interest), and possibly other conditions not excluded by the Court;
    10. It is questionable whether SWP’s current policy document would meet the requirement of s.35(5)(c) that there be in place an appropriate policy document satisfying s.42(2) – this document must now be rendered compliant, given that (unlike at the time to which this case relates) the DPA 2018 is now in force;
    11. The impact assessment prepared by SWP did not meet the requirements of s.64 DPA 2018, although it contained (as the Divisional Court found) a clear narrative that explained the proposed processing, recognising that personal data of members of the public would be processed and identifying the safeguards that were in place in terms of the duration for which any such data would be retained and the purpose for which it would be used. This is because it proceeded on the basis that Article 8 was not infringed; and
    12. The equality impact assessment prepared by SWP did not demonstrate compliance with s.149(1) of the Equality Act 2010.

Implications of the case

All that the police have to do now in order to render AFR Locate lawful to use in principle is to:-

    1. Put in place a more focussed local (or national) policy sufficiently narrowing the breadth of discretion in relation to the selection of those on watchlists (especially the “persons where intelligence is required” category) and the locations where AFR may be deployed (the who and the where), so that the use of AFR Locate may be said to be such as is in accordance with the law;
    2. Undertake an adequate data protection impact assessment;
    3. Undertake an adequate equality impact assessment in compliance with s.149(1) of the Equality Act 2010; and
    4. Put in place a more detailed policy document which complies with s.42(2) DPA 2018.

Simple.

Comment

This case is a key test of technology of great interest to policing (and, no doubt, intelligence agencies), both in the UK and worldwide. The Court of Appeal gave a nod to the national security implications by their reference to the need to respect the well-established principle of “neither confirm nor deny”.

The temptation for overstretched police forces will be to seek to expand the use of AFR Locate and similar technologies. There are thought to be up to 6 million CCTV cameras in the UK. Many citizens would ask: why not put all of their feeds through AFR Locate in order to see if terrorists, very serious, dangerous or sexual offenders and missing persons can quickly and easily be located? The answer is that a mass surveillance exercise of that nature, however many dangerous criminals it led to the capture of (within reason), would be likely to be held to be disproportionate, because less intrusive – and far better targeted – measures could be used without unacceptably compromising the law enforcement aim, and because it would fail to strike a fair balance between the rights of the individual and the interests of the community. Would a more narrowly drawn exercise, putting every CCTV feed through AFR Locate but limiting the watchlist only to terrorists and the most dangerous offenders (the who issue) and limiting the period of deployment, pass muster?

Major retailers may be tempted to link AFR technology to their CCTV feeds so that, as you enter (or perhaps even walk past) the store, you are identified, welcomed with a text message telling you about their latest offers, and met by a sales assistant who specialises in the goods you usually buy.

It is not currently known whether Mr. Bridges will seek to appeal to the Supreme Court, SWP having decided not to. But there will be other challenges to other uses of AFR. The key challenge for the police will be the framing of the policy/policies limiting their use of AFR Locate. This is only the beginning.