R (on the application of M) v Chief Constable of Sussex [2021] EWCA Civ 42 is an important decision from the Court of Appeal regarding an information sharing agreement (“ISA”) between a police force and a local business crime reduction partnership (“BCRP”). The ISA was held not to breach the Data Protection Act 2018 (“DPA”), and the sharing of information that revealed a vulnerability to child sexual exploitation (“CSE”) was held not to be in breach of data protection rights. The case indicates the approach that the courts may take when asked to scrutinise information sharing agreements and policy documents where the police seek to share data with other organisations for the purpose of reducing crime and disorder.
A group of police officers exchanged off-duty, sexist, degrading, racist, antisemitic, homophobic and disability-mocking WhatsApp group chat messages, as well as posting crime scene photographs from current investigations. No crime was committed. That’s a private matter, isn’t it? No. It isn’t. So held the Second Division of the Inner House of the Court of Session in BC v Chief Constable of the Police Service of Scotland (Livingstone) [2020] CSIH 61; 2020 SLT 1021 (the Lord Justice Clerk (Lady Dorrian) and Lords Menzies and Malcolm).
If concerns are raised that a person might be vulnerable to radicalisation, how long can a police force hold data about that person? This was the question facing the High Court in R (II) v Commissioner of Police for the Metropolis [2020] EWHC 2528 (Admin), which held that the police’s continued retention of data about a sixteen-year-old was contrary to the Data Protection Act 2018 and Article 8. In so finding, the court held that a force’s retention of data must be proportionate, that what is proportionate in any given situation is fact-specific, and that when the police cease to be able to identify a policing purpose for the continued retention of personal data, it should be deleted.
Pile v Chief Constable of Merseyside Police [2020] EWHC 2472 (QB) concerned what many might consider to be the tail end of just another good night out. The claimant got into a taxi on 22 April 2017, in an advanced state of intoxication, and the taxi driver rang 999 to report that she had started abusing him and ‘kicking off’. She vomited all over herself and over the back of the taxi. Officers responding to this unfortunate misunderstanding found her covered in vomit, including in her hair. They arrested her for the offence of being drunk and disorderly. At the police station, Ms Pile was flailing her arms with the intention of striking the officers accompanying her. She later accepted a £60 fixed penalty notice as an alternative to being prosecuted. For many, the story would have ended there…
The case of R (Bridges) v Chief Constable of South Wales Police & Information Commissioner [2020] EWCA Civ 1058 (handed down on 11 August 2020) was an appeal from what is said to have been the first claim brought before a court anywhere on planet earth concerning the use by police of automated facial recognition (“AFR”) technology. There could be nothing wrong with posting scores of police officers with eidetic memories to look out for up to 800 wanted persons at public gatherings. So why not use a powerful computer, capable of matching 50 faces a second against a database of (under) 800 suspects, to do this job much more cheaply and instantaneously, flagging any matches to a human operator for final assessment? According to the Court of Appeal in Bridges, this system constitutes an interference with Article 8 rights which is not “in accordance with the law”, but which (critically) would be proportionate if a sufficiently narrow local policy were framed.
The system in use in South Wales will require a more rigorous local policy to be put in place, one that reduces the breadth of discretion available to individual officers as to who may be targeted and (connectedly) where AFR may be deployed, as well as adequate impact assessments under the Equality Act 2010 and the DPA 2018, and a sufficient DPA 2018 policy document.