High Court dismisses challenge regarding Automated Facial Recognition
On 4 September 2019, the High Court of England and Wales dismissed a challenge to the police’s use of Automated Facial Recognition Technology (“AFR”). The court ruled that the use of AFR was proportionate and necessary for the police to meet their legal obligations.
The pilot project, AFR Locate, was deployed at certain events and in public places where the commission of crimes was considered likely. The system can detect up to 50 faces per second; the detected faces are then compared, by biometric data analysis, with wanted persons registered in police databases. If no match is found, the images are deleted immediately and automatically.
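To make the described workflow concrete, the following is a minimal, purely illustrative Python sketch of such a detect, compare and delete loop. Every name in it (Face, similarity, process_frame, the watchlist structure and the matching threshold) is a hypothetical assumption made for illustration; the ruling does not disclose how AFR Locate is actually implemented.

```python
# Illustrative sketch only: all names and the matching logic below are
# hypothetical assumptions, not the actual AFR Locate implementation.
from __future__ import annotations

from dataclasses import dataclass


@dataclass(frozen=True)
class Face:
    image: bytes                 # cropped face image taken from a video frame
    template: tuple[float, ...]  # biometric feature vector derived from it


def similarity(a: tuple[float, ...], b: tuple[float, ...]) -> float:
    """Toy similarity score between two biometric templates (dot product)."""
    return sum(x * y for x, y in zip(a, b))


def process_frame(faces: list[Face],
                  watchlist: dict[str, tuple[float, ...]],
                  threshold: float = 0.9) -> list[str]:
    """Compare each detected face against the watchlist of wanted persons.

    Matches are reported for human review; faces that match no entry are
    simply not retained, mirroring the immediate, automatic deletion of
    non-matching images described in the ruling.
    """
    alerts: list[str] = []
    for face in faces:
        for person_id, reference in watchlist.items():
            if similarity(face.template, reference) >= threshold:
                alerts.append(person_id)
                break
        # Whether matched or not, the image itself is not stored; only the
        # identifier of a matched watchlist entry is kept.
    return alerts


# Example with one hypothetical watchlist entry and two detected faces;
# only the first face is similar enough to trigger an alert.
watchlist = {"person-001": (1.0, 0.0)}
faces = [Face(b"img-a", (0.99, 0.01)), Face(b"img-b", (0.0, 1.0))]
print(process_frame(faces, watchlist))  # -> ['person-001']
```

The key design point the sketch tries to capture is that non-matching biometric data exists only transiently in memory, which is precisely the temporary storage the Court nevertheless held sufficient to engage Article 8 of the ECHR.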
An individual initiated judicial review proceedings after he was likely captured by AFR Locate despite not being identified as a wanted person. He considered this to be unlawful, in particular as a violation of the right to respect for private and family life under Article 8 of the European Convention on Human Rights (“ECHR”) and of data protection law in the United Kingdom. In his view, the police had not complied with the data protection principles, in particular Sec. 35 of the Data Protection Act 2018 (“DPA 2018”), which requires the processing of personal data for law enforcement purposes to be lawful and fair. He also argued that the police had failed to carry out an adequate data protection impact assessment (“DPIA”).
The Court held that the use of AFR engaged a person’s rights under Article 8 of the ECHR, as biometric data of this kind is inherently private in character. Even though the images were erased immediately, the procedure constituted an interference with Article 8 of the ECHR, since it suffices that the data is stored temporarily.
Nevertheless, the Court found that the police’s actions were in accordance with the law, as they fell within the police’s public law powers to prevent and detect criminal offences. The Court also found that the use of the AFR system was proportionate and that the technology was used openly, transparently and with considerable public engagement, thus satisfying the applicable criteria. It was deployed only for a limited period and for a specific purpose, and its use was announced in advance (e.g. on Facebook and Twitter).
With regard to data protection law, the Court considered that the captured images of individuals constituted personal data even where they did not match the watchlists of wanted persons, because the technology had singled those individuals out and distinguished them from others. Nevertheless, the Court held that there was no violation of the data protection principles, for the same reasons for which it had rejected a violation of Article 8 of the ECHR. The Court found that the processing satisfied the conditions of lawfulness and fairness and was necessary for the legitimate interest of the police in the prevention and detection of criminal offences, as required by their public service obligations. The requirement of Sec. 35 (5) DPA 2018 that the processing be strictly necessary was fulfilled, as was the requirement that the processing be necessary for the exercise of the functions of the police.
The last requirement under Sec. 35 (5) of the DPA 2018 is that an appropriate policy document be in place to govern the processing. The Court considered the relevant policy document in this case to be short and incomplete. Nevertheless, it declined to rule on whether the document was adequate, leaving that assessment to the Information Commissioner’s Office (“ICO”), which was due to publish more detailed guidance.
Finally, the Court found that the data protection impact assessment carried out by the police was sufficient to meet the requirements of Sec. 64 of the DPA 2018.
The ICO stated that it would take the High Court’s ruling into account when finalising its recommendations and guidance on the use of live facial recognition systems.