Facial Recognition Technology: High Court gives judgment
12 September 2019
The High Court has dismissed an application for judicial review regarding the use of Automated Facial Recognition Technology (AFR) and its implications for privacy rights and data protection.
Haddon-Cave LJ and Swift J decided that the current legal regime is adequate to ensure the appropriate and non-arbitrary use of AFR in a free and civilised society. The Court also held that South Wales Police’s (SWP) use of AFR to date has been consistent with the requirements of the Human Rights Act 1998 (HRA) and data protection legislation.
Nonetheless, periodic review is likely to be necessary. This was the first time any court in the world had considered AFR. This article analyses the judgment and explores possible avenues for appeal.
Live AFR captures the facial biometrics of people passing within range of surveillance cameras and compares this data to the facial biometrics of people on police watchlists. It is described in the judgment as a
new and powerful technology which has great potential to be put to use for the prevention and detection of crime. 
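For readers unfamiliar with how such matching operates, the core of a live facial-recognition pipeline can be sketched in a few lines. The sketch below is purely illustrative — the similarity measure, threshold and data are hypothetical and say nothing about SWP’s actual system — but it captures the two steps the judgment describes: comparing a captured biometric template against a watchlist, and treating anything below the match threshold as "no match" (at which point, in a system like AFR Locate, the captured data would be discarded).

```python
import math

def cosine_similarity(a, b):
    # Similarity between two biometric templates (vectors of numbers).
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def match_against_watchlist(embedding, watchlist, threshold=0.8):
    """Return the name of the best watchlist match above the threshold,
    or None if no entry is similar enough (hypothetical threshold)."""
    best_name, best_score = None, threshold
    for name, template in watchlist.items():
        score = cosine_similarity(embedding, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Entirely hypothetical data: real templates would come from a
# face-recognition model processing camera frames.
watchlist = {"suspect_a": [0.9, 0.1, 0.4]}
passerby = [0.1, 0.95, 0.2]
print(match_against_watchlist(passerby, watchlist))
```

A member of the public whose template matches no watchlist entry yields `None`; it is the immediate discarding of their data at that point that the Court later relied on in its proportionality analysis.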
The claimant, a civil liberties campaigner, challenged SWP’s use of AFR on three main grounds:
- The SWP’s use of AFR was contrary to the requirements of ECHR Article 8;
- The SWP breached the Claimant’s data protection rights under both the Data Protection Act 1998 (DPA 1998) and Data Protection Act 2018 (DPA 2018);
- The SWP failed to comply with the Public Sector Equality Duty under section 149 of the Equality Act 2010.
Ground 1: Article 8
The Court accepted that AFR technology engages the Article 8 rights of anyone whose face is scanned (or is at risk of being scanned). The extensive jurisprudence of both the UK Supreme Court and the ECtHR emphasises the tension between the use of invasive scientific techniques in the criminal justice system and private-life interests. The case of S v. United Kingdom (2009) 48 EHRR 50 concerned the retention of biometric information in the form of fingerprint records and DNA samples. There the ECtHR observed that “the mere storing of data relating to private life of an individual amounts to an interference within the meaning of Article 8.” The High Court held that, like fingerprints and DNA, AFR technology enables the extraction of information of an “intrinsically private” character.
The subsequent question was whether the use of AFR Locate by the SWP is “in accordance with the law” for the purposes of Article 8(2). The Claimant’s submission in this regard was that SWP does not, as a matter of law, have the power to deploy AFR and that even if the SWP’s use of AFR is not ultra vires, any interference with Article 8(1) rights is not subject to a sufficient legal framework such that it is capable of being justified under Article 8(2).
In support of this latter argument, the Claimant cited the Police and Criminal Evidence Act 1984, which regulates the collection and use of fingerprints and DNA samples. He argued that absent comparable provision for AFR technology, its use is not “in accordance with the law”. Parallel to this, it was claimed that both the DPA 1998 and DPA 2018 are insufficient for the regulation of AFR.
The Court held that the lack of a specific statutory basis for the use of AFR Locate did not render the technology ultra vires. The police’s common law powers, on which the SWP and the Home Secretary relied, are sufficient authority for use of this equipment. In R (Catt) v Association of Chief Police Officers [2015] AC 1065, which considered the lawfulness of collecting and retaining personal information, Lord Sumption JSC held: “At common law the police have the power to obtain and store information for policing purposes, i.e. broadly speaking for the maintenance of public order and the prevention and detection of crime.”
The Court said this broad construal of the police’s common law powers meant the only issue is whether the use of AFR constitutes what Lord Sumption in Catt termed an “intrusive method” and therefore falls outwith the common law powers of the police. The Court said an “intrusive method” was a clear reference to physical intrusion. For this reason, “the [use of AFR] is no more intrusive than the use of CCTV in the streets.”
The Claimant’s second submission was that the legal framework governing the use of AFR lacked the necessary qualities of lawfulness that Lord Bingham originally articulated in R (Gillan) v Commissioner of Police of the Metropolis [2006] 2 AC 307: foreseeability and predictability.
In S v United Kingdom, the Grand Chamber concluded that the legality of arrangements for the retention and use of fingerprints and DNA required “detailed rules governing the scope and application of measures” to provide sufficient guarantees against the risk of abuse and arbitrariness. Moreover, the ECtHR held: “The need for such safeguards is all the greater where the protection of personal data undergoing automatic processing is concerned, not least when such data are used for police purposes.”
Nonetheless, the High Court was satisfied that there is a clear and sufficient legal framework governing whether, when and how the system AFR Locate may be used. This framework is composed of three legally enforceable layers in addition to the restraints of the common law.
First, there is primary legislation in the form of the DPA 2018. Second, there are secondary legislative instruments, namely the Surveillance Camera Code of Practice pursuant to section 30 of the Protection of Freedoms Act 2012. Third, there are the SWP’s own local policies. The cumulative effect of these elements renders the use of AFR by SWP sufficiently foreseeable and accessible to satisfy the “in accordance with the law” requirement of Article 8(2).
The most important element is the DPA 2018. As Lord Sumption explained in Catt, the DPA 2018 embeds key safeguards which apply to all processing of all personal data – including the biometric data processed when AFR Locate is used. Part 3 of the DPA 2018 applies to processing for law enforcement purposes. According to section 34(3) of the DPA 2018, SWP, as data controller, “must be able to demonstrate its compliance with” the six data protection principles and the two safeguarding measures set out at sections 35–42 of the Act. The principles apply to all operations which involve retention or use of personal data, thereby encompassing AFR.
The High Court referred to Catt in holding that it was not legally important that these are principles of general application. Moreover, section 35(3) of the DPA 2018 sets out specific conditions that must be met for “sensitive processing”, which includes “processing … of biometric data for the purposes of uniquely identifying an individual”. These requirements arising under the DPA 2018 are mirrored in the Code of Practice on the Management of Police Information, issued by the College of Policing under section 39A of the Police Act 1996. While the framework is not presently deficient, the Court held that the future development of AFR technology is likely to require periodic re-evaluation of the sufficiency of the legal regime.
The Court finally asked whether SWP’s use of AFR Locate satisfied the four-stage proportionality test in Bank Mellat v Her Majesty’s Treasury (No 2) [2014] AC 700. First, the use of AFR pursues a legitimate security aim. Second, SWP’s use of AFR is rationally connected to that aim. Third, a less intrusive method could not have been used: CCTV could not have identified whether those at the event were on watchlists, and the Court characterised SWP’s deployments as “clearly targeted”. Fourth, the use of AFR Locate strikes a fair balance between the rights of the individual and the interests of the community and was not disproportionate. This is because the interference with the Claimant’s Article 8 rights “would be limited to the near instantaneous algorithmic processing and discarding of the Claimant’s biometric data.”
The High Court’s treatment of the meaning of “intrusion” in the context of ECHR Article 8 is worthy of re-consideration. Following Catt, the Court interpreted an “intrusive method” of obtaining information in purely physical terms. It followed that because AFR did not compromise an individual’s “bodily integrity”, its level of “intrusion” was comparable to that of CCTV in the streets. This fails to grasp how the instantaneous capture of biometric information extracts more private information about a person than a video image. Knowledge of this extraction could affect the “psychological integrity” of an individual, possibly infringing their right to privacy in the absence of manifest consent.
This relates to proportionality. If “intrusion” is interpreted on a physical-psychological continuum, proportionate use of AFR should reflect that the power of “sensitive processing” ought not to be deployed indiscriminately against ordinary members of the public. This reformed understanding might influence the Court’s approach to the proportionality test.
The conceptual novelty of AFR, moreover, should inform our evaluation of “accessibility” as a criterion of the rule of law. As the ECtHR held in Big Brother Watch v UK when considering the Regulation of Investigatory Powers Act 2000:
the domestic law must be sufficiently clear to give citizens an adequate indication as to the circumstances in which and the conditions on which public authorities are empowered to resort to any such measures.
Applying this principle, it is arguable that AFR – being a novel technology – should have a specific statutory basis.
Ground 2: Data Protection
Claim under DPA 1998
The first claim concerned the obligation at s. 4(4) of the DPA 1998 on data controllers “to comply with the data protection principles in relation to all personal data with respect to which he is the data controller”. The first of those principles, set out in Part 1 of Schedule 1 to the DPA 1998, mandates “personal data shall be processed fairly and lawfully”.
The primary point of dispute was whether using AFR Locate entails the processing of “personal data” within the meaning of Section 1 of the DPA 1998. The Act defines “personal data” as “data which relates to a living individual who can be identified”. The Court said there were two methods of personal identification to consider: (a) indirect identification and (b) individuation.
The CJEU adopted an expansive approach to indirect identification in Breyer v Bundesrepublik Deutschland (Case C-582/14), which concerned whether dynamic IP addresses were personal data within the definition in the 1995 Directive. The only instances excluded were those where the risk of identification “appears in reality to be insignificant”. In the context of AFR, this route was deemed “artificial and unnecessary” because the identification mechanism at issue in Breyer did not resemble the operation of AFR Locate.
In Vidal-Hall v Google Inc. [2016] QB 1003, in the context of an application for permission to serve proceedings out of the jurisdiction, the Court of Appeal had to consider whether it was arguable that browser generated information (i.e. information about the claimants’ internet usage) was personal data. There the Court held that
identification for the purposes of data protection is about data that “individuates” the individual, in the sense that they are singled out and distinguished from all others.
Anonymity did not preclude individuation. Applying this principle, the High Court said the information provided by AFR Locate did amount to “personal data”. However, because AFR was held to be compatible with ECHR Article 8, its use satisfied the conditions of lawfulness and fairness stipulated in the DPA 1998.
Claim under section 34 DPA 2018
The second claim concerned whether the use of AFR Locate complies with the first data protection principle for the purposes of “law enforcement” under section 35(3) DPA 2018.
The Claimant contended, first, that AFR Locate entails the “sensitive processing” of personal data as described in section 35(8) of the DPA 2018, and secondly that AFR Locate does not meet the requirements of section 35(5), which regulate the use of sensitive processing.
The first issue to address is the scope of sensitive processing where AFR Locate is used: does it entail processing biometric data of members of the public “for the purpose of uniquely identifying an individual”?
The Court rejected SWP’s submission that only the personal data of those on watchlists had been sensitively processed. Section 35(8)(b) of the DPA 2018 can properly be read as applying both to the biometric data of those on the watchlist and to the biometric data of members of the public. Indeed, comparisons can only be made if each person is uniquely identified. This conclusion is supported by the inclusion of biometric data within the General Data Protection Regulation (Regulation (EU) 2016/679) and the Data Protection Law Enforcement Directive (Directive (EU) 2016/680), which are measures the DPA 2018 seeks to implement.
The second issue was whether this sensitive processing met the requirements of section 35(5). The Court reached three conclusions. First, that analysis of “strict necessity” is the same as that of proportionality under ECHR Article 8. Second, the processing meets the Schedule 8 criterion of a rule of law function, namely the duty to prevent and detect crime. Third, the SWP had an “appropriate policy document” for securing compliance, as required by section 42(2) DPA 2018. The absence of precise information on the position of members of the public in that document did not preclude the successful discharge of that obligation.
Claim under section 64 DPA 2018
The third claim concerned section 64 of the DPA 2018, which sets out an obligation on data controllers to undertake impact assessments of the proposed processing of personal data. The Court was satisfied that SWP had discharged its obligation under section 64. The Court’s evaluation of such an assessment is comparable to its review of compliance with the public sector equality duty. In essence, a Court will not necessarily substitute its own view for that of the data controller on all matters. As Underhill LJ stated in R (Unison) v Lord Chancellor [2016] ICR 1, the Court must simply inquire “whether the essential questions have been conscientiously considered and that any conclusions reached are not irrational.”
Following my suggestion for an amended approach to ECHR Article 8, it is arguable that the Court should in fact distinguish between the levels of protection for individual rights under the HRA 1998 and the DPA 2018. Drawing this distinction would complicate the assessment of lawfulness and fairness under the DPA 2018.
Ground 3: The Public Sector Equality Duty
The Court dismissed the challenge to the SWP’s discharge of the Public Sector Equality Duty under section 149 of the Equality Act 2010. It held there was insufficient evidence to support the Claimant’s contention that SWP failed, in its Equality Impact Assessment, to consider the possibility that AFR Locate might produce indirectly discriminatory results on grounds of sex and/or race, i.e. a higher rate of false positive matches for female faces and/or for black and minority ethnic faces. More detailed factual investigation was required before firm conclusions on bias could be reached.
As to this point, it is suggested that factual concerns about the possibility of discriminatory false positives require urgent attention and may have to be considered again in due course.
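To illustrate what such a factual investigation might involve, the sketch below shows how a false positive rate could be computed per demographic group from deployment logs. The record structure and figures are entirely hypothetical; the point is only that a materially higher rate for one group would be the statistical signature of the indirect discrimination alleged.

```python
from collections import defaultdict

def false_positive_rates(records):
    """records: iterable of (group, alert_raised, true_match) tuples,
    one per scanned face (hypothetical log format).
    Returns, per group: false alerts / non-matches scanned."""
    false_alerts = defaultdict(int)
    non_matches = defaultdict(int)
    for group, alert_raised, true_match in records:
        if not true_match:  # person was not actually on a watchlist
            non_matches[group] += 1
            if alert_raised:  # system alerted anyway: a false positive
                false_alerts[group] += 1
    return {g: false_alerts[g] / n for g, n in non_matches.items() if n}

# Hypothetical logs: group "a" gets 1 false alert in 10 non-matches,
# group "b" gets 3 in 10 — the kind of disparity the Claimant alleged.
logs = ([("a", False, False)] * 9 + [("a", True, False)]
        + [("b", False, False)] * 7 + [("b", True, False)] * 3)
print(false_positive_rates(logs))
```

Whether real deployment data shows such a disparity is precisely the open factual question the Court said required further investigation.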
Sapan Maini-Thompson is an LLM Candidate at University College London. He tweets @SapanMaini.