The Weekly Roundup: Facial Recognition Technology (and Brexit)
10 September 2019
In the news
As we inch towards October, the £100m government campaign to ‘Get Ready for Brexit’ has been launched. But to all intents and purposes, the government are jumping the gun. By the time businesses have managed to get themselves ready for Brexit (again), Boris Johnson will probably have been required to request an extension to Article 50 under the anti-no-deal bill proposed by Hilary Benn, which today was given royal assent and passed into law.
In spite of this, Boris Johnson and other Cabinet ministers including Dominic Raab and Sajid Javid have indicated that they will not be seeking an extension. Johnson would ‘rather be dead in a ditch’, while Javid says the government ‘absolutely would not’ request an extension; Raab, more cautiously, says the government will ‘test to the limit what it lawfully requires’ (whatever that is supposed to mean). Commentators including ex-DPP Ken Macdonald have pointed out that failure to comply with an Act of Parliament could place the Prime Minister in contempt of court, and could result in a prison sentence. The Government’s own Justice Secretary Robert Buckland has warned Boris Johnson not to toy with the rule of law.
Short of direct defiance, it is difficult to see how the government will be able to circumvent the anti-no-deal legislation. An election (under the Fixed-term Parliaments Act 2011) seems the only way to break the deadlock – but this was rejected by Labour last week: they will not back an election until the extension to Article 50 required under the new law is actually secured from the EU. Boris Johnson is seeking another election later tonight; if he fails, he will have to wait until October to try again.
As one anonymous Cabinet minister said to Rachel Sylvester of The Times, Boris appears for the moment to be ‘checkmated’. The possibility of compromise between the softer- and harder-Brexit wings of the Conservative party has collapsed, following the withdrawal of the whip from 21 rebel MPs and the resignations of Boris’s brother Jo Johnson and Work and Pensions Secretary Amber Rudd, both EU moderates. Rudd, the most prominent pro-EU member of the Cabinet, called the expulsion of the 21 an act of ‘political vandalism’, and issued a withering condemnation of the government’s weak efforts to seek a deal, claiming that the government was spending 80-90% of its time preparing instead for no deal.
For now, Parliament will be prorogued for five weeks at the end of today’s session to prepare for a Queen’s speech, and the unveiling of Boris Johnson’s ‘exciting domestic legislative agenda’. The controversial decision to prorogue is still awaiting the result of three legal challenges:
- In Scotland, 70 MPs led by Joanna Cherry QC MP lost their first instance case this week in the Outer House of the Court of Session. Upon appeal to the Inner House, they were again refused an interdict, but we are awaiting the full judgement from Lords Carloway, Brodie, and Drummond Young this week.
- In Northern Ireland, victims campaigner Raymond McCord has brought a case against prorogation based on the risks a no-deal Brexit poses to peace in Northern Ireland under the Good Friday agreement. The case was due to be heard at Belfast High Court at the end of last week.
- In England and Wales, Gina Miller and John Major lost their challenge at first instance, but have been granted permission to appeal to the Supreme Court – that hearing is expected to take place on September 17th.
In the courts
The High Court released its judgement this week in a ground-breaking judicial review brought by Edward Bridges and Liberty against the use of facial recognition technology by the police. In particular, Mr Bridges sought to challenge the use of the ‘AFR Locate’ technology used by South Wales Police (SWP) to match faces of members of the public to watchlists of known criminals. This is the first legal challenge of its kind, so it is worth reviewing in detail.
The case was brought on three grounds: (i) infringement of Article 8 ECHR; (ii) failure to comply with requirements under the Data Protection Act 2018 (and DPA 1998); (iii) failure to comply with the police’s duties under s.149 Equality Act 2010.
Article 8 ECHR
As a starting point, it was uncontroversial that the public deployment of facial recognition software constitutes a prima facie violation of the right to privacy under Article 8. The SWP nevertheless had a go, arguing that it was no more invasive than taking a photograph, given that the biometric data would only be kept momentarily unless a match was found. The court wasted no time rejecting this: “AFR Locate goes much further than the simple taking of a photograph” and even ‘momentary’ processing of biometric data would be sufficient to engage Article 8.
The majority of the court’s attention was devoted to submissions under Article 8(2), as to whether deployment of the technology was ‘in accordance with the law’. Counsel for Mr Bridges argued that this condition was not met because (i) the use of the technology was ultra vires the South Wales Police; and (ii) the DPA 2018 did not constitute a sufficient legal framework to render the use of facial recognition technology lawful.
In rejecting the vires claim, the court held that use of the technology was squarely within the common law powers of the police, i.e. ‘broadly speaking the maintenance of order and the prevention and detection of crime’ (R(Catt) v Association of Chief Police Officers per Lord Sumption). Further statutory powers such as those in PACE 1984 were not required. The proper analogy was not to fingerprinting and DNA swab testing – which would constitute assault but for a statutory exception – but rather to CCTV and ANPR, which were within the police’s powers at common law because they are not ‘physically intrusive’.
In rejecting the alternative claim, the court was confident that the legal framework, constituted by the Data Protection Act 2018, the Surveillance Camera Commissioner’s Code of Practice, and the SWP’s own policy documents, was sufficiently robust. The data protection principles under s.34 DPA, combined with the enforcement powers of the Information Commissioner, provided a strong enough mechanism to regulate the abuse of facial recognition technology.
Further, the trial use of facial recognition technology in the instances under consideration (at a Cardiff shopping centre and at a protest at a defence ‘exhibition’) was proportionate: there had been no disproportionate interference with Article 8 rights, and the same objective could not have been achieved by the less intrusive measure of additional CCTV cameras. There was therefore no “systemic or clear ‘proportionality deficit’ such that it can be said that future use of AFR Locate by the SWP would be inevitably disproportionate”.
However, the court emphasised that none of this indicated a blanket declaration of lawfulness for all subsequent deployments of facial recognition technology such as AFR. This judicial review is simply a snapshot of where we are now, and the development of this technology will require regulatory vigilance:
“In our view, when considered in context, [the SWP’s] comments should be considered as amounting to pragmatic recognition that (a) steps could, and perhaps should, be taken further to codify the relevant legal standards; and (b) the future development of AFR technology is likely to require periodic re-evaluation of the sufficiency of the legal regime. We respectfully endorse both sentiments, in particular the latter. For the reasons we have set out already, we do not consider that the legal framework is at present out of kilter; yet this will inevitably have to be a matter that is subject to periodic review in the future.”
Data Protection Act 2018
The challenge under the DPA 2018 was brought under two headings: (i) the first data protection principle under ss.34-5 – processing of personal data must be lawful and fair; (ii) the requirement to carry out effective Data Protection Impact Assessments (DPIAs) under s.64.
Submissions under s.34 relied in particular on the more stringent requirements for ‘sensitive’ processing under s.35(3)-(5) – this processing was ‘sensitive’ under s.35(8)(b) because it involved processing of biometric data that uniquely identified each individual, even if only momentarily. Crucially therefore, this would include processing of all individuals’ data, not just those on a watchlist. Under s.35(5), sensitive processing must (a) be ‘strictly necessary’ for the law enforcement objective, (b) meet at least one of the conditions in Schedule 8, and (c) the controller must have an appropriate policy document in place.
The first two of these three conditions were held to be satisfied for the same reasons as under Article 8 above – the police’s role in public order and the prevention of crime. However, the court again raised a policy question for the future. It suggested that the SWP’s policy documents might not be sufficiently robust – but that this was more appropriate for the ICO to pursue. Presumably we can expect further dialogue between the police and the ICO in the wake of this decision.
In evaluating the DPIA conducted by the SWP, the court appeared rather relaxed, noting that there was a ‘clear narrative’ explaining the processing. Although the court did also rebut the claimant’s allegations that the DPIA’s assessment was insufficient, its relaxed tone seems a little ill-judged.
Equality Act 2010
Finally, Mr Bridges sought to argue that the SWP’s use of the technology violated the police’s duty under s.149 of the Equality Act 2010. Here the claim rested on allegedly discriminatory tendencies in the facial matching algorithms, such that matches would be disproportionately generated for BAME individuals: the SWP ought to have recognised that the software would operate in an indirectly discriminatory way.
The court had very limited sympathy for this view, stating that the claim had “an air of unreality” about it: “there is no suggestion that as at April 2017 when the AFR Locate trial commenced, SWP either recognised or ought to have recognised that the software it had licensed might operate in a way that was indirectly discriminatory.” It was additionally observed that this bias would depend on the dataset used to train the system, and evidence was not presented to the court on this point.
Nevertheless, echoing its approach under the previous two heads of claim, the court suggested that further research into the possibly discriminatory impacts of this facial recognition software would be valuable.
This judgement was not much of a victory for civil liberties activists. However, the court has been careful to leave a space for regulatory dialogue and review in the future, and it is clear that the police will continue to be held to the strict standards of privacy and data protection law which prevail in this country. As developments in surveillance technology continue to put pressure on the balance between liberty and security, the law must remain vigilant. It is to be hoped that the judges’ optimism about the robustness and adaptability of our legal framework is justified in the long term.
On the UKHRB
Jim Duffy considers possible Article 2 implications of a no-deal Brexit