Rise of the algorithms

4 November 2019 by Alice Irving

The use of algorithms in public sector decision making has broken through as a hot topic in recent weeks. The Guardian recently ran the “Automating Poverty” series on the use of algorithms in the welfare state. And on 29 October 2019 it was reported that the first known legal challenge to the use of algorithms in the UK, in this instance their use by the Home Office, had been launched. It was timely, then, that the Public Law Project’s annual conference on judicial review trends and forecasts was themed “Public law and technology”.

Basic tech for lawyers

The conference helpfully opened with a lawyer-friendly rundown of algorithms and automation. Dr. Reuben Binns (ICO Postdoctoral Research Fellow in AI) drew a number of useful distinctions.

The first was between rule-based and statistical machine learning systems. In a rule-based system, the computer is programmed to apply a decision tree: the questions asked, and the path to a particular outcome depending on the answers given, can be depicted as a flow-chart (even if that flow-chart may be very large, with numerous branches). In contrast, statistical machine learning involves a computer system training itself to spot patterns and correlations in data sets, and to make predictions based on them. The system is first trained on data sets provided by the system designer; once trained, it can be used to infer information and make predictions from new data. Such a system might be used, for example, to assess the risk of a person re-offending, having been trained on existing data about re-offending rates. It has long been known that machine-learning systems can be biased, not least because the data on which they are trained is often biased.
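To make the distinction concrete, it might be sketched in a few lines of Python. To be clear, the rules, training data and model below are entirely invented for illustration and bear no relation to any system actually in use:

```python
# Illustrative only: the rules, data and model here are invented, not any
# system actually deployed by a public body.
from sklearn.linear_model import LogisticRegression

# 1. Rule-based: an explicit decision tree, fully depictable as a flow-chart.
def benefit_decision(age: int, income: float) -> str:
    if age < 18:
        return "ineligible"   # first branch of the tree
    if income < 15_000:
        return "eligible"     # second branch
    return "ineligible"

# 2. Statistical machine learning: the system is trained on past data and
# then scores new cases. Synthetic training data:
# [age, prior_convictions] -> re-offended (1) or not (0).
X_train = [[19, 2], [45, 0], [23, 5], [52, 1], [31, 3], [60, 0]]
y_train = [1, 0, 1, 0, 1, 0]
model = LogisticRegression().fit(X_train, y_train)

# The model infers a pattern from the training data; any bias in that data
# (e.g. skewed policing records) is learned along with it.
risk = model.predict_proba([[25, 2]])[0][1]   # probability of re-offending
print(f"Predicted re-offending risk: {risk:.0%}")
```

The first function’s logic can be audited line by line; the second’s lives in fitted coefficients, which is where many of the transparency problems discussed below begin.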

Another useful distinction is between decision making that is fully automated, where the algorithm makes the final decision, and decision making where there is a “human in the loop”, who uses the algorithm’s output to support their decision-making. This has implications not only under the GDPR (see Article 22) but also for the application of public law principles.
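In code terms, the distinction might look something like the following. Again, this is a purely hypothetical sketch; every name and threshold is invented:

```python
# Hypothetical sketch of the two modes of deployment; every name and
# threshold below is invented for illustration.

def algorithm_score(case: dict) -> float:
    """Stand-in for any automated scoring system."""
    return 0.8 if case.get("flagged") else 0.2

def fully_automated_decision(case: dict) -> str:
    # The algorithm's output is the final decision: the situation with
    # which Article 22 of the GDPR is principally concerned.
    return "refuse" if algorithm_score(case) > 0.5 else "grant"

def human_in_the_loop_decision(case: dict, caseworker) -> str:
    # The score is only one input; a human official makes the final call.
    # Whether they genuinely weigh it, or simply rubber-stamp it, is the
    # "automation bias" problem discussed next.
    return caseworker(case, algorithm_score(case))
```

A caseworker function passed to `human_in_the_loop_decision` could, in principle, ignore the score entirely; under Article 22, much can turn on whether the human involvement is meaningful or merely token.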

This distinction was further linked by Dr. Binns to the concept of “automation bias”: that is, the biases humans may exhibit when exposed to an automated system. In some instances, even though there is a human in the loop, that person may place over-reliance on the output of the automated system, in effect simply rubber-stamping its recommendation. In other instances, under-reliance can occur. The example offered was of judges in Kentucky using algorithmically produced risk scores in bail decisions. Research showed that judges factored in the risk scores differently depending on the race of the defendant in question, demonstrating that the relationship between algorithmic outputs and human decision-making can be complex.

A final useful distinction, drawn by Dr. Joe Tomlinson (PLP) in a later presentation, was between the ways in which automated decision-making may be “opaque”. Intentional opacity is where an algorithm is designed so that its workings are concealed, in order to protect intellectual property. Illiterate opacity arises where an algorithm is so complex that it is understandable only to tech experts. Finally, intrinsic opacity is where a machine-learning system is so complex that even a tech expert is unable to understand its internal workings. That is, the system is a “black box”. A lack of transparency in decision making is clearly a primary hurdle to holding decision-makers to account where algorithms have been deployed.

Algorithms already in use

We know that algorithms are already in wide use in the immigration and welfare contexts in the UK. Key examples include the use of data sharing and automated decision making in the EU settlement scheme and in the administration of Universal Credit. In both, data sharing between government departments and automated processes are used to determine a person’s entitlement: to settled or pre-settled status in the one case, and to the level of their benefits in the other. In both contexts, civil society reports suggest that those seeking settled status or welfare provision have experienced significant difficulties. Both schemes have been plagued by confusion about how decisions are reached, making it very difficult for individuals to understand, let alone challenge, those decisions. And both have been shown to produce incorrect outcomes, with disastrous consequences for some individuals.

The first known UK case challenging algorithmic decision-making, referenced above, concerns another aspect of the immigration system. The Home Office uses an algorithm to filter UK visa applications into those assessed as low, medium or high risk of being fraudulent. The effect is that applications categorised as higher risk are subject to more stringent checks and requirements, and are more likely to be unsuccessful. The lawyer behind the legal challenge, Cori Crider, describes the system as providing “speedy boarding for white people”.
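The mechanics of such a filter might be imagined along the following lines. This is a purely hypothetical sketch: the actual tool’s inputs and weights have not been published, and the factors below are invented for illustration (nationality is included only because the challenge alleges it is a factor):

```python
# Purely hypothetical sketch of a "traffic light" streaming filter. The
# real tool's inputs and weights have not been published; the factors and
# nationality list here are invented for illustration.
HIGHER_RISK_NATIONALITIES = {"Examplestan"}   # hypothetical placeholder

def stream_application(app: dict) -> str:
    score = 0
    if app.get("nationality") in HIGHER_RISK_NATIONALITIES:
        score += 2   # nationality is alleged to be a factor in the challenge
    if app.get("previous_refusal"):
        score += 1
    if score >= 2:
        return "red"      # most stringent checks; most likely to be refused
    if score == 1:
        return "amber"
    return "green"        # light-touch review
```

A sketch like this also shows how a feedback loop could arise: if “red”-streamed applications are refused more often, those refusals may later be read as confirming that the inputs were reliable markers of risk.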

Legal challenges to automated decision-making

The recent Report of the UN Special Rapporteur on Extreme Poverty and Human Rights into the digital welfare state succinctly summarises the serious concerns that algorithmic decision making raises. These include:

  • difficulties in digital access for vulnerable persons most affected by these regimes, both in terms of access to the necessary technology and digital literacy;
  • the secrecy often surrounding how decisions are reached;
  • the tendency of risk-scoring and other algorithmic systems to exacerbate existing inequalities and discrimination; and
  • the inflexible, robotic application of rules, which precludes consideration of relevant extenuating circumstances and removes human interaction and compassion from the picture.

You can see, then, why there is an appetite amongst barristers to consider how the law can be used to ensure greater transparency and scrutiny of automated decision-making in the public sector. That is not to say that the tone of the PLP conference was wholly one of cynicism towards tech: there are obviously great potential gains to be had from automated decision making. The concern, nevertheless, is that technological development and deployment have outpaced scrutiny and regulation.

Large portions of the conference, accordingly, were dedicated to discussing how current legal frameworks might be deployed to challenge algorithmic decision-making. This included inevitable discussion of human rights, the GDPR and Data Protection Act 2018, the Equality Act 2010 and administrative law principles. Without descending into a blow-by-blow account of the legal possibilities, one key output of the conference is worth noting: a frank discussion of the potential difficulties any legal action might face.

Megan Goulding (Liberty), who was the instructing solicitor on the recent unsuccessful challenge to the use of automated facial recognition (AFR) technology, provided some critical insight. She identified two key challenges that her team had faced, which are likely to arise in many tech-focussed cases.

The first issue was getting the Court to engage with the broader societal impact of the technology in question. When the Court came to decide whether the use of AFR amounted to a justified interference with the claimant’s privacy rights, it balanced the infringement of the individual claimant’s rights against the community’s interest in the detection and prevention of crime. No weight was given to the negative impact on the community of the systematic use of AFR technology. This highlights a potential difficulty for future human rights-based claims against technological innovations.

Hopefully, this issue will be explored further when the case is appealed. Other legal avenues (such as indirect discrimination, the public sector equality duty and systemic administrative law claims) may also need to be explored in challenges to new technology that seek to draw on community-wide effects.

The second issue was obtaining the evidence necessary to bring the claim. The claimants sought access to the data set on which the AFR system was trained, in order to bolster their public sector equality duty claim. The request was refused on the basis that the data was a trade secret held by the private company that produced the system. The role of private companies in providing tech was, likewise, a concern squarely raised by the UN Special Rapporteur in his report:

“Private entities have different motives for their involvement in benefit and social assistance systems and this may lead to conflicts between the public interests these systems ought to serve and the private interests of corporations and their owners.”

The role of private actors in providing automated systems to government, and whether they can avoid transparency by asserting their own commercial interests, will be a key feature of future challenges. Ms Goulding suggested one possible route forward: to argue that a public body cannot comply with its public sector equality duty unless it is willing to disclose, for independent assessment, details of any automated system it is using.

A related issue, canvassed by Dr. Tomlinson, is the role of evidence in judicial review and the difficulties a court may face in assessing evidence about complex computer systems. As Dr. Tomlinson noted, evidence in judicial review proceedings is scarcely touched on in administrative law textbooks, and there is little by way of developed jurisprudence on its use in this context. This leaves us in a difficult position when considering the type of evidence that can be brought when challenging an automated decision. It might be necessary to bring expert evidence as to the workings of the relevant computer system, and lawyers may need to turn to journalistic techniques to produce evidence of how algorithms are functioning. How the courts will receive such evidence is unclear. Further, it may be necessary to push for a change to the three-month time limit for bringing a judicial review claim, so that sufficient evidence can be gathered.

All in all, it is clear that the role of algorithms in public sector decision-making will require from lawyers (and, dare I say, judges) a degree of creativity. The law is already playing catch-up.

Postscript: Prompted by the comment below, we are happy to direct those interested in learning more about this subject to the website http://www.ai-lawhub.com which is a rich resource of information and analysis.

4 comments


  1. Simon Carne says:

    The writer describes statistical or machine learning systems as using data “to make predictions”. Although the crux of this particular article is about predictive algorithms, it is not correct to equate statistical, or machine based, learning only with predictive applications.

    The point about statistical algorithms, as opposed to those which are rule-based, is that the computer (and, quite possibly, the programmer) does not acquire any understanding of the subject at hand. The algorithm simply draws inferences based on past data. An example is the training of dictation software to select correctly between “there”, “their” and “they’re” based on the other words in the sentence, but not by applying rules of language – merely by observing which version of the word has been used on past occasions in which similar word combinations have appeared.

  2. Alice Irving says:

    Thanks Simon – you are of course correct. The slight oversimplification was for the sake of immediate clarity in the context at hand.

  3. deeandrobin says:

    Thanks for this blog. You will find all this information and much more on http://www.ai-lawhub.com. This was actually the source of much that was said at the PLP conference. It would be really nice if you could give us a mention in this blog!! Many thanks Robin Allen

    1. Angus McCullough QC says:

      Many thanks for this pointer Robin – postscript now added.
