I recently came across this judgment by the South African Constitutional Court. As a “Saffa” myself, I rejoice in the case’s title, which pairs the name of the penultimate prime minister of the old apartheid South Africa (Botha) with the name of a much earlier Prime Minister of the Union of South Africa from 1919 to 1948 (Smuts).
But this case concerned two ordinary people, an insurance broker and an environmental activist, locking horns over their respective rights to privacy and freedom of expression under the South African Bill of Rights. The Constitutional Court judgment – running into nearly 100 pages in the Butterworths Human Rights Cases – is an interesting example of “salami slicing”, where the court takes apart a protected right and determines which bits of it can be upheld in the circumstances, and which can be set aside. It is also a fascinating insight into how information on social media platforms involves constant “re-publication”, and what that means for privacy and free speech rights. And finally, the judicial reflections on publication of someone’s personal address in the days of WFH show how far we have changed as a society since the pandemic.
The facts can be set out briefly.
Background facts and law
The applicant, Mr Botha, is an insurance broker who resides and conducts business in Gqeberha. He is also the owner of the farm Varsfontein, situated in Alicedale in the Eastern Cape Province, a hundred kilometres from his home.
The first respondent, Mr Smuts, is a wildlife conservationist, farmer, researcher and activist. The second respondent (amicus) is the Landmark Leopard and Predator Project – South Africa, a conservation non-governmental organisation focusing on human wildlife conflict management and leopard and carnivore conservation. It was founded by Mr Smuts who is its executive director.
A member of a group of cyclists who participated in an organised adventure ride that traversed Mr Botha’s farm (legally) encountered a dead baboon and porcupine in cage traps. The animals appeared to him to have been exposed to suffering and distress. Outraged by what he saw, the cyclist photographed the dead animals in the cages with the intention of sharing the photographs with an organisation capable of taking action. He shared them with Mr Smuts on 1 October 2019.
He also sent Mr Smuts a detailed map depicting the location of Mr Botha’s farm on which he indicated the place on the farm where the photographs were taken.
Mr Smuts published a post on the second respondent’s Facebook page which included, among other things,
(a) a photograph of a baboon trapped in a cage; (b) a photograph of a porcupine trapped in a cage; (c) a Google search location of Mr Botha’s insurance brokerage address (which turned out also to be Mr Botha’s residential address) and telephone number.
In HM Attorney General for England and Wales v British Broadcasting Corporation [2025] EWHC 1669 (KB), the Divisional Court (the Lady Chief Justice, the President of the King’s Bench Division, and Chamberlain J) gave judgment in relation to the deployment of evidence by MI5 in proceedings concerning the BBC’s reporting on a covert human intelligence source (CHIS), referred to as “X”. The judgment is quite extraordinary, including substantial criticism of the approach taken by MI5 in this case and specific guidance as to the way that evidence from an agency such as MI5 should be presented in future.
This was not a class action but a representative action, pursuant to what is now Civil Procedure Rule (CPR) 19.8, for the tort of misuse of private information against the respondents Google UK Limited (Google) and DeepMind Technologies Limited (DeepMind). The action was on behalf of Mr Prismall and a class of persons said to number approximately 1.6 million.
The appeal was against the striking out of his representative claim for misuse of private information in the court below. In a representative action like this the task before the judge is to establish whether the “lowest common denominator” claimant in the class would fail to make their claim. The judge found that the lowest common denominator claimant in the group of persons represented did not have a realistic prospect of success.
Details of the Case
The claim was for damages in respect of both the one-off transfer by the Royal Free London NHS Foundation Trust (the Royal Free Trust) of data in October 2015, and the continuing transfer of data thereafter until 29 September 2017 pursuant to a live data feed. The data which was transferred took the form of patient-identifiable medical records held by the Royal Free Trust of patients, including Mr Prismall, who had attended hospitals in the Royal Free Trust or had blood tests processed by laboratories operated by the Royal Free Trust between 29 September 2010 and 29 September 2015. Google and DeepMind used the data for the purposes of developing an app called “Streams” which was intended to be used to identify and treat patients suffering from Acute Kidney Injury. Google and DeepMind also had, however, a contractual entitlement to use the data for purposes wider than direct patient care and to develop and prove capabilities to enhance future commercial prospects.
At first instance the judge found that no member of the class had a realistic prospect of establishing a reasonable expectation of privacy in respect of their medical records, or of crossing the de minimis threshold in relation to such an expectation, so that there was no realistic prospect of establishing misuse of private information in respect of each member of the class, or a realistic prospect of establishing an entitlement to damages for loss of control. The lowest common denominator was a notional claimant in the class whose claim represented the “irreducible minimum scenario” for a claimant in the class of persons. The judge’s lowest common denominator claimant was premised on the basis that there was one attendance at a trust hospital, which was an attendance not concerning “a medical condition involving any particular sensitivity or stigma” and there being “no specific reference to the medical condition that had prompted the attendance”. The judge also specified, as part of the irreducible minimum scenario for the lowest common denominator claimant, that “no upset or concern was caused by the data transfer”. The judge found that the lowest common denominator claimant’s claim would fail.
Grounds of claim
Mr Prismall’s claim related to the wrongful use of private patient information by Google and DeepMind in: (1) obtaining patient-identifiable medical records with a contractual entitlement under the Information Sharing Agreement which was wider than direct patient care and the Streams project;
(2) storing the medical records prior to Streams becoming operational;
(3) using the medical records in the research and development of Streams; and
(4) developing and providing their general capabilities by the use of the medical records for the purposes of future commercial prospects. Damages were claimed for loss of control of the private information only.
The judge said that it was “also well-established that not every disclosure of medical information will give rise to a reasonable expectation of privacy and/or involve an unlawful interference.” If anodyne or trivial information about a brief hospital visit was made public by a patient, the judge saw no reason why that information would attract a reasonable expectation of privacy by dint of it being recorded in a medical record.
Personal data is intimately connected to privacy (art 8, ECHR) but is regulated by specific data protection regimes, such as the UK GDPR. Attention-grabbing legal issues arising out of Big Data dominate the public discourse around data protection: can generative AI use datasets without breaching intellectual property laws; how should the NHS use its mass of personal data; should we be compensated for the value of the data we provide to tech companies who go on to use it in advertising?
But on the other end of the scale from big data claims sits what might be thought of as ‘small data’ – issues around the use of one individual person’s data and the sometimes serious effects that can have. Jasper Gold joins Lucy McCann in a new episode of Law Pod UK to discuss the intersection of data protection, distress and personal injury, and consider some of the legal and tactical issues for litigants involved in these claims.
Brown v Commissioner of the Police of the Metropolis & Anor (2016), Claim No. 3YM09078 (at first instance) and [2019] EWCA Civ 1724 (in the Court of Appeal, on the issue of qualified one way costs shifting)
It has been widely reported that the German magazine Die Aktuelle recently ran a front cover with a picture of a smiling Schumacher and the headline promising ‘Michael Schumacher, the first interview’.
The strapline added: “it sounded deceptively real”.
Anyone walking past a news stand would have assumed that this was a genuine interview with the former Formula 1 driver, who has suffered catastrophic brain injury since a skiing accident in 2013. Only buyers of the edition would have learned, from the full article inside, that the ‘quotes’ had been produced by AI.
The news agency Reuters reports that “Schumacher’s family maintains strict privacy about the former driver’s condition, with access limited to those closest to him.”
And in a 2021 Netflix documentary his wife Corinna said
The UKHRB is grateful to Aileen McColgan QC for allowing us to republish her article, which originally appeared on Panopticon, a blog published by the barristers at 11KBW here.
The central question for the Supreme Court in Bloomberg v ZXC [2022] UKSC 5 was, as Lords Hamblen and Stephens put it (with Lords Reed, Lloyd-Jones and Sales agreeing): “whether, in general, a person under criminal investigation has, prior to being charged, a reasonable expectation of privacy in respect of information relating to that investigation”. The short answer was “yes”.
The decision has been greeted with howls of indignation from Bloomberg but more muted responses from other sections of the press; whereas Bloomberg’s editor in chief released an editorial entitled “U.K. Judges Are Helping the Next Robert Maxwell” which stated that the judgment should “frighten every decent journalist in Britain”, the Financial Times and Guardian were more restrained, pointing out respectively that the decision would have “far-reaching implications for the British media” and would “make it harder for British media outlets to publish information about individuals subject to criminal investigations”. This is no doubt the case, but it is worth noting that the publication which gave rise to this decision was based on a highly confidential letter leaked to Bloomberg and occurred apparently without any consideration of ZXC’s privacy interests.
ZXC, regional CEO of a publicly listed company which operated overseas (“X Ltd”), sued for misuse of private information because of an article concerning X Ltd’s activities in a country for which ZXC’s division was responsible. The activities had been subject to a criminal investigation by a UK law enforcement body (“the UKLEB”) since 2013 and the article was based almost completely on a confidential Letter of Request sent by the UKLEB to the foreign state. ZXC claimed that he had a reasonable expectation of privacy in information published in the Article, in particular in the details of the UKLEB investigation into himself, its assessment of the evidence, the fact that it believed that ZXC had committed specified criminal offences and its explanation of how the evidence it sought would assist its investigation into that suspected offending. ZXC’s application for damages and injunctive relief was upheld at first instance by Nicklin J and £25,000 awarded: [2019] EWHC 970 (QB); [2019] EMLR 20. Bloomberg’s appeal was dismissed (see the Panopticon post by Robin Hopkins, and [2020] EWCA Civ 611; [2021] QB 28).
Having been temporarily suspended in early January as a result of an increase in COVID-19 cases, the Grenfell Tower Inquiry hearings resumed on 8 February 2021. The fire killed 72 people.
The hearings are being conducted remotely using a Zoom-based video platform, which the Inquiry describes as “a temporary measure to be used only for as long as absolutely necessary”.
The Inquiry concluded Phase 1 of the investigation, which focused on the events of the night of 14 June 2017, on 12 December 2018. Phase 2, which examines the causes of these events, including how Grenfell Tower came to be in a condition which allowed the fire to spread in the way identified by Phase 1, is currently underway.
Last week, the Inner House of the Court of Session refused a reclaiming motion in relation to the use of racist, antisemitic and sexist WhatsApp messages in misconduct proceedings against ten police officers. The judgment, which can be found here, discusses several interesting issues, such as the police officers’ reasonable expectation of privacy when exchanging such messages.
However, the focus of this article shall be on an aspect of the case which was not cross appealed: the existence of a common law right to privacy in Scotland. Despite not being an issue of contention, the Lord Justice Clerk, Lady Dorrian, took the opportunity to express her views on the matter. These now cast doubt over the existence of such a right – one which Lord Bannatyne, from the Outer House, believed was nascently recognised in case law.
This article was first published on the UK Labour Law Blog (@labour_blog). We repost it with the kind permission of Dr Philippa Collins (@DrPMCollins at Exeter University) and the editors of the Labour Law Blog.
One of the lasting impacts of the COVID-19 pandemic upon the world of work is likely to be a move away from the traditional workplace. In some sectors, such as academia, IT, and administration, remote work or home working is an established working pattern, although a rare one given national statistics from 2019 which indicated only 5% of the workforce worked mainly from home. The need to prevent the spread of the coronavirus through contact in the workplace precipitated a rapid and widespread move to homeworking. In an ONS survey in early May, 44% of adults surveyed were working from home. As some businesses begin to transition back into their previous working patterns, several high-profile companies have announced that they will not expect their staff to return to the workplace and will support homeworking as a permanent option in the future.
The Court of Appeal, overturning a Divisional Court decision, has found the use of a facial recognition surveillance tool by South Wales Police to be in breach of Article 8 of the European Convention on Human Rights (ECHR). The case was brought by Liberty on behalf of privacy and civil liberties campaigner Ed Bridges. The appeal was upheld on the basis that the interference with Article 8 of the ECHR, which guarantees a right to privacy and family life, was not “in accordance with law” due to an insufficient legal framework. However, the court found that, had it been in accordance with law, the interference caused by the use of facial recognition technology would not have been disproportionate to the goal of preventing crime. The court also found that the Data Protection Impact Assessment (DPIA) was deficient, and that the South Wales Police (SWP), who operated the technology, had not fulfilled their Public Sector Equality Duty.
In Sutherland v Her Majesty’s Advocate, the Supreme Court ruled unanimously that it was compatible with the accused person’s rights under ECHR article 8 to use evidence obtained by “paedophile hunter” (“PH”) groups in a criminal trial.
PH groups impersonate children online to lure persons into making inappropriate or sexualised communications with them over the internet, and then provide the material generated by such contact to the police. Importantly, they operate without police authorisation.
Per Section 6(1) of the HRA, a prosecution authority – as a public authority – cannot lawfully act in a way that is incompatible with a Convention right. Consequently, there were two compatibility issues on appeal before the Supreme Court:
Were the appellant’s article 8 rights interfered with by the use of the communications provided by the PH group as evidence in his public prosecution?
To what extent is the state’s obligation to provide adequate protection for article 8 rights incompatible with the use by a public prosecutor of material supplied by PH groups in investigating and prosecuting crime?
This afternoon, health secretary Matt Hancock made a statement in the Commons updating the house on the government’s response to the crisis.
The health secretary announced that anyone in the UK aged five and over who has coronavirus symptoms will be eligible for a test. From today, recognised symptoms include the loss of smell and taste, as well as a persistent cough and a high temperature. Hancock confirmed for the first time that the government has recruited over 21,000 contact tracers, including 7,500 health care professionals, to manually trace and get in contact with anyone who has tested positive.
In addition, he offered a degree of clarification in relation to the government’s new contact tracing app. The function of the app is to alert people to the need to self-isolate if they have come into proximity with an individual who has reported coronavirus symptoms.
Earlier this month, the Scottish Parliament’s Justice Sub-Committee on Policing published a report which concluded that live facial recognition technology is currently “not fit” for use by Police Scotland.
Police Scotland had initially planned to introduce live facial recognition technology (“the technology”) in 2026. However, this has now been called into question as a result of the report’s findings – that the technology is extremely inaccurate, discriminatory, and ineffective. Not only that, but it also noted that the technology would be a “radical departure” from Police Scotland’s fundamental principle of policing by consent.
In light of the above, the Sub-Committee concluded that there would be “no justifiable basis” for Police Scotland to invest in the technology.
Police Scotland agreed – at least for the time being – and confirmed in the report that they will not introduce the technology at this time. Instead, they will engage in a wider debate with various stakeholders to ensure that the necessary safeguards are in place before introducing it. The Sub-Committee believed that such a debate was essential in order to assess the necessity and accuracy of the technology, as well as the potential impact it could have on people and communities.
The report is undoubtedly significant as it reaffirms that the current state of the technology is ineffective. It therefore strengthens the argument that we should have a much wider debate about the technology before we ever introduce it onto our streets. This is important not only on a practical level but also from a human rights perspective, especially set against the backdrop of the technology’s controversial use elsewhere.
This post, and those that follow it, summarises some of the main points of interest arising from the ALBA Conference 2019.
‘The Constitutionality of Ouster Clauses’ – Chair: Lord Justice Leggatt; Speakers: Professor Alison Young, Professor David Feldman, Professor Stephen Bailey
s.67(8) of RIPA contains a so-called ‘ouster clause’, which provides that “determinations, awards, orders and other decisions of the Tribunal (including decisions as to whether they have jurisdiction) shall not be subject to appeal or be liable to be questioned in any court”.
The issue in Privacy International was whether decisions made by the IPT were judicially reviewable. A majority of the Supreme Court held that s.67(8) did not, in fact, oust the jurisdiction of the court. The panel analysed this crucial case in more detail.
Privacy International v. Investigatory Powers Tribunal [2017] EWCA Civ 1868, Court of Appeal, 23 November 2017
Introduction
As all lawyers know, the great case about courts confronting a no-go area for them is the late-1960s case of Anisminic.
A statutory Commission was given the job of deciding whether compensation should be awarded for property sequestrated, in the particular case as a result of the 1956 Suez crisis. The Act empowering it said that the
determination by the Commission of any application made to them under this Act shall not be called in question in any court of law.
The House of Lords, blasting aside arcane distinctions, said that this provision was not enough to oust judicial review for error of law.
Fast forward 50 years to another Act, which says
determinations, awards, orders and other decisions of the Tribunal (including decisions as to whether they have jurisdiction) shall not be subject to appeal or be liable to be questioned in any court.
The Court of Appeal has just decided that, unlike Anisminic, this Act does exclude any judicial review.