The UKHRB is grateful to Aileen McColgan QC for allowing us to republish her article, which originally appeared here on Panopticon, a blog published by the barristers at 11KBW.
The central question for the Supreme Court in Bloomberg v ZXC [2022] UKSC 5 was, as Lords Hamblen and Stephens put it (with Lords Reed, Lloyd-Jones and Sales agreeing): “whether, in general, a person under criminal investigation has, prior to being charged, a reasonable expectation of privacy in respect of information relating to that investigation”. The short answer was “yes”.
The decision has been greeted with howls of indignation from Bloomberg but more muted responses from other sections of the press. Bloomberg’s editor-in-chief released an editorial entitled “U.K. Judges Are Helping the Next Robert Maxwell”, which stated that the judgment should “frighten every decent journalist in Britain”, while the Financial Times and the Guardian were more restrained, pointing out respectively that the decision would have “far-reaching implications for the British media” and would “make it harder for British media outlets to publish information about individuals subject to criminal investigations”. This is no doubt the case, but it is worth noting that the publication which gave rise to this decision was based on a highly confidential letter leaked to Bloomberg and occurred apparently without any consideration of ZXC’s privacy interests.
ZXC, regional CEO of a publicly listed company which operated overseas (“X Ltd”), sued for misuse of private information because of an article concerning X Ltd’s activities in a country for which ZXC’s division was responsible. The activities had been subject to a criminal investigation by a UK law enforcement body (“the UKLEB”) since 2013, and the article was based almost completely on a confidential Letter of Request sent by the UKLEB to the foreign state. ZXC claimed that he had a reasonable expectation of privacy in information published in the article, in particular in the details of the UKLEB investigation into himself, its assessment of the evidence, the fact that it believed that ZXC had committed specified criminal offences, and its explanation of how the evidence it sought would assist its investigation into that suspected offending. ZXC’s application for damages and injunctive relief was upheld at first instance by Nicklin J and £25,000 awarded: [2019] EWHC 970 (QB); [2019] EMLR 20. Bloomberg’s appeal was dismissed (see the Panopticon post by Robin Hopkins and [2020] EWCA Civ 611; [2021] QB 28).
Having been temporarily suspended in early January as a result of an increase in COVID-19 cases, the Grenfell Tower Inquiry hearings resumed on 8 February 2021. The fire killed 72 people.
The hearings are being conducted remotely using a Zoom-based video platform, which the Inquiry describes as “a temporary measure to be used only for as long as absolutely necessary”.
Phase 1 of the investigation, which focused on the events of the night of 14 June 2017, concluded on 12 December 2018. Phase 2, which examines the causes of these events, including how Grenfell Tower came to be in a condition which allowed the fire to spread in the way identified by Phase 1, is currently underway.
Last week, the Inner House of the Court of Session refused a reclaiming motion in relation to the use of racist, antisemitic and sexist WhatsApp messages in misconduct proceedings against ten police officers. The judgment, which can be found here, discusses several interesting issues, such as the police officers’ reasonable expectation of privacy when exchanging such messages.
However, the focus of this article shall be on an aspect of the case which was not cross-appealed: the existence of a common law right to privacy in Scotland. Despite not being an issue of contention, the Lord Justice Clerk, Lady Dorrian, took the opportunity to express her views on the matter. These now cast doubt over the existence of such a right – one which Lord Bannatyne, in the Outer House, believed was nascently recognised in case law.
This article was first published on the UK Labour Law Blog (@labour_blog). We repost it with the kind permission of Dr Philippa Collins (@DrPMCollins, Exeter University) and the editors of the Labour Law Blog.
One of the lasting impacts of the COVID-19 pandemic upon the world of work is likely to be a move away from the traditional workplace. In some sectors, such as academia, IT, and administration, remote work or home working is an established working pattern, although a rare one given national statistics from 2019 which indicated only 5% of the workforce worked mainly from home. The need to prevent the spread of the coronavirus through contact in the workplace precipitated a rapid and widespread move to homeworking. In an ONS survey in early May, 44% of adults surveyed were working from home. As some businesses begin to transition back into their previous working patterns, several high-profile companies have announced that they will not expect their staff to return to the workplace and will support homeworking as a permanent option in the future.
The Court of Appeal, overturning a Divisional Court decision, has found the use of a facial recognition surveillance tool by South Wales Police to be in breach of Article 8 of the European Convention on Human Rights (ECHR). The case was brought by Liberty on behalf of privacy and civil liberties campaigner Ed Bridges. The appeal was upheld on the basis that the interference with Article 8 of the ECHR, which guarantees the right to respect for private and family life, was not “in accordance with law” due to an insufficient legal framework. However, the court found that, had it been in accordance with law, the interference caused by the use of facial recognition technology would not have been disproportionate to the goal of preventing crime. The court also found that the Data Protection Impact Assessment (DPIA) was deficient, and that South Wales Police (SWP), who operated the technology, had not fulfilled their Public Sector Equality Duty.
In Sutherland v Her Majesty’s Advocate, the Supreme Court ruled unanimously that it was compatible with the accused person’s rights under ECHR article 8 to use evidence obtained by “paedophile hunter” (“PH”) groups in a criminal trial.
PH groups impersonate children online to lure persons into making inappropriate or sexualised communications with them over the internet, and then provide the material generated by such contact to the police. Importantly, they operate without police authorisation.
Per Section 6(1) of the HRA, a prosecution authority – as a public authority – cannot lawfully act in a way that is incompatible with a Convention right. Consequently, there were two compatibility issues on appeal before the Supreme Court:
Were the appellant’s article 8 rights interfered with by the use of the communications provided by the PH group as evidence in his public prosecution?
To what extent is it compatible with the state’s obligation to provide adequate protection for article 8 rights for a public prosecutor to use material supplied by PH groups in investigating and prosecuting crime?
This afternoon, health secretary Matt Hancock made a statement in the Commons updating the House on the government’s response to the crisis.
The health secretary announced that anyone in the UK aged five and over who has coronavirus symptoms will be eligible for a test. From today, recognised symptoms include the loss of smell and taste, as well as a persistent cough and a high temperature. Hancock confirmed for the first time that the government has recruited over 21,000 contact tracers, including 7,500 health care professionals, to manually trace and get in contact with anyone who has tested positive.
In addition, he offered a degree of clarification in relation to the government’s new contact tracing app. The function of the app is to alert people of the need to self-isolate if they have come into proximity with an individual who reported coronavirus symptoms.
Earlier this month, the Scottish Parliament’s Justice Sub-Committee on Policing published a report which concluded that live facial recognition technology is currently “not fit” for use by Police Scotland.
Police Scotland had initially planned to introduce live facial recognition technology (“the technology”) in 2026. However, this has now been called into question as a result of the report’s findings – that the technology is extremely inaccurate, discriminatory, and ineffective. Not only that, but it also noted that the technology would be a “radical departure” from Police Scotland’s fundamental principle of policing by consent.
In light of the above, the Sub-Committee concluded that there would be “no justifiable basis” for Police Scotland to invest in the technology.
Police Scotland agreed – at least for the time being – and confirmed in the report that they will not introduce the technology at this time. Instead, they will engage in a wider debate with various stakeholders to ensure that the necessary safeguards are in place before introducing it. The Sub-Committee believed that such a debate was essential in order to assess the necessity and accuracy of the technology, as well as the potential impact it could have on people and communities.
The report is undoubtedly significant as it reaffirms that the current state of the technology is ineffective. It therefore strengthens the argument that we should have a much wider debate about the technology before we ever introduce it onto our streets. This is important not only on a practical level but also from a human rights perspective, especially set against the backdrop of the technology’s controversial use elsewhere.
s.67(8) of RIPA contains a so-called ‘ouster clause’, which provides that “determinations, awards, orders and other decisions of the Tribunal (including decisions as to whether they have jurisdiction) shall not be subject to appeal or be liable to be questioned in any court”.
The issue in Privacy International was whether decisions made by the IPT were judicially reviewable. A majority of the Supreme Court held that s.67(8) did not, in fact, oust the jurisdiction of the court. The panel analysed this crucial case in more detail.
Privacy International v Investigatory Powers Tribunal [2017] EWCA Civ 1868, Court of Appeal, 23 November 2017
As all lawyers know, the great case about courts confronting a no-go area for them is the late 1960s case of Anisminic.
A statutory Commission was given the job of deciding whether compensation should be awarded for property sequestrated, in the particular case as a result of the 1956 Suez crisis. The Act empowering it said that the
determination by the Commission of any application made to them under this Act shall not be called in question in any court of law.
The House of Lords, blasting aside arcane distinctions, said that this provision was not enough to oust judicial review for error of law.
Fast forward 50 years, and another Act which says
determinations, awards, orders and other decisions of the Tribunal (including decisions as to whether they have jurisdiction) shall not be subject to appeal or be liable to be questioned in any court.
The Court of Appeal has just decided that, unlike the provision in Anisminic, this Act does exclude any judicial review.
This blog has covered a number of claims for damages arising out of the misuse of private information. The Mirror Group phone hacking case is one example (see my post here and the appeal decision here), and the fall-out from the hapless Home Office official who put private information about asylum-seekers on the Internet is another (see Gideon Barth’s post on TLT here). See also below for related posts.
But this post is to give a bit of context, via the wider and scarier cyber crime which is going on all around us. It threatens the livelihoods of individuals and businesses across the globe – and has given and will undoubtedly give rise to complex spin-off litigation.
So let’s just start with the other week. On 21 October 2016, it seems nearly half the Internet was hit by a massive DDoS attack affecting a company, Dyn, which provides internet services infrastructure for a host of websites. Twitter, Reddit, Netflix, WIRED, Spotify and the New York Times were affected. DDoS, for cyber virgins, is Distributed Denial of Service, i.e. an overloading of servers via a flood of malicious requests, in this case from tens of millions of IP addresses. No firm culprits so far, but a botnet called Mirai seems to be in the frame. It is thought that non-secure items like cars, fridges and cameras connected to the Internet (the Internet of Things) may be the conscripted foot soldiers in such attacks.
And now to the sorts of cases which have hit the headlines in this country to date.
In the matter of proceedings brought by Kings College NHS Foundation Trust concerning C (who died on 28 November 2015) v The Applicant and Associated Newspapers Ltd and others [2016] EWCOP 21 – read judgment
The Court of Protection has just ruled that where a court has restricted the publication of information during proceedings that were in existence during a person’s lifetime, it has not only the right but the duty to consider, when requested to do so, whether that information should continue to be protected following the person’s death.
I posted last year on the case of a woman who, having suffered kidney failure as a result of a suicide attempt, was allowed to refuse continuing dialysis. The Court of Protection rejected the hospital’s argument that such refusal disclosed a state of mind that rendered her incapable under the Mental Capacity Act. An adult patient who suffers from no mental incapacity has an absolute right to choose whether to consent to medical treatment (King’s College Hospital NHS Foundation Trust v C and another [2015] EWCOP 80).
In January 2016, the European Court of Human Rights, by 6 votes to 1, rejected a Romanian national’s complaint about his employer’s decision to terminate his contract for using a professional Yahoo Messenger account to send personal messages to his fiancée and brother.
Mr Barbulescu contended that his employer had breached his Article 8 right to respect for his private life and correspondence, and that the domestic courts had failed to protect his right. The Court found that there had been no such violation because the monitoring of the account by his employer had been limited and proportionate.
Mr Barbulescu’s employers asked him to create a Yahoo Messenger account for responding to client enquiries and informed him that these communications had been monitored. The records showed that he had used the Internet for personal purposes, contrary to internal regulations. The employer’s regulations explicitly prohibited all personal use of company facilities, including computers and Internet access. The employer had accessed the Yahoo Messenger account in the belief that it contained professional messages.
Richardson v Facebook [2015] EWHC 3154 (QB) (2 November 2015) – read judgment
An action in defamation and under the right to privacy against Facebook has been dismissed in the High Court. The Facebook entity named as defendant did not “control” the publication so as to allow liability; and even if it did, no claim under the Human Rights Act could lie against Facebook, as it could not be described as any sort of public authority for the purposes of Section 6 of the Act.
The claimant, acting as a litigant in person, sought damages in respect of the publication in 2013 and 2014 of a Facebook profile and a posting on the Google Blogger service. The profile and the blogpost each purported to have been created by the claimant, but she complained that each was a fake, created by an impostor. She claimed that each was defamatory of her, and infringed her right to respect for her private life under Article 8 of the European Convention on Human Rights (ECHR).
Does the publication of photographs of a child taken during a riot fall within the scope of Article 8 ECHR?
It depends, says a Supreme Court majority, specifically on whether there was a reasonable expectation of privacy. Either way, the Court in JR38 agreed that whether or not the 14-year-old appellant’s right to respect for private life was in play, the publication of police photographs of him was justified in the circumstances.
This blog is maintained for information purposes only. It is not intended to be a source of legal advice and must not be relied upon as such. Blog posts reflect the views and opinions of their individual authors, not of chambers as a whole.