On 8 September 2022, the European Court of Human Rights (ECtHR) handed down its decision in Drelon v France (application nos. 3153/16 and 27758/18). The Court unanimously found a violation of Article 8 of the European Convention on Human Rights in relation to the collection by the French Blood Donation Service, the Établissement Français du Sang (EFS), of personal data relating to a potential blood donor’s presumed sexual orientation and the excessive length of time the data was kept in a public institution.
The claimant YZ had been acquitted on three counts of raping his former wife, but details concerning these matters remain on the Police National Computer (PNC). These proceedings concerned whether such retention was lawful.
The question at the heart of this application was whether the onus was on the competent authority to justify that its processing of the claimant’s data was lawful and fair under the Data Protection Act 2018. The claimant’s argument was that the relevant guidance to the police (issued pursuant to the Police and Criminal Evidence Act 1984) was not compatible with this statutory requirement, as it put the onus on an applicant for deletion to give reasons for that deletion [para 40].
The CJEU ruled on Tuesday that Directive 2002/58/EC (‘the Directive’) precludes national legislation from ordering telecommunication companies to transfer data in a “general and indiscriminate” manner to security agencies, even for purposes of national security. This follows a challenge by Privacy International to UK security agencies over their practice of collecting bulk communications data (BCD).
The ruling could throw up roadblocks to a post-Brexit “adequacy” agreement over the UK’s data protection regime. Adequacy is granted to data protection regimes to confirm that they conform to the data protection standards of the GDPR, and thus that companies may move data about EU data subjects outside of the EU to those regimes. Recently, the adequacy rating of the US “Privacy Shield” was invalidated by the Schrems II judgment. This ruling could prove to be an analogous issue for the UK’s adequacy rating at the end of the transition period.
The Court of Appeal, overturning a Divisional Court decision, has found the use of a facial recognition surveillance tool by South Wales Police to be in breach of Article 8 of the European Convention on Human Rights (ECHR). The case was brought by Liberty on behalf of privacy and civil liberties campaigner Ed Bridges. The appeal was upheld on the basis that the interference with Article 8 of the ECHR, which guarantees the right to respect for private and family life, was not “in accordance with law” due to an insufficient legal framework. However, the court found that, had it been in accordance with law, the interference caused by the use of facial recognition technology would not have been disproportionate to the goal of preventing crime. The court also found that the Data Protection Impact Assessment (DPIA) was deficient, and that South Wales Police (SWP), who operated the technology, had not fulfilled their Public Sector Equality Duty.
The High Court has struck out a claim that the disclosure of certain personal information made by a charity to the claimant’s GP was unlawful. Although only a summary determination, this judgment goes to the heart of what we believe data protection to be about. As you will tell from my somewhat trenchant comments at the end of this post, I find it difficult to accept the main conclusion in this ruling.
The LGBT Foundation provides services including counselling and health advice. The claimant sought to access the charity’s services by completing a self-referral form in 2016. The form gave an option for the self-referring individual to consent to information being disclosed to their GP, and stated that the charity would break confidentiality without the individual’s consent if there was reason to be seriously concerned about their welfare. Mr Scott gave his GP’s details in the form. He also stated in the form that he no longer wished to be alive, detailed a previous suicide attempt, said that he had recently been self-harming and that he continued to suffer problems from drug use.
A sessional health and wellbeing officer at the charity conducted an intake assessment for Mr Scott to ascertain what support would be best for him. She told him of the confidentiality policy, including the provision that any information he disclosed would be passed on if the charity considered him to be at risk. In this interview he gave further details of drug use, self-harm and suicidal thoughts. The health officer paused the assessment and consulted a colleague, who advised her to inform Mr Scott that they would be contacting his GP because they had concerns about his welfare. The charity concluded it was at that time unable to provide him with the services he sought from them because of his ongoing drug use. They passed the information on to Mr Scott’s GP via a telephone call. This information was in due course recorded in his medical records.
This post is the first in a series of five reports by Conor Monighan from this year’s conference held by the Administrative Law Bar Association. The remaining four posts will be published each Monday over the next month.
This year’s ALBA conference featured an impressive list of speakers. There were talks from a Supreme Court judge, a former Lord Chancellor, top silks, and some of the best academics working in public law.
The conference covered a number of practical and substantive topics. The highpoint was an address given by Lord Sumption, in which he responded to criticism of his Reith Lectures. This post, together with those that follow, summarises the key points from the conference.
The Court of Appeal has ruled that a claimant can recover damages for loss of control of their data under section 13 of the Data Protection Act 1998 without proving pecuniary loss or distress. The first instance judge, Warby J, had dismissed Mr Lloyd’s application for permission to serve Google outside the jurisdiction, in the USA, thereby preventing the claim from getting under way.
The central question was whether the claimant, Mr Richard Lloyd, who is a champion of consumer protection, should be permitted to bring a representative action against Google LLC, the defendant, a corporation based in Delaware in the USA. Mr Lloyd made the claim on behalf of a class of more than 4 million Apple iPhone users. He alleged that Google secretly tracked some of their internet activity, for commercial purposes, between 9th August 2011 and 15th February 2012.
TM (Kenya) concerned a 40 year old Kenyan woman who faced deportation after her applications for leave to remain and asylum were rejected by the Home Office. She had been detained at Yarl’s Wood Immigration Removal Centre in advance of proceedings to remove her from the country, during which time she had been uncooperative with staff. In light of her behaviour and in advance of her removal to Kenya, she was removed from free association with other detainees. Such detention was authorised by the Home Office Immigration Enforcement Manager at Yarl’s Wood, who was also the appointed “contract monitor” at the centre for the purposes of section 49 of the Immigration and Asylum Act 1999.
She sought judicial review of the decision to deprive her of free association. The initial application was refused. She appealed to the Court of Appeal where she advanced three grounds, including that her detention was not properly authorised.
The court found no conflict in the dual positions held by the manager at Yarl’s Wood. The Home Secretary had legitimately authorised her detention under the principles described in Carltona Ltd v Commissioners of Works [1943] 2 All ER 560. In addition, there was no obligation to develop a formal policy concerning removal from free association, as Rule 40 of the Detention Centre Rules 2001 was sufficiently clear to meet the needs of transparency.
You would have to be a monk or, at any rate, in an entirely internet-free zone, not to have had your recent days troubled by endless GDPR traffic. Everyone from the tiniest charity holding your name and email address up to the data behemoths has asked, in different ways, for your consent for them to hold your personal data. You may have observed the frankness and simplicity of the former’s requests and the weaseliness of the latter’s, who try to make it rather difficult for you to say no, indeed to understand what precisely they are asking you to do.
Just in case you have not looked at it, here is the Regulation. It is actually a good deal easier to understand than a lot of the summaries of it.
This lack of transparency in these consent forms/privacy statements had not gone unnoticed by one of Europe’s more indefatigable privacy sleuths. Max Schrems, an Austrian lawyer who, at 30 years of age, has already been to the EU’s top court twice (see here and here), moved fast. By the end of GDPR day last Friday, 25 May, he had brought multibillion-euro complaints against the global platforms. Three complaints, said to be valued at €3.9 billion, were filed in the early hours against Facebook and two of its subsidiaries, WhatsApp and Instagram, via data regulators in Austria, Belgium and Germany. Another complaint, valued at €3.7 billion, was lodged with France’s CNIL in the case of Google’s Android operating system.
Privacy International v. Investigatory Powers Tribunal [2017] EWCA Civ 1868, Court of Appeal, 23 November 2017
As all lawyers know, the great case about courts confronting a no-go area for them is the late 1960s case of Anisminic.
A statutory Commission was given the job of deciding whether compensation should be awarded for property sequestrated, in the particular case as a result of the 1956 Suez crisis. The Act empowering it said that the
determination by the Commission of any application made to them under this Act shall not be called in question in any court of law.
The House of Lords, blasting aside arcane distinctions, said that this provision was not enough to oust judicial review for error of law.
Fast forward 50 years, and another Act which says
determinations, awards, orders and other decisions of the Tribunal (including decisions as to whether they have jurisdiction) shall not be subject to appeal or be liable to be questioned in any court.
The Court of Appeal has just decided that, unlike Anisminic, this Act does exclude any judicial review.
R (o.t.a P & others) v. Secretary of State for Home Department & others [2017] EWCA Civ 321, Court of Appeal, 3 May 2017 – read judgment
The Court of Appeal has upheld challenges to the system of the police retaining information about past misconduct. It held that the system, even after a re-boot in 2013 in response to an earlier successful challenge, remains non-compliant with Article 8.
The problem is well summarised by Leveson P in the first paragraph of the judgment, namely the interface between a system of rehabilitation of offenders and the minimisation of risk to the public caused by the employment of those with misconduct in their pasts.
This blog has covered a number of claims for damages arising out of the misuse of private information. The Mirror Group phone hacking case is one example (see my post here and the appeal decision here); the fall-out from the hapless Home Office official who put private information about asylum-seekers on the Internet is another (see Gideon Barth’s post on TLT here). See also below for related posts.
But this post is to give a bit of context, via the wider and scarier cyber crime which is going on all around us. It threatens the livelihoods of individuals and businesses the globe over – and has given and will undoubtedly give rise to complex spin-off litigation.
So let’s just start with the other week. On 21 October 2016, it seems nearly half the Internet was hit by a massive DDoS attack affecting a company, Dyn, which provides internet services infrastructure for a host of websites. Twitter, Reddit, Netflix, WIRED, Spotify and the New York Times were affected. DDoS, for cyber virgins, is Distributed Denial of Service, i.e. an overloading of servers via a flood of malicious requests, in this case from tens of millions of IP addresses. No firm culprits so far, but a botnet called Mirai seems to be in the frame. It is thought that non-secure items like cars, fridges and cameras connected to the Internet (the Internet of Things) may be the conscripted foot soldiers in such attacks.
And now to the sorts of cases which have hit the headlines in this country to date.
Richardson v Facebook [2015] EWHC 3154 (2 November 2015) – read judgment
An action in defamation and under the right to privacy against Facebook has been dismissed in the High Court. The Facebook entity named as defendant did not “control” the publication so as to attract liability; and even if it did, no claim under the Human Rights Act could lie against Facebook, as it could not be described as any sort of public authority for the purposes of section 6 of the Act.
The claimant, acting as a litigant in person, sought damages in respect of the publication in 2013 and 2014 of a Facebook profile and a posting on the Google Blogger service. The Profile and the Blogpost each purported to have been created by the claimant, but she complained that each was a fake, created by an impostor. She claimed that each was defamatory of her, and infringed her right to respect for her private life under Article 8 of the European Convention on Human Rights (ECHR).
Emma-Louise Fenelon is a Pupil Barrister at 1 Crown Office Row
‘Eavesdropping, sir? I don’t follow you, begging your pardon. There ain’t no eaves at Bag End, and that’s a fact.’ (J.R.R. Tolkien)
If parliamentarians are seen to be taking a more forensic interest in matters of surveillance in the coming weeks and months, the reason is unlikely to be purely down to the publication of the greatly anticipated surveillance legislation. Last week’s Investigatory Powers Tribunal judgment has sent ripples of discontent through both Houses of Parliament, evidenced in immediate calls for an emergency debate on the subject (scheduled to take place in the House of Commons later today).
This blog is maintained for information purposes only. It is not intended to be a source of legal advice and must not be relied upon as such. Blog posts reflect the views and opinions of their individual authors, not of chambers as a whole.