Personal Data in the CJEU

July 18th, 2014 by Christopher Knight

Working out what is and what is not personal data is often difficult, and all the more so where a document contains different sections or has mixed purposes. In Cases C‑141/12 and C‑372/12 YS v Minister voor Immigratie, Integratie en Asiel (judgment of 17 July 2014, nyr), a request had been made by an immigrant in the Netherlands for a copy of an administrative report concerning his application for a residence permit. It is helpful to set out the details of the document sought. A case officer drafts a document in which he explains the reasons for his draft decision (“the Minute”). The Minute is part of the preparatory process within that service but not of the final decision, even though some points mentioned in it may reappear in the statement of reasons of that decision.

Generally, the Minute contains the following information: name, telephone and office number of the case officer responsible for preparing the decision; boxes for the initials and names of revisers; data relating to the applicant, such as name, date of birth, nationality, gender, ethnicity, religion and language; details of the procedural history; details of the statements made by the applicant and the documents submitted; the legal provisions which are applicable; and, finally, an assessment of the foregoing information in the light of the applicable legal provisions. This assessment is referred to as the ‘legal analysis’. Depending on the case, the legal analysis may be more or less extensive, varying from a few sentences to several pages. In an in-depth analysis, the case officer responsible for the preparation of the decision addresses, inter alia, the credibility of the statements made and explains why he considers an applicant eligible or not for a residence permit. A summary analysis may merely refer to the application of a particular policy line.

Was the Minute personal data within the meaning of Article 2(a) of Directive 95/46/EC? There is no doubt, said the CJEU, that the data relating to the applicant for a residence permit and contained in a minute, such as the applicant’s name, date of birth, nationality, gender, ethnicity, religion and language, are information relating to that natural person, who is identified in that minute in particular by his name, and must consequently be considered to be ‘personal data’: at [38].

However, the legal analysis in the Minute, although it may contain personal data, does not in itself constitute such data: at [39]. Held the CJEU, “a legal analysis is not information relating to the applicant for a residence permit, but at most, in so far as it is not limited to a purely abstract interpretation of the law, is information about the assessment and application by the competent authority of that law to the applicant’s situation, that situation being established inter alia by means of the personal data relating to him which that authority has available to it”: at [40]. Extending the application of personal data to cover the legal analysis would not guarantee the right to privacy, or the right to check the accuracy of the personal data itself, but would amount to a right to administrative documents, which the Directive does not provide: at [45]-[46].

Not the most ground-breaking decision to emanate from Luxembourg, but a nonetheless interesting reminder of the utility of carefully distinguishing between different types of data within the same document.

Christopher Knight

Late Reliance on Part I Exemptions

July 18th, 2014 by Christopher Knight

Although hardly at the top of anyone’s list of burning questions which keep them awake at night, there has been a debate about whether the permission to rely on exemptions late (usually after the DN and in the course of litigation before the FTT) extends beyond the substantive exemptions in Part II of FOIA – as provided for in Birkett v DEFRA [2011] EWCA Civ 1606 - to the procedural exemptions of sections 12 and 14.

The question is made all the more enthralling by a conflict of case law, which those who attended our Information Law Conference in 2013 and who weren’t snoozing during my paper will recall. Independent Police Complaints Commission v Information Commissioner [2012] 1 Info LR 427 had held that there could be late reliance on section 12. The Upper Tribunal in All Party Parliamentary Group on Extraordinary Rendition v Information Commissioner & Ministry of Defence [2011] UKUT 153 (AAC); [2011] 2 Info LR 75 expressed the clear, if obiter, view that section 12 was not in the same position as substantive FOIA Part II exemptions because it had a different purpose; section 12 is not about the nature of the information but the effect on the public authority of having to deal with the request. The scheme of FOIA was likely to be distorted, the Upper Tribunal held, if the authority could suddenly rely on section 12 after already having carried out the search and engaged with the requestor: at [45]-[47]. The APPGER approach was accepted by the FTT in Sittampalam v Information Commissioner & BBC [2011] 2 Info LR 195. There was at least a school of thought that the APPGER logic ought also to apply to section 14 (which, as was explained in Dransfield, is not properly an exemption at all: at [10]-[11]).

In Department for Education v Information Commissioner & McInerney (EA/2013/0270) GRC Chamber President Judge Warren considered the late reliance by the DfE on sections 12 and 14, and upheld the DfE’s appeal under section 14. In an appendix, he dismissed the suggestion of the ICO that APPGER meant that section 14 could not be relied upon late. In rather brief reasoning, he considered that if section 17 did not bar late reliance on Part II exemptions (as it was clear that it did not following Birkett), there was no linguistic reason to apply the same approach to Part I exemptions. Sections 12 and 14 could therefore be relied upon late, as a matter of right.

So that is that. Except of course, that there is now a real conflict of authority at FTT level, and with conflicting dicta at UT level too (APPGER having doubted Information Commissioner v Home Office [2011] UKUT 17 (AAC) on this point). Maybe someone would like to take the point on appeal and have it properly determined.

Andrew Sharland was for the DfE and Robin Hopkins was for the ICO.

Christopher Knight

Academies and FOI

July 16th, 2014 by Robin Hopkins

The question of whether information is ‘held’ by a public authority for FOIA or EIR purposes can raise difficulties. This is especially so where the boundaries between public and private service provision are blurred: consider outsourcing, privatisation of services, public/private partnerships, joint ventures, the use of external consultants and so on. Legal separation and practical day-to-day realities can often point in different directions in terms of who holds information on whose behalf.

Geraldine Hackett v IC and United Learning Trust (EA/2012/0265) is a recent First-Tier Tribunal decision which addresses such issues – specifically in the context of academy school provision.

The United Church Schools Foundation Limited delivers schools through two separate trusts: the United Church Schools Trust (which runs 11 private schools) and the United Learning Trust (which runs 20 academies, and receives approximately £110k of its £129k of annual income from public funds).

Para 52A Schedule 1 FOIA brings within the scope of FOIA “the proprietor of an academy” but only in respect of “information held for the purposes of the proprietor’s functions under academy arrangements.”

Geraldine Hackett asked for information about the employment package of ULT’s chief executive (pay, pension contribution, expenses etc) and of the other members of the ULT senior management team.

ULT said it did not hold the information; the information was instead held by UCST (the private school provider). The ICO agreed. So did the First-Tier Tribunal, but this was overturned by the Upper Tribunal on account of aspects of procedural fairness which had gone badly awry at first instance.

On reconsideration by a fresh First-Tier Tribunal, the ICO’s decision was overturned. The Tribunal asked itself the questions which the Upper Tribunal had invited for consideration:

“Was it really the case that ULT had delegated day-to-day running of its charitable activities to a chief executive of whose duties under his contract of employment, ULT was ignorant? Was it permissible to avoid FOIA by the device of a contract of employment made by another body?”

It applied the leading case of University of Newcastle upon Tyne v ICO and BUAV [2011] UKUT 185 (AAC) and concluded that ULT did hold the requested information for FOIA purposes. This meant that “ULT would fulfil its obligations under FOIA by disclosing not the total sums involved but that proportion, calculated in accordance with the agreement, which relates to the academies; in other words excluding that proportion which can be attributed to UCST’s private schools.”

The Tribunal noted that “in 2006 both trusts entered into an agreement with each other to apportion the expenditure on shared services” and observed that “it appeared to us from the oral and written evidence that staff work together seamlessly for all three trusts”.

Those who grapple with held/not held questions in contexts like this will wish to note the key paragraph (19) illuminating the Tribunal’s reasoning:

“We were told at the hearing, and we accept, that the disputed information is held in hard copy in one of the filing cabinets at the United Learning Head Office. Those with access to it work seamlessly, we have found, for all three trusts. They have responsibilities to all three trusts. For these purposes, we are not attracted by artificial theories suggesting that staff hold these documents only on behalf of one or two of the trusts. Looking at actualities, and applying the plain words of the statute, in our judgment the disputed information is held by ULT, even if it is also held by UCST and UCSF. This finding is consistent with the obligations of the ULT accounting officer in respect of senior officers’ payroll arrangements…”

Robin Hopkins @hopkinsrobin

In the wake of Google Spain: freedom of expression down (but not out)

July 15th, 2014 by Robin Hopkins

The CJEU’s judgment in Google Spain was wrong and has created an awful mess.

That was the near-unanimous verdict of a panel of experts – including 11KBW’s Anya Proops – at a debate hosted by ITN and the Media Society on Monday 14 July and entitled ‘Rewriting History: Is the new era in Data Protection compatible with journalism?’.

The most sanguine participant was the Information Commissioner, Christopher Graham. He cautioned against a ‘Chicken Licken’ (the sky is falling in) alarmism – we should wait and see how the right to be forgotten (RTBF) pans out in practice. He was at pains to reassure the media that its privileged status in data protection law was not in fact under threat: the s. 32 DPA exemption, for example, was here to stay. There remains space, Google Spain notwithstanding, to refuse inappropriate RTBF requests, he suggested – at least as concerns journalism which is in the public interest (a characteristic which is difficult to pin down both in principle and in practice).

‘I am Chicken Licken!’, was the much less sanguine stance of John Battle, ITN’s Head of Compliance. Google Spain is a serious intrusion into media freedom, he argued. This was echoed by The Telegraph’s Holly Watt, who likened the RTBF regime to book-burning.

Peter Barron, Google’s Director of Communications and Public Affairs for Europe, Africa and the Middle East, argued that in implementing its fledgling RTBF procedure, Google was simply doing as told: it had not welcomed the Google Spain judgment, but that judgment is now the law, and implementing it was costly and burdensome. On the latter point, Chris Graham seemed less than entirely sympathetic, pointing out that Google’s business model is based heavily on processing other people’s personal data.

John Whittingdale MP, Chairman of the Culture, Media & Sport Select Committee, was markedly Eurosceptic in tone. Recent data protection judgments from the CJEU have overturned what we in the UK had understood the law to be – he was referring not only to Google Spain, but also to Digital Rights Ireland (on which see my DRIP post from earlier today). The MOJ or Parliament need to intervene and restore sanity, he argued.

Bringing more legal rigour to bear was Anya Proops, who homed in on the major flaws in the Google Spain judgment. Without there having been any democratic debate (and without jurisprudential analysis), the CJEU has set a general rule whereby privacy trumps freedom of expression. This is hugely problematic in principle. It is also impracticable: the RTBF mechanism doesn’t actually work in practice, for example because it leaves Google.com (as opposed to Google.co.uk or another EU domain) untouched – a point also made by Professor Luciano Floridi, Professor of Philosophy and Ethics of Information at the University of Oxford.

There were some probing questions from the audience too. Mark Stephens, for example, asked Chris Graham how he defined ‘journalism’ (answer: ‘if it walks and quacks like a journalist’…) and how he proposed to fund the extra workload which RTBF complaints would bring for the ICO (answer: perhaps a ‘polluter pays’ approach?).

Joshua Rozenberg asked Peter Barron if there was any reason why people should not switch their default browsers to the RTBF-free Google.com (answer: no) and whether Google would consider giving aggrieved journalists rights of appeal within a Google review mechanism (the Google RTBF mechanism is still developing).

ITN is making the video available on its website this week. Those seeking further detail can also search Twitter for the hashtag #rewritinghistory or see Adam Fellows’ blog post.

The general tenor from the panel was clear: Google Spain has dealt a serious and unjustifiable blow to the freedom of expression.

Lastly, one of my favourite comments came from ITN’s John Battle, referring to the rise of data protection as a serious legal force: ‘if we’d held a data protection debate a year ago, we’d have had one man and his dog turn up. Now it pulls in big crowds’. I do not have a dog, but I have been harping on for some time about data protection’s emergence from the shadows to bang its fist on the tables of governments, security bodies, big internet companies and society at large. It surely will not be long, however, before the right to freedom of expression mounts a legal comeback, in search of a more principled and workable balance between indispensable components of a just society.

Robin Hopkins @hopkinsrobin

Surveillance powers to be kept alive via DRIP

July 15th, 2014 by Robin Hopkins

The legal framework underpinning state surveillance of individuals’ private communications is in turmoil, and it is not all Edward Snowden’s fault. As I write this post, two hugely important developments are afoot.

Prism/Tempora

The first is the challenge by Privacy International and others to the Prism/Tempora surveillance programmes implemented by GCHQ and the security agencies. Today is day 2 of the 5-day hearing before the Investigatory Powers Tribunal. To a large extent, this turmoil was unleashed by Snowden.

DRIP – the background

The second strand of the turmoil is thanks to Digital Rights Ireland and others, whose challenge to the EU’s Data Retention Directive 2006/24 was upheld by the CJEU in April of this year. That Directive provided for traffic and location data (rather than content-related information) about individuals’ online activity to be retained by communications providers for a period of 6-24 months and made available to policing and security bodies. In the UK, that Directive was implemented via the Data Retention (EC Directive) Regulations 2009, which mandated retention of communications data for 12 months.

In Digital Rights Ireland, the CJEU held the Directive to be invalid on the grounds of incompatibility with the privacy rights enshrined under the EU’s Charter of Fundamental Rights. Strictly speaking, the CJEU’s judgment (on a preliminary ruling) then needed to be applied by the referring courts, but in reality the foundation of the UK’s law fell away with the Digital Rights Ireland judgment. The government has, however, decided that it needs to maintain the status quo in terms of the legal powers and obligations which were rooted in the invalid Directive.

On 10 July 2014, the Home Secretary made a statement announcing that this gap in legal powers was to be plugged on a limited-term basis. A Data Retention and Investigatory Powers (DRIP) Bill would be put before Parliament, together with a draft set of regulations to be made under the envisaged Act. If passed, these would remain in place until the end of 2016, by which time longer-term solutions could be considered. Ms May said this would:

“…ensure, for now at least, that the police and other law enforcement agencies can investigate some of the criminality that is planned and takes place online. Without this legislation, we face the very prospect of losing access to this data overnight, with the consequence that police investigations will suddenly go dark and criminals will escape justice. We cannot allow this to happen.”

Today, amid the ministerial reshuffle and shortly before the summer recess, the Commons is debating DRIP on an emergency basis.

Understandably, there has been much consternation about the extremely limited time allotted for MPs to debate a Bill of such enormous significance for privacy rights (I entitled my post on the Digital Rights Ireland case “Interfering with the fundamental rights of practically the entire European population”, which is a near-verbatim quote from the judgment).

DRIP – the data retention elements

The Bill is short. A very useful summary can be found in the Standard Note from the House of Commons Library (authored by Philippa Ward).

Clause 1 provides power for the Secretary of State to issue a data retention notice on a telecommunications services provider, requiring them to retain certain data types (limited to those set out in the Schedule to the 2009 Regulations) for up to 12 months. There is a safeguard that the Secretary of State must consider whether it is “necessary and proportionate” to give the notice for one or more of the purposes set out in s22(2) of RIPA.

Clause 2 then provides the relevant definitions.

The Draft Regulations explain the process in more detail. Note in particular regulation 5 (the matters the Secretary of State must consider before giving a notice) and regulation 9 (which provides for oversight by the Information Commissioner of the requirements relating to integrity, security and destruction of retained data).

DRIP – the RIPA elements

DRIP is also being used to clarify (says the government) or extend (say some critics) RIPA 2000. In this respect, as commentators such as David Allen Green have pointed out, it is not clear why the emergency legislation route is necessary.

Again, to borrow the nutshells from the House of Commons Library’s Standard Note:

Clause 3 amends s5 of RIPA regarding the Secretary of State’s power to issue interception warrants on the grounds of economic well-being.

Clause 4 aims to clarify the extra-territorial reach of RIPA in relation to both interception and communications data by adding specific provisions. This confirms that requests for interception and communications data to overseas companies that are providing communications services within the UK are subject to the legislation.

Clause 5 clarifies the definition of “telecommunications service” in RIPA to ensure that internet-based services, such as webmail, are included in the definition.

Criticism

The Labour front bench is supporting the Coalition. A number of MPs, including David Davis and Tom Watson, have been vociferous in their opposition (see for example the proposed amendments tabled by Watson and others here). So too have numerous academics and commentators. I won’t try to link to all of them here (as there are too many). Nor can I link to a thorough argument in defence of DRIP (as I have not been able to find one). For present purposes, an excellent forensic analysis comes from Graham Smith at Cyberleagle.

I don’t seek to duplicate that analysis. It is, however, worth remembering this: the crux of the CJEU’s judgment was that the Directive authorised such vast privacy intrusions that stringent safeguards were required to render it proportionate. In broad terms, that proportionality problem can be fixed in two ways: reduce the extent of the privacy intrusions and/or introduce much better safeguards. DRIP does not seek to do the former. The issue is whether it offers sufficient safeguards for achieving an acceptable balance between security and privacy.

MPs will consider that today and Peers later this week. Who knows? – courts may even be asked for their views in due course.

Robin Hopkins @hopkinsrobin

Some results may have been removed under data protection law in Europe. Learn more.

July 3rd, 2014 by Robin Hopkins

This is the message that now regularly greets those using Google to search for information on named individuals. It relates, of course, to the CJEU’s troublesome Google Spain judgment of 13 May 2014.

I certainly wish to learn more.

So I take Google up on its educational offer and click through to its FAQ page, where the folks at Google tell me inter alia that “Since this ruling was published on 13 May 2014, we’ve been working around the clock to comply. This is a complicated process because we need to assess each individual request and balance the rights of the individual to control his or her personal data with the public’s right to know and distribute information”.

The same page also leads me to the form on which I can ask Google to remove from its search results certain URLs about me. I need to fill in gaps like this: “This URL is about me because… This page should not be included as a search result because…” 

This is indeed helpful in terms of process, but I want to understand more about the substance of decision-making. How does (and/or should) Google determine whether or not to accede to my request? Perhaps understandably (as Google remarks, this is a complicated business on which the dust is yet to settle), Google doesn’t tell me much about that just yet.

So I look to the obvious source – the CJEU’s judgment itself – for guidance. Here I learn that I can in principle ask that “inadequate, irrelevant or no longer relevant” information about me not be returned through a Google search. I also get some broad – and quite startling – rules of thumb, for example at paragraph 81, which tells me this:

“In the light of the potential seriousness of that interference, it is clear that it cannot be justified by merely the economic interest which the operator of such an engine has in that processing. However, inasmuch as the removal of links from the list of results could, depending on the information at issue, have effects upon the legitimate interest of internet users potentially interested in having access to that information, in situations such as that at issue in the main proceedings a fair balance should be sought in particular between that interest and the data subject’s fundamental rights under Articles 7 and 8 of the Charter. Whilst it is true that the data subject’s rights protected by those articles also override, as a general rule, that interest of internet users, that balance may however depend, in specific cases, on the nature of the information in question and its sensitivity for the data subject’s private life and on the interest of the public in having that information, an interest which may vary, in particular, according to the role played by the data subject in public life.”

So it seems that, in general (and subject to the sensitivity of the information and my prominence in public life), my privacy rights trump Google’s economic rights and other people’s rights to find information about me in this way. So the CJEU has provided some firm steers on points of principle.

But still I wish to learn more about how these principles will play out in practice. Media reports in recent weeks have told us about the volume of ‘right to be forgotten’ requests received by Google.

The picture this week has moved on from volumes to particulars. In the past few days, we have begun to learn how Google’s decisions filter back to journalists responsible for the content on some of the URLs which objectors pasted into the forms they sent to Google. We learn that journalists and media organisations, for example, are now being sent messages like this:

“Notice of removal from Google Search: we regret to inform you that we are no longer able to show the following pages from your website in response to certain searches on European versions of Google.”

Unsurprisingly, some of those journalists find this puzzling and/or objectionable. Concerns have been ventilated in the last day or two, most notably by the BBC’s Robert Peston (who feels that, through teething problems with the new procedures, he has been ‘cast into oblivion’) and The Guardian’s James Ball (who neatly illustrates some of the oddities of the new regime). See also The Washington Post’s roundup of UK media coverage.

That coverage suggests that the Google Spain ruling – which made no overt mention of free expression rights under Article 10 ECHR – has started to bite into the media’s freedom. The Guardian’s Chris Moran, however, has today posted an invaluable piece clarifying some misconceptions about the right to be forgotten. Academic commentators such as Paul Bernal have also offered shrewd insights into the fallout from Google Spain.

So, by following the trail from Google’s pithy new message, I am able to learn a fair amount about the tenor of this post-Google Spain world.

Inevitably, however, given my line of work, I am interested in the harder edges of enforcement and litigation: in particular, if someone objects to the outcome of a ‘please forget me’ request to Google, what exactly can they do about it?

On such questions, it is too early to tell. Google says on its FAQ page that “we look forward to working closely with data protection authorities and others over the coming months as we refine our approach”. For its part, the ICO tells us that it and its EU counterparts are working hard on figuring this out. Its newsletter from today says for example that:

“The ICO and its European counterparts on the Article 29 Working Party are working on guidelines to help data protection authorities respond to complaints about the removal of personal information from search engine results… The recommendations aim to ensure a consistent approach by European data protection authorities in response to complaints when takedown requests are refused by the search engine provider.”

So for the moment, there remain lots of unanswered questions. For example, the tone of the CJEU’s judgment is that DPA rights will generally defeat economic rights and the public’s information rights. But what about a contest between two individuals’ DPA rights?

Suppose, for example, that I am an investigative journalist with substantial reputational and career investment in articles about a particular individual who then persuades Google to ensure that my articles do not surface in EU Google searches for his name? Those articles also contain my name, work and opinions, i.e. they also contain my personal data. In acceding to the ‘please forget me’ request without seeking my input, could Google be said to have processed my personal data unfairly, whittling away my online personal and professional output (at least to the extent that the relevant EU Google searches are curtailed)? Could this be said to cause me damage or distress? If so, can I plausibly issue a notice under s. 10 of the DPA, seek damages under s. 13, or ask the ICO to take enforcement action under s. 40?

The same questions could arise, for example, if my personal backstory is heavily entwined with that of another person who persuades Google to remove from its EU search results articles discussing both of us – that may be beneficial for the requester, but detrimental to me in terms of the adequacy of personal data about me which Google makes available to the interested searcher.

So: some results may have been removed under data protection law in Europe, and I do indeed wish to learn more. But I will have to wait.

Robin Hopkins @hopkinsrobin

GCHQ’s internet surveillance – privacy and free expression join forces

July 3rd, 2014 by Robin Hopkins

A year ago, I blogged about Privacy International’s legal challenge – alongside Liberty – against GCHQ, the Security Services and others concerning the Prism/Tempora programmes which came to public attention following Edward Snowden’s whistleblowing. That case is now before the Investigatory Powers Tribunal. It will be heard for 5 days, commencing on 14 July.

Privacy International has also brought a second claim against GCHQ: in May 2014, it issued proceedings concerning the use of ‘hacking’ tools and software by intelligence services.

It has been announced this week that Privacy International is party to a third challenge which has been filed with the Investigatory Powers Tribunal. This time, the claim is being brought alongside 7 internet service providers: GreenNet (UK), Chaos Computer Club (Germany), GreenHost (Netherlands), Jimbonet (Korea), Mango (Zimbabwe), May First/People Link (US) and Riseup (US).

The claim is interesting on a number of fronts. One is the interplay between global reach (see the diversity of the claimants’ homes) and this specific legal jurisdiction (the target is GCHQ and the jurisdiction is the UK – as opposed, for example, to bringing claims in the US). Another is that it sees private companies – and therefore Article 1 Protocol 1 ECHR issues about property, business goodwill and the like – surfacing in the UK’s internet surveillance debate.

Also, the privacy rights not only of ‘ordinary’ citizens (network users) but also specifically those of the claimants’ employees are being raised.

Finally, this claim sees the right to free expression under Article 10 ECHR – conspicuously absent, for example, in the Google Spain judgment – flexing its muscle in the surveillance context. Privacy and free expression rights are so often in tension, but here they make common cause.

The claims are as follows (quoting from the claimants’ press releases):

(1) By interfering with network assets and computers belonging to the network providers, GCHQ has contravened the UK Computer Misuse Act and Article 1 of the First Additional Protocol (A1AP) of the European Convention on Human Rights (ECHR), which guarantees the individual’s peaceful enjoyment of their possessions

(2) Conducting surveillance of the network providers’ employees is in contravention of Article 8 ECHR (the right to privacy) and Article 10 ECHR (freedom of expression)

(3) Surveillance of the network providers’ users that is made possible by exploitation of their internet infrastructure, is in contravention of Arts. 8 and 10 ECHR; and

(4) By diluting the network providers’ goodwill and relationship with their users, GCHQ has contravened A1AP ECHR.

Robin Hopkins @hopkinsrobin

More on Spamalot

June 19th, 2014 by Anya Proops

Following on from my post earlier today on Niebel, readers may like to note that Jon Baines’s excellent blog, Information Rights and Wrongs, has an interesting and detailed analysis of the Mansfield v John Lewis case – see here. The article suggests that Mr Mansfield’s damages may have garnered him the princely sum of £10!

Criminal records scheme incompatible with Convention rights – Supreme Court judgment

June 19th, 2014 by Anya Proops

As readers of this blog will know, the application of the Government’s criminal records scheme has been subject to extensive litigation of late (see further not least my post on an appeal involving a teacher and my post on an appeal involving a taxi-driver). Perhaps most importantly, in the case of T & Anor v Secretary of State for the Home Department, questions have been raised about whether the scheme as a whole is compatible with Convention rights and, in particular, the Article 8 right to privacy. Last year, the Court of Appeal concluded that the scheme was incompatible (see further Christopher Knight’s analysis of the Court’s judgment here). In a judgment given yesterday, the majority of the Supreme Court has agreed with that conclusion (Lord Wilson dissenting). The judgment will no doubt be subject to further analysis on Panopticon over the next few days. However, in short, the Supreme Court held that:

(a) warnings and cautions given to the appellants by the police engaged their Article 8 right to privacy;

(b) the disclosure of those warnings and cautions in enhanced criminal records certificates (ECRCs) issued under the scheme amounted to an interference with the appellants’ right to privacy, particularly as it affected their ability to enter a particular chosen field of endeavour, for example their ability to secure particular jobs; and

(c) the interference could not be justified under Article 8(2), particularly because the indiscriminate manner in which such information was provided under the scheme was not ‘in accordance with law’ for the purposes of Article 8(2), was not ‘necessary in a democratic society’ and was not otherwise proportionate.

On the latter point, the majority of the Supreme Court was clearly concerned about the fact that, in the context of ECRCs, warnings and cautions could be included in the relevant certificate irrespective of the nature of the offence, how the case had been disposed of, the time which had elapsed since the offence took place, the relevance of the data to the employment sought and the absence of any mechanism for independent review of a decision to disclose data. The majority of the Supreme Court evidently regarded the case of T as perfectly illustrative of the dangers inherent in such an indiscriminate scheme. In T, an ECRC was issued in respect of T containing information concerning police warnings which T had received when he was 11, in connection with the theft of bicycles. In the majority’s view, it was entirely unnecessary for such information to be disclosed when T applied, aged 17, for a job which involved working with children and also when he applied, aged 19, to attend university. The majority also refused the appeal against the Court of Appeal’s declaration of incompatibility in respect of the relevant primary legislation, namely the Police Act 1997.

What we see with this judgment, as with many judgments concerning the application of Convention rights, is a reluctance to favour blanket, administratively convenient solutions over more nuanced, individual-centred schemes.

11KBW’s Jason Coppel QC acted for the Secretary of State. Tim Pitt-Payne QC appeared on behalf of Liberty.

Anya Proops

Victory for Spamalot – Niebel in the Upper Tribunal

June 19th, 2014 by Anya Proops

The spamming industry is a decidedly irritating but sadly almost unavoidable feature of our networked world. There is no question but that spamming (i.e. the sending of unsolicited direct marketing electronic communications) constitutes an unlawful invasion of our privacy (see further regs 22-23 of the Privacy and Electronic Communications Regulations 2003 (SI 2003/2426) (PECR), implemented under EU Directive 2002/58/EC). The question is what can be done to stop it, particularly given that individual citizens will typically not want to waste their time litigating over the odd spam email or text.

Well, one way to address this problem would be to have an effective penalties regime in place, one that effectively kicked the spammers where it hurts by subjecting them to substantial financial penalties. No surprise then that, in 2009, the EU Directive which prohibits spamming was amended so as to require Member States to ensure that they had in place penalties regimes which were ‘effective, proportionate and dissuasive’ (see Article 15a of the Directive). This provision in turn led to amendments to PECR which resulted in the monetary penalty regime provided for under s. 55A of the Data Protection Act 1998 being effectively incorporated into PECR. Readers of this blog will be aware of recent litigation over the application of s. 55A in the context of cases involving breaches of the DPA (see further the current leading case on this issue, Central London Community Healthcare NHS Trust v Information Commissioner [2014] 1 Info LR 51, which you can read about here). But is the DPA monetary penalty regime really fit for purpose when it comes to dealing with spamming activities which are prohibited by PECR? If the recent decision by the Upper Tribunal in the case of Information Commissioner v Niebel is anything to go by, the answer to that question must be a resounding no.

The background to the Niebel case is as follows. Mr Niebel had sent out unsolicited text messages on an industrial scale. The texts sought out potential claimants in respect of misselling of PPI loans. The Information Commissioner, who had received hundreds of complaints about the texts, went on to issue Mr Niebel with a monetary penalty of £300,000. So far, so unsurprising, you might say. However, Mr Niebel has since managed to persuade the First-Tier Tribunal (FTT) to quash the penalty in its entirety (see its decision here) and now the Upper Tribunal (UT) has decided that the penalty should be left firmly quashed (see the UT’s decision here).

So how has Mr Niebel been able to avoid any penalty despite the patently unlawful nature of his activities? To answer that question one first has to understand the ostensibly high threshold which must be cleared if the power to impose a penalty is to be engaged. In short, the legislation only permits a penalty to be issued if there is ‘a serious contravention’ of the legislation (s. 55A(1)(a)) and that contravention was ‘of a kind likely to cause substantial damage or substantial distress’ (s. 55A(1)(b)). There is also a knowledge requirement (s. 55A(1)(c)); however, that requirement will typically be made out in the case of unlawful spammers. But can it really be said that the sending of relatively anodyne spam text messages is ‘of a kind likely to cause recipients substantial damage or substantial distress’? Both the FTT and the UT have now firmly answered this question in the negative.

In the course of its decision, the UT considered the following arguments advanced by the Commissioner.

- First, when deciding whether the contravention was ‘of a kind’ likely to cause substantial damage or substantial distress, it was possible to take into account not only the scale of the particular texts in issue but also the scale of Mr Niebel’s overall spamming operation. This was an important argument in the context of the appeal because, whilst there was no doubt that over time Mr Niebel had sent out hundreds of thousands of unsolicited communications, the Commissioner had identified ‘the contravention’ as relating only to 286 text messages in respect of which he had received complaints. (He had accepted that some 125 other complaints could not be taken into account as they related to communications sent prior to the coming into force of the penalties regime.) The issue was therefore whether the wider context could be taken into account when deciding whether the contravention was ‘of a kind’ likely to cause substantial damage or substantial distress.

- Second, the word ‘substantial’ in this context must be construed as meaning merely that the damage or distress was more than trivial. This is because the penalties regime was plainly intended to bite on unlawful spammers who caused low level damage or mere irritation, and such individuals would not be caught by the legislation if the word ‘substantial’ was construed as carrying any greater weight.

- Third, the FTT had otherwise erred when it concluded that the 286 texts in issue were not of a kind likely to cause substantial damage or substantial distress.

On the first argument, the UT accepted that the scale of the contravention could be taken into account when deciding whether it was of a kind likely to cause substantial damage or substantial distress. However, it rejected the argument that Mr Niebel’s wider spamming activities were relevant to the analysis. The UT concluded that these activities did not form part of the ‘contravention’ relied upon by the Commissioner and were not therefore relevant when it came to deciding whether s. 55A was engaged (para. 38).

On the second argument, the UT accepted Mr Niebel’s argument that it was not appropriate to try and deconstruct the meaning of the word ‘substantial’ and that the FTT had not erred when it had concluded simply that the question whether the substantial element was made out was ‘ultimately a question of fact and degree’ (paras. 42-51).

On the third argument, the UT held that the FTT’s decision that the 286 texts in issue were not of a kind likely to cause substantial damage was ‘simply unassailable’. The FTT had been entitled to conclude that the mere fact that recipients might have felt obliged to send ‘STOP’ messages to Mr Niebel did not amount to ‘substantial damage’ (para. 54). On the question of substantial distress, the FTT had been right to conclude that not all injury to feelings would amount to ‘distress’ and that irritation or frustration was not the same as distress. It concluded that there was nothing in the recent judgments in Halliday v Creation Consumer Finance or Vidal-Hall v Google which required a different result. Moreover, the UT was not prepared to accept that the FTT had failed to take into account evidence before it arguably suggesting that individual complainants were in fact substantially distressed by the messages. In the UT’s view, the FTT had plainly been mindful of this evidence when it reached its conclusions (paras. 67-73).

Perhaps the most telling line in the judgment is to be found in paragraph 65 where the UT, having noted that the Commissioner had probably done all he could to draw Mr Niebel into the cross-hairs of the legislation, went on to conclude that the most profitable course would be for ‘the statutory test to be revisited with a view to making it better fit the objectives of the 2002 Directive (as amended)’. So, for example, a statutory test that was formulated in terms of annoyance, inconvenience and/or irritation, rather than ‘substantial damage or substantial distress’, might well have resulted in a different outcome. What cannot be doubted is that, absent a successful appeal against the UT’s decision, this legislation will need to be revisited so as to avoid a situation where the spammers end up laughing all the way to the bank whilst the penalties regime descends into obsolescence.

However, I should add that the picture is not altogether rosy for the spammers of this world. According to recent media reports, John Lewis has recently had to pay out damages to Roddy Mansfield, a Sky News producer, after it sent him an unsolicited marketing email (see the Sky News report of the matter here – the report does not confirm the quantum of the damages). This rather raises the question of whether, in the face of an apparently deficient monetary penalty regime, the best cure for the disease of unlawful spamming might be to mount a group action.

The Niebel case was another 11KBW affair, with Robin Hopkins acting for Mr Niebel and James Cornwell acting for the ICO.

Anya Proops