Loss of personal data: £20k award upheld on appeal

September 16th, 2014 by Robin Hopkins

If you breach your legal duties as regards personal data in your control, what might you expect to pay by way of compensation to the affected individual? The received wisdom has tended to be something along these lines. First, has the individual suffered any financial loss? If not, they are not entitled to a penny under s. 13 DPA. Second, even if they get across that hurdle, how much should they get for distress? Generally, not very much – reported awards have tended to be very low (in the low thousands at most).

All of that is very comforting for data controllers who run into difficulties.

That picture is, however, increasingly questionable. “Damage” (the precondition for any award, under s. 13 DPA) could mean something other than “financial loss” – other sorts of damage (even a nominal sort of damage) can, it seems, serve as the trigger. Also, provided the evidence is sufficiently persuasive, it seems that awards – whether under the DPA or at common law (negligence) – could actually be substantial.

These trends are evident in the judgment of the Court of Appeal of Northern Ireland in CR19 v Chief Constable of the Police Service of Northern Ireland [2014] NICA 54.

The appellant, referred to as CR19, was a police officer with the Royal Ulster Constabulary. Due to his exposure to some serious terrorist incidents, he developed Post-Traumatic Stress Disorder (PTSD); he also developed a habit of excessive alcohol consumption. He left the Constabulary in 2001. In 2002, there was a burglary at Castlereagh police station, apparently carried out on behalf of a terrorist organisation. Data and records on officers including CR19 were stolen.

The Constabulary admitted both negligence and a breach of the seventh data protection principle (failure to take appropriate technical and organisational measures). The issue at trial was the amount of compensation to which CR19 was entitled.

Note the losses for which CR19 sought compensation: he claimed that, as a result of the stress caused by the data loss incident, his PTSD and alcohol problems worsened, he lost out on an employment opportunity, and his house was devalued as a result of threats to the property and the package of security measures implemented for his protection.

The trial judge heard evidence from a number of parties, including medical experts on both sides. He found some aspects of CR19’s evidence unsatisfactory. Overall, however, he awarded CR19 £20,000 (plus interest) for the Constabulary’s negligence. He did not expressly deal with any award under s. 13 of the DPA.

CR19 appealed, saying the award was too low. His appeal was largely dismissed: the trial judge had been entitled to reach his conclusions on the evidence before him.

Further, the s. 13 DPA claim added nothing to the quantum. The Court of Appeal considered the cases of Halliday (a £750 award) and AB (£2,250) (both reported on Panopticon) and concluded as follows (para. 24):

“In this case we have earlier recorded that three eminent psychiatrists gave professional evidence as to the distress sustained by CR19 as a consequence of the break-in. While accepting that the breach and its consequences in this case are of a different order to the matters considered in Halliday or AB, we conclude that the damages for distress arising from the breach of the Data Protection Act must be considered to be subsumed into the judge’s award which, while rejected as too low by the appellant, was by no means an insignificant award. The assessment took account of the distress engendered by the breach of data protection. We cannot conceive of any additional evidence that might be relevant to any additional damages for distress in respect of breach of section 4. Accordingly, we affirm the award of compensation made by the learned trial judge. However, in view of Arden LJ’s reasoning in Halliday, we conclude that the appellant must in addition be entitled to nominal damages of £1.00 to reflect the fact that there was an admitted breach of section 4 of the Data Protection Act.”

Whilst it is not strictly correct to read the CR19 judgment as affirming a DPA award for £20,000 (that award was for negligence), the judgment is nonetheless interesting from a DPA perspective in a number of respects, including these:

(i) While it was conceded in Halliday that nominal damage suffices as “damage” for s. 13(1) purposes, that conclusion looks like it is being applied more widely.

(ii) One problem in Halliday (and to an extent also in AB) was the lack of cogent evidence supporting the alleged damage. The CR19 case illustrates how evidence, including expert medical evidence, can be deployed to effect in data breach cases (whether based on negligence or on the DPA).

(iii) Unlawful acts with respect to individuals’ personal information can, it seems, lead one way or another to a substantial award. The DPA may aim to offer relatively modest awards (so said the Court of Appeal in Halliday), but serious misuse or loss of personal data can nonetheless be very damaging, and the law will recognise and compensate for this where appropriate.

Robin Hopkins @hopkinsrobin

Facebook, FOI and children

August 6th, 2014 by Robin Hopkins

The Upper Tribunal has got its teeth into personal data disputes on a number of occasions in recent months – Edem was followed by Farrand, and now Surrey Heath Borough Council v IC and Morley [2014] UKUT 0330 (AAC) (the Morley UT decision). Panopticon reported on the first-instance Morley decision in 2012. In brief: Mr Morley asked for information about members of the local authority’s Youth Council who had provided input into a planning application. The local authority withheld the names of the Youth Councillors (who were minors) under s. 40(2) of FOIA (personal data). In a majority decision, the First-Tier Tribunal ordered that some of those names be disclosed, principally on the ground that they appeared to feature on the Youth Council’s (closed) Facebook page.

The local authority and the ICO challenged that decision. The Upper Tribunal (Judge Jacobs) has agreed with them. He found the dissenting opinion of the First-Tier Tribunal member to have been the more sophisticated (as opposed to the overly generalised analysis of the majority) and ultimately correct. The Youth Councillors’ names were correctly withheld.

In his analysis of the First Data Protection Principle, Judge Jacobs was not much bothered by whether fairness or condition 6(1) (the relevant Schedule 2 condition) should be considered first: “the latter is but a specific instance of the former”.

Judge Jacobs found that there was no sufficient interest in the disclosure of the names of the Youth Councillors. He also rejected the argument that, by putting their names on the relevant Facebook page, the data subjects had implicitly consented to public disclosure of their identities in response to such a FOIA request.

Judge Jacobs stopped short, however, of finding that the personal data of minors should never be disclosed under FOIA, i.e. that the (privacy) interests of children would always take precedence over transparency. Maturity and autonomy matter more than mere age in this context, and sometimes (as here) minors are afforded substantial scope to make their own decisions.

Morley is an important case on the intersection between children’s personal data and transparency, particularly in the social media context, but – as Judge Jacobs himself observed – “it is by no means the last word on the subject”.

There were 11KBW appearances by Joseph Barrett (for the local authority) and Heather Emmerson (for the ICO).

Robin Hopkins @hopkinsrobin

In the wake of Google Spain: freedom of expression down (but not out)

July 15th, 2014 by Robin Hopkins

The CJEU’s judgment in Google Spain was wrong and has created an awful mess.

That was the near-unanimous verdict of a panel of experts – including 11KBW’s Anya Proops – at a debate hosted by ITN and the Media Society on Monday 14 July and entitled ‘Rewriting History: Is the new era in Data Protection compatible with journalism?’.

The most sanguine participant was the Information Commissioner, Christopher Graham. He cautioned against ‘Chicken Licken’ (the sky is falling in) alarmism – we should wait and see how the right to be forgotten (RTBF) pans out in practice. He was at pains to reassure the media that its privileged status in data protection law was not in fact under threat: the s. 32 DPA exemption, for example, was here to stay. There remains space, Google Spain notwithstanding, to refuse inappropriate RTBF requests, he suggested – at least as concerns journalism which is in the public interest (a characteristic which is difficult to pin down both in principle and in practice).

‘I am Chicken Licken!’, was the much less sanguine stance of John Battle, ITN’s Head of Compliance. Google Spain is a serious intrusion into media freedom, he argued. This was echoed by The Telegraph’s Holly Watt, who likened the RTBF regime to book-burning.

Peter Barron, Google’s Director of Communications and Public Affairs for Europe, Africa and the Middle East, argued that in implementing its fledgling RTBF procedure, Google was simply doing as told: it had not welcomed the Google Spain judgment, but that judgment is now the law, and implementing it was costly and burdensome. On the latter point, Chris Graham seemed less than entirely sympathetic, pointing out that Google’s business model is based heavily on processing other people’s personal data.

John Whittingdale MP, Chairman of the Culture, Media & Sport Select Committee, was markedly Eurosceptic in tone. Recent data protection judgments from the CJEU have overturned what we in the UK had understood the law to be – he was referring not only to Google Spain, but also to Digital Rights Ireland (on which see my DRIP post from earlier today). The MOJ or Parliament need to intervene and restore sanity, he argued.

Bringing more legal rigour to bear was Anya Proops, who homed in on the major flaws in the Google Spain judgment. Without there having been any democratic debate (and without jurisprudential analysis), the CJEU has set a general rule whereby privacy trumps freedom of expression. This is hugely problematic in principle. It is also impracticable: the RTBF mechanism doesn’t actually work in practice, for example because it leaves Google.com (as opposed to Google.co.uk or another EU domain) untouched – a point also made by Professor Luciano Floridi, Professor of Philosophy and Ethics of Information at the University of Oxford.

There were some probing questions from the audience too. Mark Stephens, for example, asked Chris Graham how he defined ‘journalism’ (answer: ‘if it walks and quacks like a journalist’…) and how he proposed to fund the extra workload which RTBF complaints would bring for the ICO (answer: perhaps a ‘polluter pays’ approach?).

Joshua Rozenberg asked Peter Barron if there was any reason why people should not switch their default browsers to the RTBF-free Google.com (answer: no) and whether Google would consider giving aggrieved journalists rights of appeal within a Google review mechanism (the Google RTBF mechanism is still developing).

ITN is making the video available on its website this week. Those seeking further detail can also search Twitter for the hashtag #rewritinghistory or see Adam Fellows’ blog post.

The general tenor from the panel was clear: Google Spain has dealt a serious and unjustifiable blow to the freedom of expression.

Lastly, one of my favourite comments came from ITN’s John Battle, referring to the rise of data protection as a serious legal force: ‘if we’d held a data protection debate a year ago, we’d have had one man and his dog turn up. Now it pulls in big crowds’. I do not have a dog, but I have been harping on for some time about data protection’s emergence from the shadows to bang its fist on the tables of governments, security bodies, big internet companies and society at large. It surely will not be long, however, before the right to freedom of expression mounts a legal comeback, in search of a more principled and workable balance between indispensable components of a just society.

Robin Hopkins @hopkinsrobin

Surveillance powers to be kept alive via DRIP

July 15th, 2014 by Robin Hopkins

The legal framework underpinning state surveillance of individuals’ private communications is in turmoil, and it is not all Edward Snowden’s fault. As I write this post, two hugely important developments are afoot.

Prism/Tempora

The first is the challenge by Privacy International and others to the Prism/Tempora surveillance programmes implemented by GCHQ and the security agencies. Today is day 2 of the 5-day hearing before the Investigatory Powers Tribunal. To a large extent, this turmoil was unleashed by Snowden.

DRIP – the background

The second strand of the turmoil is thanks to Digital Rights Ireland and others, whose challenge to the EU’s Data Retention Directive 2006/24 was upheld by the CJEU in April of this year. That Directive provided for traffic and location data (rather than content-related information) about individuals’ online activity to be retained by communications providers for a period of 6-24 months and made available to policing and security bodies. In the UK, that Directive was implemented via the Data Retention (EC Directive) Regulations 2009, which mandated retention of communications data for 12 months.

In Digital Rights Ireland, the CJEU held the Directive to be invalid on the grounds of incompatibility with the privacy rights enshrined under the EU’s Charter of Fundamental Rights. Strictly speaking, the CJEU’s judgment (on a preliminary ruling) then needed to be applied by the referring courts, but in reality the foundation of the UK’s law fell away with the Digital Rights Ireland judgment. The government has, however, decided that it needs to maintain the status quo in terms of the legal powers and obligations which were rooted in the invalid Directive.

On 10 July 2014, the Home Secretary made a statement announcing that this gap in legal powers was to be plugged on a limited-term basis. A Data Retention and Investigatory Powers (DRIP) Bill would be put before Parliament, together with a draft set of regulations to be made under the envisaged Act. If passed, these would remain in place until the end of 2016, by which time longer-term solutions could be considered. Ms May said this would:

“…ensure, for now at least, that the police and other law enforcement agencies can investigate some of the criminality that is planned and takes place online. Without this legislation, we face the very prospect of losing access to this data overnight, with the consequence that police investigations will suddenly go dark and criminals will escape justice. We cannot allow this to happen.”

Today, amid the ministerial reshuffle and shortly before the summer recess, the Commons is debating DRIP on an emergency basis.

Understandably, there has been much consternation about the extremely limited time allotted for MPs to debate a Bill of such enormous significance for privacy rights (I entitled my post on the Digital Rights Ireland case “Interfering with the fundamental rights of practically the entire European population”, which is a near-verbatim quote from the judgment).

DRIP – the data retention elements

The Bill is short. A very useful summary can be found in the Standard Note from the House of Commons Library (authored by Philippa Ward).

Clause 1 provides power for the Secretary of State to serve a data retention notice on a telecommunications services provider, requiring it to retain certain data types (limited to those set out in the Schedule to the 2009 Regulations) for up to 12 months. There is a safeguard that the Secretary of State must consider whether it is “necessary and proportionate” to give the notice for one or more of the purposes set out in s22(2) of RIPA.

Clause 2 then provides the relevant definitions.

The Draft Regulations explain the process in more detail. Note in particular regulation 5 (the matters the Secretary of State must consider before giving a notice) and regulation 9 (which provides for oversight by the Information Commissioner of the requirements relating to integrity, security and destruction of retained data).

DRIP – the RIPA elements

DRIP is also being used to clarify (says the government) or extend (say some critics) RIPA 2000. In this respect, as commentators such as David Allen Green have pointed out, it is not clear why the emergency legislation route is necessary.

Again, to borrow the nutshells from the House of Commons Library’s Standard Note:

Clause 3 amends s5 of RIPA regarding the Secretary of State’s power to issue interception warrants on the grounds of economic well-being.

Clause 4 aims to clarify the extra-territorial reach of RIPA in relation to both interception and communications data by adding specific provisions. This confirms that requests for interception and communications data to overseas companies that are providing communications services within the UK are subject to the legislation.

Clause 5 clarifies the definition of “telecommunications service” in RIPA to ensure that internet-based services, such as webmail, are included in the definition.

Criticism

The Labour front bench is supporting the Coalition. A number of MPs, including David Davis and Tom Watson, have been vociferous in their opposition (see for example the proposed amendments tabled by Watson and others here). So too have numerous academics and commentators. I won’t try to link to all of them here (as there are too many). Nor can I link to a thorough argument in defence of DRIP (as I have not been able to find one). For present purposes, an excellent forensic analysis comes from Graham Smith at Cyberleagle.

I don’t seek to duplicate that analysis. It is, however, worth remembering this: the crux of the CJEU’s judgment was that the Directive authorised such vast privacy intrusions that stringent safeguards were required to render it proportionate. In broad terms, that proportionality problem can be fixed in two ways: reduce the extent of the privacy intrusions and/or introduce much better safeguards. DRIP does not seek to do the former. The issue is whether it offers sufficient safeguards for achieving an acceptable balance between security and privacy.

MPs will consider that today and Peers later this week. Who knows? – courts may even be asked for their views in due course.

Robin Hopkins @hopkinsrobin

Fairness under the DPA: public interests can outweigh those of the data subject

June 18th, 2014 by Robin Hopkins

Suppose a departing employee was the subject of serious allegations which you never had the chance properly to investigate or determine. Should you mention these (unproven) allegations to a future employer? Difficult questions arise, in both ethical and legal terms. One aspect of the legal difficulty arises under data protection law: would it be fair to share that personal information with the prospective employer?

The difficulty is enhanced because fairness – so pivotal to data protection analysis – has received little or no judicial treatment.

This week’s judgment of Mr Justice Cranston in AB v A Chief Constable [2014] EWHC 1965 (QB) is in that sense a rare thing – a judicial analysis of fairness.

AB was a senior police officer – specifically, a chief superintendent. He was given a final written warning in 2009 following a disciplinary investigation. Later, he was subject to further investigation for allegedly seeking to influence the police force’s appointment process in favour of an acquaintance of AB; this raised a number of serious questions, including about potential dishonesty, lack of integrity, and so on.

AB was on sick leave (including for reasons related to psychological health) for much of the period when that second investigation was unfolding. He was unhappy with how the Force was treating him. He got an alternative job offer from a regulator. He then resigned from the Force before the hearing concerning his alleged disciplinary offences. His resignation was accepted. The Force provided him with a standard reference, but the Chief Constable then took the view that – given the particular, unusual circumstances – he should provide the prospective employer with a second reference, explaining the allegations about AB.

The second reference was to say inter alia that:

“[AB’s] resignation letter pre-dated by some 13 days a gross misconduct hearing at which he was due to appear to face allegations of (i) lack of honesty and integrity (ii) discreditable conduct and (iii) abuse of authority in relation to a recruitment issue. It is right to record that he strenuously denied those allegations. In the light of his resignation the misconduct hearing has been stayed as it is not in the public interest to incur the cost of a hearing when the officer concerned has already resigned, albeit his final date of service post-dating the hearing.”

AB objected to the giving of the second reference and issued a section 10 notice under the Data Protection Act 1998. The lawfulness of the Force’s proposed second reference arose for consideration by Cranston J.

The first issue was this: was the Chief Constable legally obliged to provide a second reference explaining those concerns?

Cranston J held that, in terms of the common/private law duty of care (on the Hedley Byrne line of authority), the answer was no. As a matter of public law, however – and specifically by reference to the Police Conduct Regulations – the answer was yes: “the Chief Constable was obliged by his duty to act with honesty and integrity not to give a standard reference for the recipient because that was misleading. Something more was demanded. In this case the Chief Constable was prima facie under a duty to supply the Regulatory Body at the least with the information about disciplinary matters in the second reference.”

Note the qualifier ‘prima facie’: the upshot was that the duty was displaced if the provision of the second reference would breach the DPA. This raised a number of issues for the Court.

First, no information about AB’s health could be imparted: this was sensitive personal data, and the Chief Constable did not assert that a Schedule 3 DPA condition was met (as required under the First Data Protection Principle).

What about the information as to the disciplinary allegations AB faced? This was not sensitive personal data. Therefore, under the First Data Protection Principle, it could be disclosed if to do so would be (a) fair, (b) lawful, and (c) in accordance with a Schedule 2 condition.

The last two were unproblematic: given the prima facie public law duty to make the second reference here, it would be lawful to do so and condition 3 from Schedule 2 would be met.

This left ‘fairness’, which Cranston J discussed in the following terms:

“There is no definition of fairness in the 1998 Act. The Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995, to which the 1998 Act gives effect, contains a reference to protecting privacy rights, as recognised in article 8 of the European Convention on Human Rights and in general principles of EU law: recital 10. However, I cannot accept Mr Lock QC’s submission that the duty of fairness under the Directive and the 1998 Act is a duty to be fair primarily to the data subject. The rights to private and family life in Article 8 are subject to the countervailing public interests set out in Article 8(2). So it is here: assessing fairness involves a balancing of the interests of the data subject in non-disclosure against the public interest in disclosure.”

In conducting this balance between the interests of AB and those of others (including the public interests), Cranston J ultimately – on the particular facts – concluded that it would have been unfair to provide the second reference. There were strong fairness arguments in favour of disclosure – see paragraph 78 (my emphasis):

“… The focus must be on fairness in the immediate decision to disclose the data [as opposed to a wider-ranging inquiry into the data subject’s conduct in the build-up to disclosure]. In this case the factors making it fair to disclose the information were the public interest in full and frank references, especially the duty of the police service properly to inform other police forces and other regulatory bodies of the person they are seeking to employ. To disclose the information in the second reference would patently have been fair to the Regulatory Body, so it could make a rounded assessment of the claimant, especially given his non-disclosure during the application process.”

However, the balance tipped in AB’s favour. This was partly because the Force’s policy – as well as the undertaking specifically given to AB – was to provide only a standard reference. But (see paragraph 79):

“… what in my view is determinative, and tips the balance of fairness in this case in favour of the claimant, is that he changed his position by resigning from the Force and requesting it to discontinue the disciplinary proceedings, before knowing that the Chief Constable intended to send the second reference. That second reference threatened the job which he had accepted with the Regulatory Body. It is unrealistic to think that the claimant could have taken steps to reverse his resignation in the few weeks before it would take effect. Deputy Chief Constable CD for one had indicated that he would not allow it. The reality was that the claimant was in an invidious position, where in reliance on what the Force through GH had said and done, he was deprived of the opportunity to reinstate the disciplinary proceedings and to fight the allegations against him. This substantive unfairness for the claimant was coupled with the procedural unfairness in the decision to send the second reference without giving him the opportunity to make representations against that course of action. Asking him to comment on its terms after the final decision to send the second reference was too little, too late.”

Therefore, because of unfairness in breach of the DPA and because of AB’s legitimate expectations, the second reference was not lawful.

While Cranston J rightly emphasised the highly fact-specific nature of his overall conclusion, aspects of his discussion of fairness will potentially be of wider application.

So too will his reminder (by way of quoting ICO guidance) that, when it comes to section 10 notices, “Although this [section 10] may give the impression that an individual can simply demand that an organisation stops processing personal data about them, or stops processing it in a particular way, the right is often overstated. In practice, it is much more limited”. Again, in other words, a balancing of interests and an assessment of the justification for the processing is required.

With the ‘right to be forgotten’ very much in vogue, that is a useful point to keep in mind.

Robin Hopkins @hopkinsrobin

Section 13 DPA in the High Court: nominal damage plus four-figure distress award

June 13th, 2014 by Robin Hopkins

Given the paucity of case law, it is notoriously difficult to estimate likely awards of compensation under section 13 of the Data Protection Act 1998 for breaches of that Act. It is also very difficult to assess any trends in compensation awards over time.

AB v MoJ [2014] EWHC 1847 (QB) is the courts’ latest consideration of compensation under the DPA, this time by Mr Justice Jeremy Baker. The factual background involved protracted correspondence concerning numerous subject access requests. Ultimately, it was held that the Defendant had failed to provide certain documents to which the Claimant was entitled under section 7 of the DPA within the time frames set out in that section.

Personal data?

There was a dispute as to whether one particular document contained the Claimant’s ‘personal data’. Baker J noted the arguments from Common Services Agency, and he is not the first to observe (at paragraph 50 of his judgment) that it is sometimes not a ‘straightforward issue’ to determine whether or not information comes within the statutory definition of personal data. Ultimately, he considered that the disputed document did not come within that definition: it “is in wholly neutral terms, and is indeed merely a conduit for the provision of information contained in the letters which it enclosed which certainly did contain the claimant’s personal data”.

Nonetheless, the DPA had been breached by virtue of the delays in the provision of other information to which the Claimant was entitled under section 7. What compensation should he be awarded?

Damage under section 13(1) DPA

Baker J was satisfied, having considered Halliday v Creation Consumer Finance Limited [2013] EWCA Civ 333, [2013] 2 Info LR 85 (where the same point was conceded), that nominal damage sufficed as ‘damage’ for section 13(1) purposes: “In this regard the word “damage” in this sub-section is not qualified in any way, such that to my mind provided that there has, as in this case, been some relevant loss, then an individual who has also suffered relevant distress is entitled to an award of compensation in respect of it”.

Here the Court was satisfied that nominal damages should be awarded. The Claimant had spent a lot of time pursuing his requests, albeit that much of that time also involved pursuing requests on behalf of clients, and albeit that no actual loss had been quantified:

“Essentially the claimant is a professional man who, it is apparent from his witness statement, has expended a considerable amount of time and expense in the pursuit of the disclosure of his and others’ data from various Government Departments and other public bodies, including the disclosed and withheld material from the defendant. Having said that, the claimant has not sought to quantify his time and expense, nor has he allocated it between the various requests on his own and others’ behalves. In these circumstances, although I am satisfied that he has suffered damage in accordance with s.13(1) of the DPA 1998, I consider that this is a case in which an award of nominal damages is appropriate under this head, which will be in the conventional sum of £1.00.”

Distress under section 13(2) DPA

That finding opened the door to an award for distress. The Court found that the Claimant had suffered distress, although it was difficult to disentangle the distress attributable to the breaches of the DPA from that caused by the other surrounding circumstances: “doing the best I am able to on the evidence before me I consider that any award of compensation for distress caused as a result of the relevant delays in this case, should be in the sum of £2,250.00”.

Until this week, Halliday was the Courts’ last reported (on Panopticon at any rate) award of compensation under section 13 DPA. That was 14 months ago. In AB, the Court awarded precisely triple that sum for distress.

For a further (and quicker-off-the-mark) discussion of AB, see this post on Jon Baines’ blog, Information Rights and Wrongs.

Robin Hopkins @hopkinsrobin

Privacy, electronic communications and monetary penalties: new Upper Tribunal decision

June 12th, 2014 by Robin Hopkins

Panopticon reported late last year that the First-Tier Tribunal overturned the first monetary penalty notice issued by the Information Commissioner for breaches of the Privacy and Electronic Communications Regulations 2003. This was the decision in Niebel v IC (EA/2012/0260).

The Information Commissioner appealed against that decision. The Upper Tribunal gave its decision on the appeal yesterday: see IC v Niebel (GIA/177/2014). It dismissed the Commissioner’s appeal and upheld the First-Tier Tribunal’s cancellation of the £300,000 penalty imposed for the sending of marketing text messages.

I appeared in this case, as did James Cornwell (also of the Panopticon fold), so I will not be offering an analysis of the case just now. With any luck, one of my colleagues will be cajoled into doing so before too long.

It is worth pointing out simply that this is the first binding decision on the meaning of the various limbs of s. 55A of the DPA 1998, which contains the preconditions for the issuing of a monetary penalty notice.

Robin Hopkins @hopkinsrobin

Google Spain and the CJEU judgment it would probably like to forget.

May 19th, 2014 by Akhlaq Choudhury

In the landmark judgment in Google Spain SL and Google Inc. v Agencia Espanola de Proteccion de Datos, Gonzales (13th May 2014), the CJEU found that Google is a data controller and is engaged in processing personal data within the meaning of Directive 95/46 whenever an internet search about an individual results in the presentation of information about that individual with links to third party websites. The judgment contains several findings which fundamentally affect the approach to data protection in the context of internet searches, and which may have far-reaching implications for search engine operators as well as other websites which collate and present data about individuals.

The case was brought by Mr Costeja Gonzales, who was unhappy that two newspaper reports of a 16-year-old repossession order against him for the recovery of social security debts would come up whenever a Google search was performed against his name. He requested both the newspaper and Google Spain or Google Inc. to remove or conceal the links to the reports on the basis that the matter had long since been resolved and was now entirely irrelevant. The Spanish Data Protection Agency rejected his complaint against the newspaper on the basis that publication was legally justified. However, his complaint against Google was upheld. Google took the matter to court, which made a reference to the CJEU.

The first question for the CJEU was whether Google was a data controller for the purposes of Directive 95/46. Going against the opinion of the Advocate General (see earlier post), the Court held that the collation, retrieval, storage, organisation and disclosure of data undertaken by a search engine when a search is performed amounted to “processing” within the meaning of the Directive; and that as Google determined the purpose and means of that processing, it was indeed the controller. This is so regardless of the fact that such data is already published on the internet and is not altered by Google in any way.

The Court went on to find that the activity of search engines makes it easy for any internet user to obtain a structured overview of the information available about an individual, thereby enabling them to establish a detailed profile of that person involving a vast number of aspects of his private life. This entails a significant interference with the rights to privacy and to data protection, which could not be justified by the economic interests of the search engine operator. In a further remark that will send shockwaves through many commercial operators providing search services, it was said that as a “general rule” the data subject’s rights in this regard will override “not only the economic interest of the operator of the search engine but also the interest of the general public in finding that information upon a search relating to the data subject’s name” (at paras 81 and 97). Exceptions would exist, e.g. for those in public life where “the interference with…fundamental rights is justified by the preponderant interest of the general public in having…access to the information in question”.

However, the Court did not stop there with a mere declaration about interference. Given the serious nature of the interference with privacy and data protection rights, the Court said that search engines like Google could be required by a data subject to remove links to websites containing information about that person, even without requiring simultaneous deletion from those websites.

Furthermore, the CJEU lent support to the “right to be forgotten” by holding that the operator of a search engine could be required to delete links to websites containing a person’s information. The reports about Mr Costeja Gonzales’s financial difficulties in 1998 were no longer relevant having regard to his right to private life and the time that had elapsed, and he had therefore established the right to require Google to remove links to the relevant reports from the list of search results against his name. In so doing, he did not even have to establish that the publication caused him any particular prejudice.

The decision clearly has huge implications, not just for search engine operators like Google, but also other operators providing web-based personal data search services. Expect further posts in coming days considering some of the issues arising from the judgment.

Akhlaq Choudhury

Global Witness and the journalism exemption: ICO to have the first go?

April 30th, 2014 by Robin Hopkins

Panopticon has previously reported on the novel and important data protection case Steinmetz and Others v Global Witness [2014] EWHC 1186 (Ch). The High Court (Henderson J) has now given a judgment on a procedural point which will set the shape for this litigation.

The broad background to the case has been set out in Jason Coppel QC’s previous post – see here. In a nutshell, Global Witness is an NGO which reports and campaigns on natural resource related corruption around the world. Global Witness is one of a number of organisations which have recently reported on allegations that a particular company, BSG Resources Ltd (“BSGR”), secured a major mining concession in Guinea through corrupt means. Global Witness is now facing claims brought under the Data Protection Act 1998 by a number of individuals who are all in some way connected with BSGR. The claims include a subject access claim brought under s. 7; a claim under s. 10 requiring Global Witness to cease processing data in connection with the claimants and BSGR; a claim for rectification under s. 14 and a claim for compensation under s. 13.

For its part, Global Witness relies on the ‘journalism’ exemption under s. 32 of the DPA, which applies to “processing… undertaken with a view to the publication by any person of any journalistic, literary or artistic material”. Global Witness says it is exempt from the provisions of the DPA on which the claimants rely.

An unusual feature of the s. 32 exemption is that it provides, at subsections (4) and (5), for a mandatory stay mechanism which is designed in essence to enable the ICO to assume an important adjudicative role in the proceedings (my emphasis):

(4) Where at any time (“the relevant time”) in any proceedings against a data controller under section 7(9), 10(4), 12(8) or 14 or by virtue of section 13 the data controller claims, or it appears to the court, that any personal data to which the proceedings relate are being processed

(a) only for the special purposes, and

(b) with a view to the publication by any person of any journalistic, literary or artistic material which, at the time twenty-four hours immediately before the relevant time, had not previously been published by the data controller, the court shall stay the proceedings until either of the conditions in subsection (5) is met.

(5) Those conditions are—

(a) that a determination of the Commissioner under section 45 with respect to the data in question takes effect, or

(b) in a case where the proceedings were stayed on the making of a claim, that the claim is withdrawn.

So: if the conditions in s. 32(4) are met, then the court must stay proceedings until either the claim is withdrawn or the ICO has issued a determination under section 45. S. 45 effectively requires the ICO to adjudicate upon the application of the journalism/’special purposes’ exemption to the facts of the particular case. Any determination made under s. 45 can be appealed to the Tribunal: see s. 48(4), which confers a right of appeal on the data controller.

Global Witness has invoked s. 32(4) in its defence and has since applied to the Court for a stay under that provision. The claimants disagree that a stay should be granted. They say Global Witness’ reliance on section 32 is misconceived and have made a cross-application to have the s. 32 defence struck out and for summary judgment in the alternative.

The question for Henderson J was whether those rival applications should be heard together (the claimants’ case), or whether Global Witness’ application for a stay should be determined first (Global Witness’ case). Henderson J has agreed with Global Witness on this point. In reaching the view that the stay application should be heard first, it appears that Henderson J had in mind arguments to the effect that requiring the two applications to be heard together would itself risk pre-empting Global Witness’ stay application and may also result in a more cumbersome and costly process (see in particular paragraphs 16-24). Henderson J went on to make the following observation as to the effect of s. 32(4):

“Subject to argument about the precise nature of a claim sufficient to trigger section 32, Parliament has, in my view, pretty clearly taken the line that issues of this kind should be determined in the first instance by the Commissioner, and any proceedings brought in court should be stayed until that has been done” (paragraph 21).

The stay application will now be heard at the end of June. The matter will then either go off to the ICO or, if the stay application fails, the claimants’ summary judgment/strike-out applications will be considered. The stay application will therefore determine the immediate trajectory of this particular litigation. Whilst the Court declined to order indemnity costs against the claimants, it did award Global Witness close to 100% of its costs.

Anya Proops acts for Global Witness.

Robin Hopkins @hopkinsrobin

Interfering with the fundamental rights of practically the entire European population

April 10th, 2014 by Robin Hopkins

In the Digital Rights Ireland case, the Grand Chamber of the CJEU has this week declared invalid the 2006 Directive which provides for the mass retention – and disclosure to policing and security authorities – of individuals’ online traffic data. It found this regime to be a disproportionate interference with privacy rights. Depending on your perspective, this is a major step forward for digital privacy, or a major step backwards in countering terrorism and serious crime. It probably introduces even more uncertainty in terms of the wider project of data protection reform at the EU level. Here is my synopsis of this week’s Grand Chamber judgment.

Digital privacy vs national security: a brief history

There is an overlapping mesh of rights under European law which aims to protect citizens’ rights with respect to their personal data – an increasingly important strand of the broader right to privacy. The Data Protection Directive (95/46/EC) was passed in 1995, when the internet was in its infancy. It provides that personal data must be processed (obtained, held, used, disclosed) fairly and lawfully, securely, for legitimate purposes and so on.

Then, as the web began to mature into a fundamental aspect of everyday life, a supplementary Directive was passed in 2002 (2002/58/EC) on privacy and electronic communications. It is about privacy, confidentiality and the free movement of electronic personal data in particular.

In the first decade of the 21st century, however, security objectives became increasingly urgent. Following the London bombings of 2005 in particular, the monitoring of would-be criminals’ web activity was felt to be vital to effective counter-terrorism and law enforcement. The digital confidentiality agenda needed to make space for a measure of state surveillance.

This is how Directive 2006/24 came to be. In a nutshell, it provides for traffic and location data (rather than content-related information) about individuals’ online activity to be retained by communications providers and made available to policing and security bodies. This data was to be held for a minimum of six months and a maximum of 24 months.

That Directive – like all others – is however subject to the EU’s Charter of Fundamental Rights. Article 7 of that Charter enshrines the right to respect for one’s private and family life, home and communications. Article 8 is about the right to the protection and fair processing of one’s personal data.

Privacy and Digital Rights Ireland prevail

Digital Rights Ireland took the view that the 2006 Directive was not compatible with those fundamental rights. It asked the Irish Courts to refer this to the CJEU. Similar references were made in separate litigation before the Austrian Courts.

The CJEU gave its answer this week. In Digital Rights Ireland Ltd v Minister for Communications, Marine and Natural Resources and Others (C‑293/12) joined with Kärntner Landesregierung and Others (C‑594/12), the Grand Chamber held the 2006 Directive to be invalid on the grounds of its incompatibility with fundamental privacy rights.

The Grand Chamber accepted that, while privacy rights were interfered with, this was in pursuit of compelling social objectives (the combatting of terrorism and serious crime). The question was one of proportionality. Given that fundamental rights were being interfered with, the Courts would allow the European legislature little leeway: anxious scrutiny would be applied.

Here, in no particular order, are some of the reasons why the 2006 Directive failed its anxious scrutiny test (quotations are all from the Grand Chamber’s judgment). Unsurprisingly, this reads rather like a privacy impact assessment which data controllers are habitually called upon to conduct.

The seriousness of the privacy impact

First, consider the nature of the data which, under Articles 3 and 5 of the 2006 Directive, must be retained and made available. “Those data make it possible, in particular, to know the identity of the person with whom a subscriber or registered user has communicated and by what means, and to identify the time of the communication as well as the place from which that communication took place. They also make it possible to know the frequency of the communications of the subscriber or registered user with certain persons during a given period.”

This makes for a serious incursion into privacy: “Those data, taken as a whole, may allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained, such as the habits of everyday life, permanent or temporary places of residence, daily or other movements, the activities carried out, the social relationships of those persons and the social environments frequented by them.”

Second, consider the volume of data gathered and the number of people affected. Given the ubiquity of internet communications, the 2006 Directive “entails an interference with the fundamental rights of practically the entire European population”.

Admittedly, the 2006 regime does not undermine “the essence” of data protection rights (because it is confined to traffic data – the contents of communications are not retained), and is still subject to data security rules (see the seventh data protection principle under the UK’s DPA 1998).

Nonetheless, this is a serious interference with privacy rights. It has objective and subjective impact: “it is wide-ranging, and it must be considered to be particularly serious… the fact that data are retained and subsequently used without the subscriber or registered user being informed is likely to generate in the minds of the persons concerned the feeling that their private lives are the subject of constant surveillance.”

Such a law, said the Grand Chamber, can only be proportionate if it includes clear and precise laws governing the scope of the measures and providing minimum safeguards for individual rights. The 2006 Directive fell short of those tests.

Inadequate rules, boundaries and safeguards

The regime has no boundaries, in terms of affected individuals: it “applies even to persons for whom there is no evidence capable of suggesting that their conduct might have a link, even an indirect or remote one, with serious crime”.

It also makes no exception for “persons whose communications are subject, according to rules of national law, to the obligation of professional secrecy”.

There are no sufficiently specific limits on the circumstances in which the retained data can be accessed by security bodies, on the purposes to which that data can be put by those bodies, or on the persons with whom those particular bodies may share the data.

There are no adequate procedural safeguards: no court or administrative authority is required to sign off the transfers.

There are also no objective criteria for justifying the retention period of 6-24 months.

The Grand Chamber’s conclusion

In summary, the Grand Chamber found that “in the first place, Article 7 of Directive 2006/24 does not lay down rules which are specific and adapted to (i) the vast quantity of data whose retention is required by that directive, (ii) the sensitive nature of that data and (iii) the risk of unlawful access to that data, rules which would serve, in particular, to govern the protection and security of the data in question in a clear and strict manner in order to ensure their full integrity and confidentiality. Furthermore, a specific obligation on Member States to establish such rules has also not been laid down…”

There was also an international transfer aspect to its concern: “in the second place, it should be added that that directive does not require the data in question to be retained within the European Union…”

This last point is of course highly relevant to another of the stand-offs between digital privacy and national security which looms in UK litigation, namely the post-Snowden litigation against security bodies.

Robin Hopkins @hopkinsrobin