Privacy and data protection – summer roundup

August 18th, 2015 by Robin Hopkins

August tends to be a quiet month for lawyers. There has, however, been little by way of a summer break in privacy and data protection developments. Here are some August highlights.

Privacy injunction: sexual affairs of sportsman (not philosophers)

Mrs Justice Laing’s August does not appear to have begun restfully. Following a telephone hearing on the afternoon of Saturday 1 August, she granted what became a widely-reported privacy injunction (lasting only until 5 August) restraining the publication of a story about an affair which a prominent sportsman had some years ago: see the judgment in AMC and KLJ v News Group Newspapers [2015] EWHC 2361 (QB).

As usual in such cases, Article 8 and Article 10 rights were relied upon to competing ends. There is no automatic favourite in such contests – an intense focus on the facts is required.

In this case, notwithstanding submissions about the extent to which the affected individuals had ‘courted publicity’ or were not ‘private persons’, there was a reasonable expectation of privacy about a secret sexual affair conducted years ago. The interference therefore needed to be justified.

The right to free expression did not constitute adequate justification without more: “I cannot balance these two incommensurables [Articles 8 and 10] without asking why, and for what purposes, X and R seek to exercise their article 10 rights… The public interest here is, I remind myself, a contribution to a debate in the general interest”.

On the facts, there was insufficient public interest to justify that interference. The sportsman was not found to have hypocritically projected himself as ‘whiter than white’, and his alleged deceits and breaches of protocols in the conduct of his affair were not persuasive – especially years after the event. In any event, the sportsman was a role model for sportsmen or aspiring sportsmen: “he is not a role model for cooks, or for moral philosophers”. The latter point will no doubt be a weight off many a sporting shoulder.

Subject access requests: upcoming appeals

Subject access requests have traditionally received little attention in the courts. As with data protection matters more broadly, this is changing.

Holly Stout blogged earlier this month about the High Court’s judgment in Dawson-Damer and Ors v Taylor Wessing and Ors [2015] EWHC 2366 (Ch). The case concerned legal professional privilege, manual records and relevant filing systems, disproportionate searches and the court’s discretion under section 7(9) DPA. That case is on its way to the Court of Appeal.

So too is the case of Ittihadieh [2015] EWHC 1491 (QB), in which I appeared. That case concerned, among other issues, the identification of the relevant data controllers and the domestic purposes exemption.

Subject access requests: the burden of review and redaction

There has also been judgment this month in a County Court case in which I appeared for the Metropolitan Police Service. Mulcahy v MPS, a judgment of District Judge Langley in the Central London County Court, deals in part with the purposes behind a subject access request. It also deals with proportionality and burden, which – as Holly’s recent post discusses – has tended to be a vexed issue under the DPA (see Ezsias, Elliott, Dawson-Damer and the like).

Mulcahy deals with the proportionality of the burden imposed not so much by searching for information within the scope of a subject access request as by reviewing (and, where necessary, redacting) that information before disclosure. This is an issue which commonly concerns data controllers. The judgment is available here: Mulcahy Judgment.

Privacy damages: Court of Appeal to hear Gulati appeal

May of 2015 saw Mr Justice Mann deliver a ground-breaking judgment on damages awards for privacy breaches: see Gulati & Ors v MGN Ltd [2015] EWHC 1482 (Ch), which concerned victims of phone-hacking (including Paul Gascoigne and Sadie Frost). The awards ranged between £85,000 and £260,250. The judgment and grounds of appeal against the levels of damages awards are explained in this post by Louise Turner of RPC.

Earlier this month, the Court of Appeal granted MGN permission to appeal. The appeal is likely to be expedited. It will not be long before there is a measure of certainty on quantum for privacy breaches.

ICO monetary penalties

Lastly, I turn to privacy-related financial sanctions of a different kind. August has seen the ICO issue two monetary penalty notices.

One was for £50,000 against ‘Stop the Calls’ (ironically, a company which markets devices for blocking unwanted marketing calls) for serious contraventions of regulation 21 of the Privacy and Electronic Communications Regulations 2003 (direct marketing phone calls to persons who had registered their opposition to such calls with the Telephone Preference Service).

Another was for £180,000 for a breach of the seventh data protection principle. It was made against The Money Shop following a burglary in which an unencrypted server containing customers’ personal information was stolen.

Robin Hopkins @hopkinsrobin

Facebook, drag artists and data protection dilemmas: ‘if you stand on our pitch, you must play by our rules’

July 31st, 2015 by Robin Hopkins

Facebook is one of the main battlegrounds between privacy and other social goods such as safety and security.

On the one hand, it faces a safeguarding challenge. Interactions through Facebook have the potential to cause harm: defamation, data protection breaches, stalking, harassment, abuse and the like. One safeguard against such harms is to ensure that users are identifiable, i.e. that they really are who they say they are. This facilitates accountability and helps to ensure that only users of an appropriate age are communicating on Facebook. The ongoing litigation before the Northern Irish courts in the HL case raises exactly these sorts of concerns about child protection.

Part of the solution is Facebook’s ‘real names’ policy: you cannot register using a pseudonym, but only with your official identity.

On the other hand, Facebook encounters an argument which runs like this: individuals should be free to decide how they project themselves in their communications with the world. This means that, provided they are doing no harm, they should in principle be allowed to use whatever identity they like, including pseudonyms, working names (for people who wish to keep their private Facebooking and their professional lives separate) or stage names (particularly relevant for drag artists, for example). The real names policy arguably undermines this element of human autonomy, dignity and privacy. There have been colourful recent protests against the policy on these sorts of grounds.

Which is the stronger argument? Well, the answer to the question seems to depend on who you ask, and where you ask.

The Data Protection Commissioner in Ireland, where Facebook has its EU headquarters, has upheld the real names policy. When one of Germany’s regional Data Protection Commissioners (Schleswig-Holstein) took the opposite view, Facebook challenged his ruling and secured a court victory in 2013. The German court suspended the order against the real names policy and, equally importantly, decided that the challenge should proceed in Ireland, not Germany.

This week, however, another German decision turned the tables on the real names policy yet again. The Hamburg data protection authority upheld a complaint from someone who used a pseudonym on Facebook so as to separate her private and professional communications. The Hamburg DPA found against Facebook and held that it was not allowed unilaterally to change users’ chosen usernames to their real names. Nor was it entitled to demand official identification documents – an issue of particular relevance to child protection issues such as those arising in HL.

The Hamburg ruling is notable on a number of fronts. It exemplifies the tension between privacy – in all its nuanced forms – and other values. It illustrates the dilemmas bedevilling the business models of social media companies such as Facebook.

The case also highlights real challenges for the future of European data protection. The General Data Protection Regulation – currently clawing its way from draft to final form – aspires to harmonised pan-European standards. It includes a mechanism for data protection authorities to co-operate and resolve differences. But if authorities within the same country are prone to divergence on issues such as the real names policy, how optimistic can one be that regulators across the EU will sing from the same hymn sheet?

Important questions arise about data protection and multinational internet companies: in which country (or region, for that matter) should a user raise a complaint to a regulator? If they want to complain to a court, where do they do that? If a German user complains to an Irish regulator or court, to what extent do those authorities have to consider German law?

For the moment, Facebook clearly seeks home ground advantage. But its preference for the Irish forum was rejected by the Hamburg authority in this week’s ruling. The Hamburg Commissioner is reported as saying that “… Facebook cannot again argue that only Irish Data Protection law would be applicable … anyone who stands on our pitch also has to play our game”.

The draft Regulation has something to say on these matters, but is far from clear as to how to decide on the right pitch and the right rules for vital privacy battles like these.

Robin Hopkins @hopkinsrobin

Facebook, child protection and outsourced monitoring

July 22nd, 2015 by Robin Hopkins

Facebook is no stranger to complaints about the content of posts. Usually, one user complains to Facebook about what other users’ posts say about him. By making the offending posts available, Facebook is processing the complainant’s personal data, and must do so in compliance with data protection law.

More unusually, a user could also complain about their own Facebook posts. Surely a complainant cannot make data protection criticisms about information they deliberately posted about themselves? After all, Facebook processes those posts with the author’s consent, doesn’t it?

Generally, yes – but that will not necessarily be true in every instance, especially when it comes to Facebook posts by children. This is the nature of the complaint in striking litigation currently afoot before the High Court in Northern Ireland.

The case is HL v Facebook Inc, Facebook Ireland Ltd, the Northern Health & Social Care Trust and DCMS [2015] NIQB 61. It is currently only in its preliminary stages, but it raises very interesting and important issues about Facebook’s procedures for preventing underage users from utilising the social network. Those issues are illuminated in the recent judgment of Stephens J, who is no stranger to claims against Facebook – he heard the recent case of CG v Facebook [2015] NIQB 11, concerning posts about a convicted paedophile.

From the age of 11 onwards, HL maintained a Facebook page on which she made posts of an inappropriate sexual nature. She was exposed to responses from sexual predators. She says that Facebook is liable for its failure to prevent her from making these posts. She alleges that Facebook (i) unlawfully processed her sensitive personal data, (ii) facilitated her harassment by others, and (iii) was negligent in failing to have proper systems in place to minimise the risks of children setting up Facebook accounts by lying about their age.

The data protection claim raises a number of issues of great importance to the business of Facebook and others with comparable business models. One is the extent to which a child can validly consent to the processing of their personal data – especially sensitive personal data. Minors are (legitimately or not) increasingly active online, and consent is a cornerstone of online business. The consent issue is one of wide application beyond the HL litigation.

A second issue is whether, in its processing of personal data, Facebook does enough to stop minors using their own personal data in ways which could harm them. In her claim, for example, HL refers to evidence given to a committee of the Australian Parliament – apparently by a senior privacy advisor to Facebook (though Facebook was unable to tell Stephens J who he was). That evidence apparently said that Facebook removes 20,000 under-age user profiles a day.

Stephens J was also referred to comments apparently made by a US Senator to Mark Zuckerberg about the vulnerability of underage Facebook users.

Another element of HL’s case concerns Facebook’s use of an outsourcing company called oDesk, operating for example from Morocco, to moderate complaints about Facebook posts. She calls into question the adequacy of these oversight measures: ‘where then is the oversight body for these underpaid global police?’ (to quote from a Telegraph article referred to in the recent HL judgment). Facebook says that – given its number of users in multiple languages across the globe – effective policing is a tall order (an argument Stephens J summed up at paragraph 22 as ‘the needle in a haystack argument, there is just too much to monitor, the task of dealing with underage users is impossible’).

In short, HL says that Facebook seems to be aware of the scale and seriousness of the problem of underage use of its network and has not done enough to tackle that problem.

Again, the issue is one of wider import for online multinationals for whom personal data is stock-in-trade.

The same goes for the third important data protection issue surfacing in the HL litigation. This concerns jurisdiction, cross-border data controllers and section 5 of the Data Protection Act 1998. For example, is Facebook Ireland established in the UK by having an office, branch or agency, and does it process the personal data in Facebook posts in the context of that establishment?

These issues are all still to be decided. Stephens J’s recent judgment in HL was not about the substantive issues, but about HL’s applications for specific discovery and interrogatories. He granted those applications. In addition to details of HL’s Facebook account usage, he ordered the Facebook defendants to disclose agreements between them and Facebook (UK) Ltd and between them and oDesk (to whom some moderating processes were outsourced). He has also ordered the Facebook defendants to answer interrogatory questions about their procedures for preventing underage Facebook use.

In short, the HL litigation has – thus far – raised difficult data protection and privacy issues which are fundamental to Facebook’s business, and it has required Facebook to lay bare internal details of its safeguarding practices. The case is only just beginning. The substantive hearing, which is listed for next term, could be groundbreaking.

Robin Hopkins @hopkinsrobin

DRIPA 2014 declared unlawful

July 17th, 2015 by Robin Hopkins

In a judgment of the Divisional Court handed down this morning, Bean LJ and Collins J have declared section 1 of the Data Retention and Investigatory Powers Act 2014 (DRIPA) to be unlawful.

For the background to that legislation, see our posts on Digital Rights Ireland and then on the UK’s response, i.e. passing DRIPA in an attempt to preserve data retention powers.

That attempt has today suffered a serious setback via the successful challenges brought by the MPs David Davis and Tom Watson, as well as Messrs Brice and Lewis. The Divisional Court did, however, suspend the effect of its order until after 31 March 2016, so as to give Parliament time to consider how to put things right.

Analysis to follow in due course, but for now, here is the judgment: Davis Watson Judgment.

Robin Hopkins @hopkinsrobin

Google and the ordinary person’s right to be forgotten

July 15th, 2015 by Robin Hopkins

The Guardian has reported today on data emerging from Google about how it has implemented the Google Spain ‘right to be forgotten’ principle over the past year or so: see this very interesting article by Julia Powles.

While the data is rough-and-ready, it appears to indicate that the vast majority of RTBF requests actioned by Google have concerned ‘ordinary people’. By that I mean people who are neither famous nor infamous, and who seek not to have high-public-interest stories erased from history, but to have low-public-interest personal information removed from the fingertips of anyone who cares to Google their name. Okay, that explanation is itself rough-and-ready, but you get the point: most RTBF requests come not from Max Mosley types, but from Mario Costeja González types (he being the man who brought the Google Spain complaint in the first place).

As Julia Powles points out, today’s rough-and-ready data is thus far the best we have to go on in terms of understanding how the RTBF is actually working in practice. There is very little transparency on this. Blame for that opaqueness cannot fairly be levelled only at Google and its ilk – though, as the Powles article argues, they may have a vested interest in maintaining that opaqueness. Opaqueness was inevitable following a judgment like Google Spain, and European regulators have, perhaps forgivably, not yet produced detailed guidance at a European level on how the public can expect such requests to be dealt with. In the UK, the ICO has given guidance (see here) and initiated a complaints process (see here).

Today’s data suggests to me that a further reason for this opaqueness is the ‘ordinary person’ factor: the Max Mosleys of the world tend to litigate (and then settle) when they are dissatisfied, but the ordinary person tends not to (Mr González being an exception). We remain largely in the dark about how this web-shaping issue works.

So: the ordinary person is most in need of transparent RTBF rules, but least equipped to fight for them.

How might that be resolved? Options seem to me to include some combination of (a) clear regulatory guidance, tested in the courts, (b) litigation by a Max Mosley-type figure which runs its course, (c) litigation by more Mr González figures (i.e. ordinary individuals), (d) litigation by groups of ordinary people (as in Vidal Hall, for example) – or perhaps (e) litigation by members of the media who object to their stories disappearing from Google searches.

The RTBF is still in its infancy. Google may be its own judge for now, but one imagines not for long.

Robin Hopkins @hopkinsrobin

Austria will not host Europe vs Facebook showdown

July 6th, 2015 by Robin Hopkins

As illustrated by Anya Proops’ recent post on a Hungarian case currently before the CJEU, the territorial jurisdiction of European data protection law can raise difficult questions.

Such questions have bitten hard in the Europe vs Facebook litigation. Max Schrems, an Austrian law graduate, is spearheading a massive class action in which some 25,000 Facebook users allege numerous data protection violations by the social media giant. Those include: unlawful obtaining of personal data (including via plug-ins and “like” buttons); invalid consent to Facebook’s processing of users’ personal data; use of personal data for impermissible purposes, including the unlawful analysing of data/profiling of users (“the Defendant analyses the data available on every user and tries to explore users’ interests, preferences and circumstances…”); unlawful sharing of personal data with third parties and third-party applications. The details of the claim are here.

Importantly, however, the claim is against Facebook Ireland Ltd, a subsidiary of the Californian-based Facebook Inc. The class action has been brought in Austria.

Facebook challenged the Austrian court’s jurisdiction. Last week, it received a judgment in its favour from the Viennese Regional Civil Court. The Court held that it lacked jurisdiction, in part because Mr Schrems was not deemed to be a ‘consumer’ of Facebook’s services, and in part because Austria was not the right place in which to bring the claim. Facebook argued that the claim should be brought either in Ireland or in California, and the Court agreed.

Mr Schrems has announced his intention to appeal. In the meantime, the Austrian decision will continue to raise both eyebrows and questions, particularly given that a number of other judgments in recent years have seen European courts accepting jurisdiction to hear claims against social media companies (such as Google: see Vidal-Hall, for example) based elsewhere.

The Austrian decision also highlights the difficulties of the ‘one-stop shop’ principle which remains part of the draft Data Protection Regulation (albeit in a more nuanced and complicated formulation than had earlier been proposed). In short, why should an Austrian user have to sue in Ireland?

Panopticon will report on any developments in this case in due course. It will also report on the other strand of Mr Schrems’ privacy campaign, namely his challenge to the lawfulness of the Safe Harbour regime for the transferring of personal data to the USA. That challenge has been heard by the CJEU, and the Advocate General’s opinion is imminent. The case will have major implications for those whose business involves transatlantic data transfers.

Robin Hopkins @hopkinsrobin

Disclosing child protection information: make sure you ask the right questions first

June 1st, 2015 by Robin Hopkins

High-profile revelations in recent years illustrate the importance of public authorities sharing information on individuals who are of concern in relation to child protection matters. When inaccurate information is shared, however, the consequences for the individual can be calamitous.

AB v Chief Constable of Hampshire Constabulary [2015] EWHC 1238 (Admin) is a recent High Court judgment (Jeremy Baker J) which explores the implications of such inaccurate disclosures. The case is not only about inaccuracies per se, but about why those inaccuracies were not picked up before the disclosure was made.

Perhaps the most notable point from the judgment is this: if such a disclosure is to be justified as necessary, the data controller must take care to ask reasonable questions about that information, check it against other obvious sources, and make the necessary enquiries before disclosure takes place.

In other words, failure to ask the right questions can lead to the wrong course of action in privacy terms. Here is how that principle played out in the AB case.

Background

In 2010, AB was summarily dismissed from his job as a science teacher for inappropriate comments and conduct with potential sexual undertones, as well as a failure to maintain an appropriately professional boundary with students. His appeal against dismissal failed. The Independent Safeguarding Authority, however, decided not to include AB on its barred lists. The General Teaching Council also investigated AB, but it did not find that the allegations of improper conduct were made out.

AB’s dismissal, however, came to the attention of a member of the child abuse investigation public protection unit of the Hampshire Constabulary. Enquiries were made of the college, and certain email correspondence and records were generated and retained on police systems.

Later the following year, AB was offered a teaching job elsewhere. This came to the police’s attention in 2013. There was internal discussion within the police about this. One officer said in an email that, among other things (i) AB had also been dismissed from another school, and (ii) AB’s 2010 dismissal had involved inappropriate touching between himself and pupils. There was no evidence that either of those points was true. That email concluded “From What I’ve been told he should be nowhere near female students. I will put an intel report in on [AB]”.

The above information was passed to the Local Authority Designated Officer (‘LADO’) and in turn to the school, who terminated AB’s employment. He then made a subject access request under the DPA, by which he learnt of the above communication, and also the source of that information, which was said to be a notebook containing a police officer’s notes from 2010 (which did not in fact record either (i) or (ii) above). AB complained of the disclosure and also of the relevant officer’s failures to follow the requisite safeguarding procedures. The police dismissed his complaint.

The Court’s judgment

AB sought judicial review both of the disclosure of the inaccurate information in the email, and of the dismissal of his complaint about the police officer’s conduct in his reporting of the matter.

The Court (Jeremy Baker J) granted the application on both issues. I focus here on the first, namely the lawfulness of the disclosure in terms of Article 8 ECHR.

Was the disclosure “in accordance with the law” for Article 8 purposes?

The Court considered the key authorities in this – by now quite well-developed – area of law (Article 8 in the context of disclosures by the police), notably:

MM v United Kingdom [2010] ECHR 1588 (the retention and disclosure of information relating to an individual by a public authority engages Article 8, and must therefore be justified under Article 8(2));

Tysiac v Poland (2007) 45 EHRR 42, where the ECtHR stressed the importance of procedural safeguards to protecting individuals’ Article 8 rights from unlawful interference by public bodies;

R v Chief Constable of North Wales Ex parte Thorpe [1999] QB 396 (a decision about whether or not to disclose the identity of paedophiles to members of the public is a highly sensitive one: “Disclosure should only be made when there is a pressing need for that disclosure”);

R (L) v Commissioner of Police for the Metropolis [2010] 1 AC 410: such cases are essentially about proportionality;

R (A) v Chief Constable of Kent [2013] EWCA Civ 1706: such a disclosure is often “in practice the end of any opportunity for the individual to be employed in an area for which an [Enhanced Criminal Record Certificate] is required. Balancing the risks of non-disclosure to the interests of the members of the vulnerable group against the right of the individual concerned to respect for his or her private life is a particularly sensitive and difficult exercise where the allegations have not been substantiated and are strongly denied”;

R (T) v Chief Constable of Greater Manchester Police & others [2015] AC 49 and R (Catt) v ACPO [2015] 2 WLR 664 on whether disclosures by police were in accordance with the law and proportionate.

The Court concluded that, in light of the above authorities, the disclosure made in AB’s case was “in accordance with the law”. It was made under the disclosure regime made up of: Part V of the Police Act 1997, the Home Office’s Statutory Disclosure Guidance on enhanced criminal records certificates, section 10 of the Children Act 2004 and the Data Protection Act 1998.

See Jeremy Baker J’s conclusion – and notes of caution – at [73]-[75]:

“73. In these circumstances it seems to me that not only does the common law empower the police to disclose relevant information to relevant parties, where it is necessary for one of these police purposes, but that the DPA 1998, together with the relevant statutory and administrative codes, provide a sufficiently clear, accessible and consistent set of rules, so as to prevent arbitrary or abusive interference with an individual’s Article 8 rights; such that the disclosure will be in accordance with law.

74. However, it will clearly be necessary in any case, and in particular in relation to a decision to disclose information to a third party, for the decision-maker to examine with care the context in which his/her decision is being made.

75. In the present case, although the disclosure of the information by the police was to a LADO in circumstances involving the safeguarding of children, it also took place in the context of the claimant’s employment. The relevance of this being, as DC Pain was clearly aware from the contents of his e-mail to PS Bennett dated 10th June 2013, that the disclosure of the information had the potential to adversely affect the continuation of the claimant’s employment at the school….”

Was the disclosure proportionate?

While the disclosure decision was in accordance with the law, this did not remove the need for the police carefully to consider whether disclosure was necessary and proportionate, particularly in light of the serious consequences of disclosure for AB’s employment.

The Court held that the disclosure failed these tests. The crucial factor was that if such information about AB was well founded, then it would have been contained in his Enhanced Criminal Record Certificate – and if it was not, this would have prompted enquiries about the cogency of the information (why, if it was correct, was such serious information omitted from the ECRC?) which would reasonably have been pursued to bottom the matter out before the disclosure was made. These questions had not been asked in this case. See [80]-[81]:

“… In these circumstances, it was in my judgment, a necessary procedural step for DC Pain to ascertain from the DBS unit as to, whether, and if so, what information it had already disclosed on any enhanced criminal record certificate, as clearly if the unit had already disclosed the information which DC Pain believed had been provided to him by the college, then it would not have been necessary for him to have made any further disclosure of that information.

81. If either DC Pain or PS Bennett had taken this basic procedural step, then not only would it have been immediately obvious that this information had not been provided to the school, but more importantly, in the context of this case, it would also have been obvious that further enquiries were required to be made: firstly as to why no such disclosure had been made by the DBS unit; and secondly, once it had been ascertained that the only information which was in the possession of the DBS unit was the exchange of e-mails on the defendant’s management system, as to the accuracy of the information with which DC Pain believed he had been provided by the college.”

Judicial reviews of disclosure decisions concerning personal data: the DPA as an alternative remedy?

Finally, the Court dealt with a submission that judicial review should not be granted as this case focused on what was essentially a data protection complaint, which could have been taken up with the ICO under the DPA (as was suggested in Lord Sumption’s comments in Catt). That submission was dismissed: AB had not simply ignored or overlooked that prospect, but had rather opted to pursue an alternative course of complaint; the DPA did not really help with the police conduct complaint, and the case raised important issues.

Robin Hopkins @hopkinsrobin

Google and the DPA – RIP section 13(2)

March 27th, 2015 by Christopher Knight

Well, isn’t this an exciting week (and I don’t mean Zayn leaving One Direction)? First, Evans and now Vidal-Hall. We only need Dransfield to appear before Easter and there will be a full red bus analogy. Robin opened yesterday’s analysis of Evans by remarking on the sexiness of FOIA. If there is one thing you learn quickly as an information law practitioner, it is not to engage in a sexiness battle with Robin Hopkins. But high-profile though Evans is, the judgment in Vidal-Hall will be of far wider significance to anyone having to actually work in the field, rather than simply tuning in every now and then to see the Supreme Court say something constitutional against a FOIA background. Vidal-Hall might not be the immediate head-turner, but it is probably going to be the life-changer for most of us. So, while still in the ‘friend zone’ with the Court of Appeal, before it all gets serious, it is important to explain what Vidal-Hall v Google [2015] EWCA Civ 311 does.

The Context

The claims concern the collection by Google of information about the internet usage of Apple Safari users, by means of cookies. This is known as “browser generated information” or “BGI”. Not surprisingly, it is used by Google to target advertising at the user more effectively. Anyone who has experienced this sort of thing will know how bizarre it can sometimes get – the sudden appearance of adverts for maternity clothes on my computer followed eerily quickly from my having to research pregnancy information for a discrimination case I was doing. Apple Safari users had not given their consent to the collection of BGI. The Claimants brought claims for misuse of private information, breach of confidence and breach of the DPA, seeking damages under section 13. There is yet to be a full trial; the current proceedings arise because of the need to serve out of the jurisdiction on Google.
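As a technical aside (mine, not the Court’s), the mechanics are worth seeing. Here is a minimal sketch in TypeScript of how a tracking “pixel” can accumulate BGI via cookies – a hypothetical server with made-up names, a made-up port and in-memory storage, not a description of Google’s actual systems:

```typescript
// Toy tracking server: a web page embeds an invisible "pixel" pointing at
// this host, and each page view is logged against a per-browser cookie ID.
// Hypothetical throughout - names, port and storage are illustrative only.
import { createServer } from "node:http";
import { randomUUID } from "node:crypto";

// cookie ID -> pages on which the pixel was loaded (in-memory for the sketch)
const profiles = new Map<string, string[]>();

createServer((req, res) => {
  // Reuse the browser's existing tracking cookie, or mint a fresh ID.
  const existing = /uid=([0-9a-f-]+)/.exec(req.headers.cookie ?? "")?.[1];
  const uid = existing ?? randomUUID();

  // The Referer header reveals which page embedded the pixel. Note that no
  // name or email is involved: the cookie singles out a browser, not a person.
  const page = req.headers.referer ?? "(unknown)";
  profiles.set(uid, [...(profiles.get(uid) ?? []), page]);

  // Re-set a long-lived cookie so future page views join the same profile.
  res.setHeader("Set-Cookie", `uid=${uid}; Path=/; Max-Age=31536000`);
  res.writeHead(204); // a real pixel would return a 1x1 GIF instead
  res.end();
}).listen(8080);
```

The profile that results is keyed to a cookie rather than to a name – which is precisely what made Issue (3), discussed below, contentious.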

The Issues

These were helpfully set out in the joint judgment of Lord Dyson MR and Sharp LJ (with whom McFarlane LJ agreed) at [13]: (1) whether misuse of private information is a tort; (2) whether damages are recoverable under the DPA for mere distress; (3) whether there was a serious issue to be tried that the browser generated data was personal data; and (4) whether permission to serve out should have been refused on Jameel principles (i.e. whether there was a real and substantial cause of action).

Issues (1) and (4) are less important to readers of this blog, and need only brief mention (#spoilers!). Following a lengthy recitation of the development of the case law, the Court held that the time had come to talk not of cabbages and kings, but of a tort of misuse of private information, rather than an equitable action for breach of confidence: at [43], [50]-[51]. This allowed service out under the tort gateway in PD6B. The comment of the Court on issue (4) is worth noting, because it held that although claims for breach of the DPA would involve “relatively modest” sums in damages, that did not mean the claim was not worth the candle. On the contrary, “the damages may be small, but the issues of principle are large”: at [139].

Damages under Section 13 DPA

Issue (2) is the fun stuff for DP lawyers. As we all know, Johnson v MDU [2007] EWCA Civ 262 has long cast a baleful glare over the argument that one can recover section 13 damages for distress alone. The Court of Appeal have held such comments to be obiter and not binding on them: at [68]. The word ‘damage’ in Art 23 of the Directive had to be given an autonomous EU law meaning: at [72]. It also had to be construed widely having regard to the underlying aims of the legislation: the legislation was primarily designed to protect privacy, not economic rights, and it would be strange if data subjects could not recover compensation for an invasion of their privacy rights merely because they had not suffered pecuniary loss, especially given that Article 8 ECHR imposes no such bar: at [76]-[79]. However, it is not necessary to establish whether there has also been a breach of Article 8; the Directive is not so restricted (although something which does not breach Article 8 is unlikely to be serious enough to have caused distress): at [82].

What then to do about section 13(2) which squarely bars recovery for distress alone and is incompatible with that reading of Article 23? The Court held it could not be ‘read down’ under the Marleasing principle; Parliament had intended section 13(2) to impose this higher test, although there was nothing to suggest why it had done so: at [90]-[93]. The alternative was striking it down on the basis that it conflicted with Articles 7 and 8 of the EU Charter of Fundamental Rights, which the Court of Appeal accepted. In this case, privacy and DP rights were enshrined as fundamental rights in the Charter; breach of DP rights meant that EU law rights were engaged; Article 47 of the Charter requires an effective remedy in respect of the breach; Article 47 itself had horizontal direct effect (as per the court’s conclusion in Benkharbouche v Embassy of Sudan [2015] EWCA Civ 33); the Court was compelled to disapply any domestic provision which offended against the relevant EU law requirement (in this case Article 23); and there could be no objections to any such disapplication in the present case e.g. on the ground that the Court was effectively recalibrating the legislative scheme: at [95]-[98], [105].

And thus, section 13(2) was no more. May it rest in peace. It has run down the curtain and joined the bleedin’ choir invisible.

What this means, of course, is a potential flood of DP litigation. All of a sudden, it will be worth bringing a claim for ‘mere’ distress even without pecuniary loss, and there can be no doubt many will do so. Every breach of the DPA now risks an affected data subject seeking damages. Those sums will invariably be small (no suggestion from the Court of Appeal that Article 23 requires a lot of money), and perhaps not every case will involve distress, but it will invariably be worth a try for the data subject. Legal costs defending such claims will increase. Any data controllers who were waiting for the new Regulation with its mega-fines before putting their house in order had better change their plans…

Was BGI Personal Data?

For the DP geeks, much fun was still to be had with Issue (3). Google cannot identify a particular user by name; it only identifies particular browsers. If I search for nasal hair clippers on my Safari browser, Google wouldn’t recognise me walking down the street, no matter how hirsute my proboscis. The Court of Appeal did not need to determine the issue; it held only that there was a serious issue to be tried. Two main arguments were run: first, whether the BGI looked at in isolation was personal data (under limb (a) of the definition in section 1(1) DPA); and secondly, whether the BGI was personal data when taken together with Gmail account data held by Google (an application of limb (b)).

On the first limb, the Court held that it was clearly arguable that the BGI was personal data. This was supported by the terms of the Directive, an Article 29 WP Opinion and the CJEU’s judgment in Lindqvist. The fact that the BGI data does not name the individual is immaterial: it clearly singles them out, individuates them and therefore directly identifies them: at [115] (see more detail at [116]-[121]).

On the second limb, it was also clearly arguable that the BGI was personal data. Google had argued that in practice it had no intention of amalgamating the two sets of data, and that there was therefore no prospect of identification. The Court rejected this argument both on linguistic grounds (having regard to the wording of the definition of personal data, which does not require identification to actually occur) and on purposive grounds (having regard to the underlying purpose of the legislation): at [122]-[125].
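The linguistic point is easy to see in code. A toy sketch (hypothetical types and data, drawn from nothing in the litigation itself): the BGI dataset on its own names nobody, but the moment it shares a key with account data held by the same controller, a trivial join identifies the user – whether or not anyone ever intends to run it.

```typescript
// Illustration of the "second limb" argument: pseudonymous browsing data plus
// account data sharing the same key. All types and records are hypothetical.
type BgiProfile = { cookieId: string; pagesVisited: string[] };
type AccountRecord = { cookieId: string; name: string; email: string };

// Joining the two datasets on the shared cookie ID links browsing history to
// a named person. On the Court's reading, the definition of personal data
// asks whether this *could* be done, not whether the controller does it.
function identify(bgi: BgiProfile[], accounts: AccountRecord[]) {
  const byCookie = new Map(accounts.map((a) => [a.cookieId, a] as const));
  return bgi.flatMap((p) => {
    const match = byCookie.get(p.cookieId);
    return match ? [{ who: match.name, pages: p.pagesVisited }] : [];
  });
}

// On its own, the profile names nobody...
const profiles: BgiProfile[] = [
  { cookieId: "c-123", pagesVisited: ["maternity.example", "clippers.example"] },
];
// ...until combined with account data keyed the same way.
const accounts: AccountRecord[] = [
  { cookieId: "c-123", name: "Jane Doe", email: "jane@example.com" },
];
console.log(identify(profiles, accounts)); // [{ who: "Jane Doe", pages: [...] }]
```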

As to a third route of identification – by which individual users could be identified by third parties who access the user’s device and then learn something about the user by virtue of the targeted advertising – the Court concluded that this was a difficult question, that the judge was not plainly wrong on the issue, and that it should be left for trial: at [126]-[133].

It will be interesting to see whether the trial happens. If it does, there could be some valuable judicial discussion on the nature of the identification question. For now, much is left as arguable.

Conclusion

The Court of Appeal’s judgment in Vidal-Hall is going to have massive consequences for DP in the UK. The disapplication of section 13(2) is probably the most important practical development since Durant, and arguably more so than that. Google are proposing to seek permission to appeal to the Supreme Court, and given the nature of the issues they may well get it on Issues (1) and (2) at least. In the meantime, the Court’s judgment will repay careful reading. And data controllers should start looking very anxiously over their shoulders. The death of their main shield in section 13(2) leaves them vulnerable, exposed and liable to death by a thousand small claims.

Anya Proops and Julian Milford appeared for the ICO, intervening in the Court of Appeal.

Christopher Knight

PS No judicial exclamation marks to be found in Vidal-Hall. Very restrained.

Google Spain, freedom of expression and security: the Dutch fight back

March 13th, 2015 by Robin Hopkins

The Dutch fighting back against the Spanish, battling to cast off the control exerted by Spanish decisions over Dutch ideologies and value judgments. I refer of course to the Eighty Years’ War (1568-1648), which in my view is a sadly neglected topic on Panopticon.

The reference could also be applied, without too much of a stretch, to data protection and privacy rights in 2015.

The relevant Spanish decision in this instance is of course Google Spain, which entrenched what has come to be called the ‘right to be forgotten’. The CJEU’s judgment on the facts of that case saw privacy rights trump most other interests. The judgment has come in for criticism from advocates of free expression.

The fight-back by free expression (and Google) has found the Netherlands to be its most fruitful battleground. In 2014, a convicted criminal’s legal battle to have certain links about his past ‘forgotten’ (in the Google Spain sense) failed.

This week, a similar challenge was also dismissed. This time, a KPMG partner sought the removal of links to stories about him allegedly having to live in a container on his own estate (because a disgruntled builder, unhappy over allegedly unpaid fees, changed the locks on the house!).

In a judgment concerned with preliminary relief, the Court of Amsterdam rejected his application, finding in Google’s favour. There is an excellent summary on the Dutch website Media Report here.

The Court found that the news stories to which the disputed Google links related remained relevant in light of ongoing public debates on the story.

Importantly, the Court said of Google Spain that the right to be forgotten “is not meant to remove articles which may be unpleasant, but not unlawful, from the eyes of the public via the detour of a request for removal to the operator of a search machine.”

The Court gave very substantial weight to the importance of freedom of expression, something which Google Spain’s critics say was seriously underestimated in the latter judgment. If this judgment is anything to go by, there is plenty of scope for lawyers and parties to help Courts properly to balance privacy and free expression.

Privacy rights wrestle not only against freedom of expression, but also against national security and policing concerns.

In The Hague, privacy has recently grabbed the upper hand over security concerns. The District Court of The Hague has this week found that Dutch law on the retention of telecommunications data should be struck down due to its incompatibility with privacy and data protection rights. This is the latest in a line of cases challenging such data retention laws, the most notable of which was the CJEU’s judgment in Digital Rights Ireland, on which see my post here. For a report on this week’s Dutch judgment, see this article by Maarten van Tartwijk in The Wall Street Journal.

As that article suggests, the case illustrates the ongoing tension between security and privacy. In the UK, security initially held sway as regards the retention of telecoms data: see the DRIP Regulations 2014 (and Panopticon passim). That side of the argument has gathered some momentum of late, in light of (for example) the Paris massacres and revelations about ‘Jihadi John’.

Just this week, however, the adequacy of UK law on security agencies has been called into question: see the Intelligence and Security Committee’s report entitled “Privacy and Security: a modern and transparent legal framework”. There are also ongoing challenges in the Investigatory Powers Tribunal – for example this one concerning Abdul Hakim Belhaj.

So, vital ideological debates continue to rage. Perhaps we really should be writing more about 17th century history on this blog.

Robin Hopkins @hopkinsrobin

Googling Orgies – Thrashing out the Liability of Search Engines

January 30th, 2015 by Christopher Knight

Back in 2008, the late lamented News of the World published an article under the headline “F1 boss has sick Nazi orgy with 5 hookers”. It had obtained footage of an orgy involving Max Mosley and five ladies of dubious virtue, all of whom were undoubtedly (despite the News of the World having blocked out their faces) not Mrs Mosley. The breach of privacy proceedings before Eady J (Mosley v News Group Newspapers Ltd [2008] EWHC 687 (QB)) established that the ‘Nazi’ allegation was unfounded and unfair, that the footage was filmed by a camera secreted in “such clothing as [one of the prostitutes] was wearing” (at [5]), and also the more genteel fact that even S&M ‘prison-themed’ orgies stop for a tea break (at [4]), rather like a pleasant afternoon’s cricket, but with a rather different thwack of willow on leather.

Since that time, Mr Mosley’s desire to protect his privacy and allow the public to forget his penchant for themed tea breaks has led him to bring or fund ever more litigation, whilst simultaneously managing to remind as many people as possible of the original incident. His latest trip to the High Court concerns the inevitable fact of the internet age that the photographs and footage obtained and published by the News of the World remain readily available for those in possession of a keyboard and a strong enough constitution. They may not be on a scale of popularity as last year’s iCloud hacks, but they can be found.

Alighting upon the ruling of the CJEU in Google Spain that a search engine is a data controller for the purposes of the Data Protection Directive (95/46/EC) (on which see the analysis here), Mr Mosley claimed that Google was obliged, under section 10 of the Data Protection Act 1998, to prevent processing of his personal data where he served a notice requesting it to do so, in particular by not blocking access to the images and footage which constitute his personal data. He also alleged misuse of private information. Google denied both claims and sought to strike them out. The misuse of private information claim being (or soon to be) withdrawn, Mitting J declined to strike out the DPA claim: Mosley v Google Inc [2015] EWHC 59 (QB). He has, however, stayed the claim for damages under section 13 pending the Court of Appeal’s decision in Vidal-Hall v Google (on which see the analysis here).

Google ran a cunning defence to what, post-Google Spain, might be said to be a strong claim on the part of a data subject. It relied on Directive 2000/31/EC, the E-Commerce Directive. Article 13 protects internet service providers from liability for the cached storage of information, providing they do not modify the information. Mitting J was content to find that by storing the images as thumbnails, Google was not thereby modifying the information in any relevant sense: at [41]. Article 15 of the E-Commerce Directive also prohibits the imposition of a general obligation on internet service providers to monitor the information they transmit or store.

The problem for Mitting J was how to resolve the interaction between the E-Commerce Directive and the Data Protection Directive: the latter gives a data subject rights which apparently extend to cached information held by internet service providers, while the former apparently absolves them of legal responsibility for that information. It was pointed out that recital (14) and article 1.5(b) of the E-Commerce Directive appeared to make that instrument subject to the Data Protection Directive. It was also noted that Google’s argument did not sit very comfortably with the judgment (or at least the effect of the judgment) of the CJEU in Google Spain.

Mitting J indicated that there were only two possible answers: either the Data Protection Directive formed a comprehensive code, or the two must be read in harmony and given full effect to: at [45]. His “provisional preference is for the second one”: at [46]. Unfortunately, the judgment does not then go on to consider why that is so, or more importantly, how both Directives can be read in harmony and given full effect to. Of course, on a strike out application provisional views are inevitable, but it leaves rather a lot of legal work for the trial judge, and one might think that it would be difficult to resolve the interaction without a reference to the CJEU. What, for example, is the point of absolving Google of liability for cached information if that does not apply to any personal data claims, which will be a good way of re-framing libel/privacy claims to get around Article 13?

The Court also doubted that Google’s technology really meant that it would have to engage in active monitoring, contrary to Article 15, because Google might be able to comply without “disproportionate effort or expense”: at [54]. That too was something for the trial judge to consider.

So, while the judgment of Mitting J is an interesting interlude in the ongoing Mosley litigation saga, the final word certainly awaits a full trial (and/or any appeal by Google), and possibly a reference. All the judgment decides is that Mr Mosley’s claim is not so hopeless it should not go to trial. Headlines reading ‘Google Takes a Beating (with a break for tea)’ would be premature. But the indications given by Mitting J are not favourable to Google, and it may well be that the footage of Mr Mosley will not be long for the internet.

Christopher Knight