Unsafe Harbor: some practical implications of the Schrems judgment

October 6th, 2015 by Robin Hopkins

Panopticon has been quick off the mark in reporting on today’s enormously significant Schrems judgment from the CJEU: see Chris’ alert and Anya’s commentary. I hope readers will excuse a third excursion into the same waters, given the enormous consequences of the judgment. Here are a few observations on what those consequences mean in practice.

  1. Is this the end for Safe Harbor?

In its current form, yes. In theory, it can be fixed, rather than binned. Efforts have in fact been underway for some time aimed at renegotiating and tightening up aspects of the Safe Harbor arrangements, spurred by the Snowden revelations about the extent of US surveillance. The tenor of the judgment, however, is that tweaks will not suffice. ‘Dead in the water’ is the right shorthand for Safe Harbor.

  2. Does the Schrems judgment affect all companies transferring data to the US?

No – it torpedoes the Safe Harbor scheme, but it does not torpedo all EU-US data transfers. The Safe Harbor scheme was one of the major ways in which EU-US transfers of personal data ticked the box in terms of complying with Article 25 of Directive 95/46/EC (or the eighth data protection principle, in UK parlance). But it was not the only way.

Not all US companies were part of that scheme – in fact, you can see the full list of companies that are certified for Safe Harbor on the website of the US Department of Commerce (which administers certification for the scheme) here. There are around 5,000 companies affected by the Schrems judgment.

  3. Without Safe Harbor, how can data transfers to the US be lawful?

Obviously, the options include avoiding transfers to the US henceforth. Data processing arrangements could be retained within the EU, or they could be switched to one of a number of countries which already have an EU seal of approval: see the list here, which includes Andorra, New Zealand, Canada, Uruguay, Israel and Argentina. Again, however, the Schrems judgment arguably implies that not even those countries are immune from scrutiny. Though those countries are not tainted by the Snowden/NSA revelations, their approved status is no longer inviolable.

Another option for multinationals transferring data to the US (or elsewhere) is to use Binding Corporate Rules. These provide a framework for how the organisation handles personal data. The data controller drafts its BCRs and submits them to the regulator for approval. Where more than one EU state is involved, the other regulators all need to have their say before the data controller’s arrangements are given the green light.

The BCR process is explained by the ICO here. Note the observation that a straightforward BCR application can take 12 months. So no quick fix for plugging the Safe Harbor gap here. Companies may need to find interim solutions while they work on adopting BCRs.

Another option is the use of Model Contract Clauses, explained by the ICO here. This involves incorporating off-the-shelf, EU-approved provisions into your contracts relating to personal data. These are inflexible, and they will not fit every data controller’s needs. Again, data controllers may need to craft stop-gap contractual solutions.

And again, it is arguably implicit in the Schrems judgment that even BCRs and Model Contract Clauses are flawed, i.e. they do not suffice to ensure that adequate data protection standards are maintained.

Lastly, as a data controller, you are able to do it yourself, i.e. to carry out your own assessment of the level of protection afforded in your data’s destination country. Again, the ICO helpfully explains. Again, however, the solutions are not straightforward.

  4. Are regulators going to take immediate action against all Safe Harbor-based transfers?

Unclear, but it is doubtful that they have the will or the way.

In the immediate term, the Irish Data Protection Commissioner now needs to decide whether or not Facebook’s US data transfers are lawful in the absence of Safe Harbor. This alone will be an important decision.

In the UK, the ICO has issued a press release on Schrems. It recognises that it will take time for businesses to adapt. Its tone is neither immediate nor pitiless.

This is no doubt because the business implications – both for the private sector and the regulators – would be enormous if a wholesale clampdown were to be commenced immediately. It is likely that many regulators will give data controllers some time to get their houses (or harbors) in order – though the CJEU declined to take a similar approach in its judgment today.

  5. Will the new Data Protection Regulation fix the problem?

No. Its approach to international transfers is largely the same as the one which is currently in place. It contains no automatic fixes to the current quandary.

These are just preliminary observations. The dust has not yet settled, and businesses face some thorny practicalities in the meantime.

Robin Hopkins @hopkinsrobin

Refusing a subject access request: proportionality, anxious scrutiny and judicial discretion

August 25th, 2015 by Robin Hopkins

Zaw Lin and Wai Phyo v Commissioner of Police for the Metropolis [2015] EWHC 2484 (QB), a judgment of Green J handed down today, is an interesting – if somewhat fact-specific – contribution to the burgeoning body of case law on how subject access requests (SARs) made under the Data Protection Act 1998 (DPA) should be approached, both by data controllers and by courts.

The Claimants are on trial in Thailand for the murder in September 2014 of British tourists Hannah Witheridge and David Miller. They could face the death penalty if convicted.

Under the Police Act 1996, and following high-level discussions (including at Prime Ministerial level), it was agreed that the Metropolitan Police Service (MPS) would send an officer to observe and review – but not assist with – the Thai police investigation. The MPS compiled a detailed Report. They agreed to keep this confidential, except that it could be summarised verbally to the families of the victims so as to reassure them about the state of the investigation and proceedings. The Report has never been provided to the families or the Thai authorities.

The Claimants made SARs, seeking disclosure of the MPS’ Report. Green J summarised their objectives as follows (para 29):

“The Claimants have endeavoured to clothe their arguments in the somewhat technical language of the DPA.  It seems to me that the bottom line of these arguments, stripped bare of technical garb, can be put in two ways.  First, the views of the MPS carry weight. Scotland Yard has an international reputation.  If the Report is seen as favourable to the prosecution and contains material supportive of the RTP [Royal Thai Police] investigation (which is in effect how the Claimants say it has been presented in public by the families) then they should have the right to see the personal data so they can correct any misapprehensions.  Secondly, that in any event they should be able to use any personal data which is favourable to their defence.”

The Claimants were entitled to request disclosure of at least some of the contents of the Report, though Green J estimated that only a small percentage of its contents constituted their personal data (para 25).

The MPS refused the SARs, relying on the exemption for crime and taxation under section 29 DPA.

In determining the claim under section 7(9) DPA, Green J considered arguments as to the applicability (or not) of Directive 95/46/EC (which contains exceptions for criminal matters: see Articles 3 and 13) and the European Convention on Human Rights. His view was that not much turned on these points here (para 49). At common law, the court’s scrutiny must always be fact- and context-specific. In a life-and-death context, anxious scrutiny would be applied to a data controller’s refusal. See para 69:

“… when construing the DPA 1998 (whether through common law or European eyes) decision makers and courts must have regard to all relevant fundamental rights that arise when balancing the interest of the State and those of the individual.  There are no artificial limits to be placed on the exercise.”

Green J expressed his discomfort about the application of section 15(2) DPA, which allows the court – but not the data subject – to view the withheld information. This, together with the prospect of a closed session, raised concerns as to natural and open justice. Given the expedited nature of the case before him, it was not appropriate to appoint a special advocate, but that may need to be considered in future cases where the stakes are very high. Green J proceeded by asking questions and hearing submissions on an open basis in a sufficiently generic and abstract way.

In expressing those procedural misgivings, Green J has touched on an important aspect of DPA litigation which has received little attention to date.

He also took a narrower view of the breadth of his discretion under section 7(9) DPA than has often been assumed. At para 98, he said this of the ‘general and untrammelled’ nature of that judicial discretion:

“If Parliament had intended to confer such a broad residual discretion on the court then, in my view, it would have used far more specific language in section 7(9) than in fact it did. In any event I do not understand the observations in the authorities referred to above to suggest that if I find that the MPS has erred that I should simply make up and then apply whatever test I see fit.  If I find an error on the part of the MPS such that I must form my own view then I should do in accordance with the principles set out in the DPA 1998 and taking account of the relevant background principles in the Directive and the Convention. My discretion is unfettered by the decision that has gone before, and which I find unlawful, but I cannot depart from Parliament’s intent.”

Such an approach to section 7(9) could make a material difference to litigation concerning SARs.

Green J then set out and determined the issues before him as follows:

Issue I: Who bears the burden of proving the right to invoke the exemption? What is the standard of proof?

Following R (Lord) v Secretary of State for the Home Department [2003] EWHC 2073 (Admin), the answer is that the data controller bears the burden. “The burden of proof is thus upon the MPS in this case to show its entitlement to refuse access and it must do this with significant and weighty grounds and evidence” (para 85).

Issue II: Was the personal data in the MPS report “processed” for purposes of (a) the prevention or detection of crime or (b) the apprehension or prosecution of offenders?

Green J’s answer was yes. Although the purposes behind the Report differed from the usual policing context, there should be no artificially narrow interpretation of the ‘prevention and detection of crime/apprehension or prosecution of offenders’.

Issue III: Would granting access be likely to prejudice any of those purposes?

This required a balancing exercise to be performed between the individual’s right to access and the interests being pursued by the data controller in refusing disclosure. This called for a “classic proportionality balancing exercise to be performed” (para 78).

Here, the starting point was the Claimants’ prima facie right to the personal data. This was bolstered by the life-and-death context of the present case.

The MPS’ refusal, however, pursued legitimate and weighty objectives. In assessing those objectives, it was relevant to consider what precedent would be set by disclosure: the “focus of attention was not just on the facts of the instant case but could also take account of the impact on other cases” (as per Lord).

On that basis, and in light of the evidence, the MPS’ ‘chilling effect’ argument was powerful. See para 107:

“… I accept their judgment and opinion as to the risks that release of the Report would give rise to and in particular, their position on: the considerable benefit to the public interest (in relation to crime enforcement and public security) generally in the MPS (and other relevant police authorities) being able to engage with foreign authorities; the high importance that is attached by foreign authorities to confidentiality; and the risk that not being able to give strong assurances as to confidentiality would pose to the ability of the MPS and others to enter into meaningful working relationship with such overseas authorities.”

It was also important to avoid any potential interference with a criminal trial in a foreign country.

The Claimants’ SARs were not made for any improper purposes, i.e. for purposes other than those which Directive 95/46/EC sought to further. In that respect, the present case was wholly unlike Durant.

The balancing exercise, however, favoured the MPS. Having considered each item of personal data, Green J said his “ultimate conclusion is that there is nothing in the personal data which would be of any real value to the Claimants” (para 125). He expressed his unease with both the procedure and the outcome. Permission to appeal was granted, though Panopticon understands that an appeal is not being pursued by the Claimants.

Anya Proops and Christopher Knight acted for the Defendant.

Robin Hopkins @hopkinsrobin

Privacy and data protection – summer roundup

August 18th, 2015 by Robin Hopkins

August tends to be a quiet month for lawyers. There has, however, been little by way of a summer break in privacy and data protection developments. Here are some August highlights.

Privacy injunction: sexual affairs of sportsman (not philosophers)

Mrs Justice Laing’s August does not appear to have begun restfully. Following a telephone hearing on the afternoon of Saturday 1 August, she granted what became a widely-reported privacy injunction (lasting only until 5 August) restraining the publication of a story about an affair which a prominent sportsman had some years ago: see the judgment in AMC and KLJ v News Group Newspapers [2015] EWHC 2361 (QB).

As usual in such cases, Article 8 and Article 10 rights were relied upon to competing ends. There is no automatic favourite in such contests – an intense focus on the facts is required.

In this case, notwithstanding submissions about the extent to which the affected individuals ‘courted publicity’ or were not ‘private persons’ – there was a reasonable expectation of privacy about a secret sexual affair conducted years ago. The interference needed to be justified.

The right to free expression did not constitute adequate justification without more: “I cannot balance these two incommensurables [Articles 8 and 10] without asking why, and for what purposes, X and R seek to exercise their article 10 rights… The public interest here is, I remind myself, a contribution to a debate in the general interest”.

On the facts, there was insufficient public interest to justify that interference. The sportsman was not found to have hypocritically projected himself as ‘whiter than white’, and his alleged deceits and breaches of protocols in the conducting of his affair were not persuasive – especially years after the event. In any event, the sportsman was a role model for sportsmen or aspiring sportsmen: “he is not a role model for cooks, or for moral philosophers”. The latter point will no doubt be a weight off many a sporting shoulder.

Subject access requests: upcoming appeals

Subject access requests have traditionally received little attention in the courts. As with data protection matters more broadly, this is changing.

Holly Stout blogged earlier this month about the High Court’s judgment in Dawson-Damer and Ors v Taylor Wessing and Ors [2015] EWHC 2366 (Ch). The case concerned legal professional privilege, manual records and relevant filing systems, disproportionate searches and the court’s discretion under section 7(9) DPA. That case is on its way to the Court of Appeal.

So too is the case of Ittihadieh [2015] EWHC 1491 (QB), in which I appeared. That case concerned, among other issues, identification of relevant data controllers and the domestic purposes exemption. It too is on its way to the Court of Appeal.

Subject access requests: the burden of review and redaction

There has also been judgment this month in a County Court case in which I appeared for the Metropolitan Police Service. Mulcahy v MPS, a judgment of District Judge Langley in the Central London County Court, deals in part with the purposes behind a subject access request. It also deals with proportionality and burden, which – as Holly’s recent post discusses – has tended to be a vexed issue under the DPA (see Ezsias, Elliott, Dawson-Damer and the like).

Mulcahy deals with the proportionality of the burden imposed not so much by searching for information within the scope of a subject access request, but by reviewing (and, where necessary, redacting) that information before disclosure. This is an issue which commonly concerns data controllers. The judgment is available here: Mulcahy Judgment.

Privacy damages: Court of Appeal to hear Gulati appeal

May of 2015 saw Mr Justice Mann deliver a ground-breaking judgment on damages awards for privacy breaches: see Gulati & Ors v MGN Ltd [2015] EWHC 1482 (Ch), which concerned victims of phone-hacking (including Paul Gascoigne and Sadie Frost). The awards ranged between £85,000 and £260,250. The judgment and grounds of appeal against the levels of damages awards are explained in this post by Louise Turner of RPC.

Earlier this month, the Court of Appeal granted MGN permission to appeal. The appeal is likely to be expedited. It will not be long before there is a measure of certainty on quantum for privacy breaches.

ICO monetary penalties

Lastly, I turn to privacy-related financial sanctions of a different kind. August has seen the ICO issue two monetary penalty notices.

One was for £50,000 against ‘Stop the Calls’ (ironically, a company which markets devices for blocking unwanted marketing calls) for serious contraventions of regulation 21 of the Privacy and Electronic Communications Regulations 2003 (direct marketing phone calls to persons who had registered their opposition to such calls with the Telephone Preference Service).

Another was for £180,000 for a breach of the seventh data protection principle. It was made against The Money Shop following a burglary in which an unencrypted server containing customers’ personal information was stolen.

Robin Hopkins @hopkinsrobin

Facebook, drag artists and data protection dilemmas: ‘if you stand on our pitch, you must play by our rules’

July 31st, 2015 by Robin Hopkins

Facebook is one of the main battlegrounds between privacy and other social goods such as safety and security.

On the one hand, it faces a safeguarding challenge. Interactions through Facebook have the potential to cause harm: defamation, data protection breaches, stalking, harassment, abuse and the like. One safeguard against such harms is to ensure that users are identifiable, i.e. that they really are who they say they are. This facilitates accountability and helps to ensure that only users of an appropriate age are communicating on Facebook. The ongoing litigation before the Northern Irish courts in the HL case raises exactly these sorts of concerns about child protection.

Part of the solution is Facebook’s ‘real names’ policy: you cannot register using a pseudonym, but only with your official identity.

On the other hand, Facebook encounters an argument which runs like this: individuals should be free to decide how they project themselves in their communications with the world. This means that, provided they are doing no harm, they should in principle be allowed to use whatever identity they like, including pseudonyms, working names (for people who wish to keep their private Facebooking and their professional lives separate) or stage names (particularly relevant for drag artists, for example). The real names policy arguably undermines this element of human autonomy, dignity and privacy. There have been colourful recent protests against the policy on these sorts of grounds.

Which is the stronger argument? Well, the answer to the question seems to depend on who you ask, and where you ask.

The Data Protection Commissioner in Ireland, where Facebook has its EU headquarters, has upheld the real names policy. When one of Germany’s regional Data Protection Commissioners (Schleswig-Holstein) took the opposite view, Facebook challenged his ruling and secured a court victory in 2013. The German court suspended the order against the real names policy and, equally importantly, decided that the challenge should proceed in Ireland, not Germany.

This week, however, another German decision turned the tables on the real names policy yet again. The Hamburg data protection authority upheld a complaint from someone who used a pseudonym on Facebook so as to separate her private and professional communications. The Hamburg DPA found against Facebook and held that it was not allowed unilaterally to change users’ chosen usernames to their real names. Nor was it entitled to demand official identification documents – an issue of particular relevance to child protection issues such as those arising in HL.

The Hamburg ruling is notable on a number of fronts. It exemplifies the tension between privacy – in all its nuanced forms – and other values. It illustrates the dilemmas bedevilling the business models of social media companies such as Facebook.

The case also highlights real challenges for the future of European data protection. The General Data Protection Regulation – currently clawing its way from draft to final form – aspires to harmonised pan-European standards. It includes a mechanism for data protection authorities to co-operate and resolve differences. But if authorities within the same country are prone to divergence on issues such as the real names policy, how optimistic can one be that regulators across the EU will sing from the same hymn sheet?

Important questions arise about data protection and multinational internet companies: in which country (or region, for that matter) should a user raise a complaint to a regulator? If they want to complain to a court, where do they do that? If a German user complains to an Irish regulator or court, to what extent do those authorities have to consider German law?

For the moment, Facebook clearly seeks home ground advantage. But its preference for the Irish forum was rejected by the Hamburg authority in this week’s ruling. Its head is reported as saying that “… Facebook cannot again argue that only Irish Data Protection law would be applicable … anyone who stands on our pitch also has to play our game”.

The draft Regulation has something to say on these matters, but is far from clear as to how to decide on the right pitch and the right rules for vital privacy battles like these.

Robin Hopkins @hopkinsrobin

Facebook, child protection and outsourced monitoring

July 22nd, 2015 by Robin Hopkins

Facebook is no stranger to complaints about the content of posts. Usually, one user complains to Facebook about what other users’ posts say about him. By making the offending posts available, Facebook is processing the complainant’s personal data, and must do so in compliance with data protection law.

More unusually, a user could also complain about their own Facebook posts. Surely a complainant cannot make data protection criticisms about information they deliberately posted about themselves? After all, Facebook processes those posts with the author’s consent, doesn’t it?

Generally, yes – but that will not necessarily be true in every instance, especially when it comes to Facebook posts by children. This is the nature of the complaint in striking litigation currently afoot before the High Court in Northern Ireland.

The case is HL v Facebook Inc, Facebook Ireland Ltd, the Northern Health & Social Care Trust and DCMS [2015] NIQB 61. It is currently only in its preliminary stages, but it raises very interesting and important issues about Facebook’s procedures for preventing underage users from utilising the social network. Those issues are illuminated in the recent judgment of Stephens J, who is no stranger to claims against Facebook – he heard the recent case of CG v Facebook [2015] NIQB 11, concerning posts about a convicted paedophile.

From the age of 11 onwards, HL maintained a Facebook page on which she made posts of an inappropriate sexual nature. She was exposed to responses from sexual predators. She says that Facebook is liable for its failure to prevent her from making these posts. She alleges that Facebook (i) unlawfully processed her sensitive personal data, (ii) facilitated her harassment by others, and (iii) was negligent in failing to have proper systems in place to minimise the risks of children setting up Facebook accounts by lying about their age.

The data protection claim raises a number of issues of great importance to the business of Facebook and others with comparable business models. One is the extent to which a child can validly consent to the processing of their personal data – especially sensitive personal data. Minors are (legitimately or not) increasingly active online, and consent is a cornerstone of online business. The consent issue is one of wide application beyond the HL litigation.

A second issue is whether, in its processing of personal data, Facebook does enough to stop minors using their own personal data in ways which could harm them. In her claim, for example, HL refers to evidence given to a committee of the Australian Parliament – apparently by a senior privacy advisor to Facebook (though Facebook was unable to tell Stephens J who he was). That evidence apparently said that Facebook removes 20,000 under-age user profiles a day.

Stephens J was also referred to comments apparently made by a US Senator to Mark Zuckerberg about the vulnerability of underage Facebook users.

Another element of HL’s case concerns Facebook’s use of an outsourcing company called oDesk, operating for example from Morocco, to moderate complaints about Facebook posts. She calls into question the adequacy of these oversight measures: ‘where then is the oversight body for these underpaid global police?’ (to quote from a Telegraph article referred to in the recent HL judgment). Facebook says that – given its number of users in multiple languages across the globe – effective policing is a tall order (an argument Stephens J summed up at paragraph 22 as ‘the needle in a haystack argument, there is just too much to monitor, the task of dealing with underage users is impossible’).

In short, HL says that Facebook seems to be aware of the scale and seriousness of the problem of underage use of its network and has not done enough to tackle that problem.

Again, the issue is one of wider import for online multinationals for whom personal data is stock-in-trade.

The same goes for the third important data protection issue surfacing in the HL litigation. This concerns jurisdiction, cross-border data controllers and section 5 of the Data Protection Act 1998. For example, is Facebook Ireland established in the UK by having an office, branch or agency, and does it process the personal data in Facebook posts in the context of that establishment?

These issues are all still to be decided. Stephens J’s recent judgment in HL was not about the substantive issues, but about HL’s applications for specific discovery and interrogatories. He granted those applications. In addition to details of HL’s Facebook account usage, he ordered the Facebook defendants to disclose agreements between them and Facebook (UK) Ltd and between them and oDesk (to whom some moderating processes were outsourced). He has also ordered the Facebook defendants to answer interrogatory questions about their procedures for preventing underage Facebook use.

In short, the HL litigation has – thus far – raised difficult data protection and privacy issues which are fundamental to Facebook’s business, and it has required Facebook to lay bare internal details of its safeguarding practices. The case is only just beginning. The substantive hearing, which is listed for next term, could be groundbreaking.

Robin Hopkins @hopkinsrobin

DRIPA 2014 declared unlawful

July 17th, 2015 by Robin Hopkins

In a judgment of the Divisional Court handed down this morning, Bean LJ and Collins J have declared section 1 of the Data Retention and Investigatory Powers Act 2014 (DRIPA) to be unlawful.

For the background to that legislation, see our posts on Digital Rights Ireland and then on the UK’s response, i.e. passing DRIPA in an attempt to preserve data retention powers.

That attempt has today suffered a serious setback via the successful challenges brought by the MPs David Davis and Tom Watson, as well as Messrs Brice and Lewis. The Divisional Court did, however, suspend the effect of its order until after 31 March 2016, so as to give Parliament time to consider how to put things right.

Analysis to follow in due course, but for now, here is the judgment: Davis Watson Judgment.

Robin Hopkins @hopkinsrobin

Google and the ordinary person’s right to be forgotten

July 15th, 2015 by Robin Hopkins

The Guardian has reported today on data emerging from Google about how it has implemented the Google Spain ‘right to be forgotten’ principle over the past year or so: see this very interesting article by Julia Powles.

While the data is rough-and-ready, it appears to indicate that the vast majority of RTBF requests actioned by Google have concerned ‘ordinary people’. By that I mean people who are neither famous nor infamous, and who seek not to have high-public-interest stories erased from history, but to have low-public-interest personal information removed from the fingertips of anyone who cares to Google their name. Okay, that explanation is itself rough-and-ready, but you get the point: most RTBF requests come not from Max Mosley types, but from Mario Costeja González types (he being the man who brought the Google Spain complaint in the first place).

As Julia Powles points out, today’s rough-and-ready data is thus far the best we have to go on in terms of understanding how the RTBF is actually working in practice. There is very little transparency on this. Blame for that opaqueness cannot fairly be levelled only at Google and its ilk – though, as the Powles article argues, they may have a vested interest in maintaining that opaqueness. Opaqueness was inevitable following a judgment like Google Spain, and European regulators have, perhaps forgivably, not yet produced detailed guidance at a European level on how the public can expect such requests to be dealt with. In the UK, the ICO has given guidance (see here) and initiated a complaints process (see here).

Today’s data suggests to me that a further reason for this opaqueness is the ‘ordinary person’ factor: the Max Mosleys of the world tend to litigate (and then settle) when they are dissatisfied, but the ordinary person tends not to (Mr González being an exception). We remain largely in the dark about how this web-shaping issue works.

So: the ordinary person is most in need of transparent RTBF rules, but least equipped to fight for them.

How might that be resolved? Options seem to me to include some combination of (a) clear regulatory guidance, tested in the courts, (b) litigation by a Max Mosley-type figure which runs its course, (c) litigation by more Mr González figures (i.e. ordinary individuals), (d) litigation by groups of ordinary people (as in Vidal Hall, for example) – or perhaps (e) litigation by members of the media who object to their stories disappearing from Google searches.

The RTBF is still in its infancy. Google may be its own judge for now, but one imagines not for long.

Robin Hopkins @hopkinsrobin

Austria will not host Europe vs Facebook showdown

July 6th, 2015 by Robin Hopkins

As illustrated by Anya Proops’ recent post on a Hungarian case currently before the CJEU, the territorial jurisdiction of European data protection law can raise difficult questions.

Such questions have bitten hard in the Europe vs Facebook litigation. Max Schrems, an Austrian law graduate, is spearheading a massive class action in which some 25,000 Facebook users allege numerous data protection violations by the social media giant. Those include: unlawful obtaining of personal data (including via plug-ins and “like” buttons); invalid consent to Facebook’s processing of users’ personal data; use of personal data for impermissible purposes, including the unlawful analysing of data/profiling of users (“the Defendant analyses the data available on every user and tries to explore users’ interests, preferences and circumstances…”); unlawful sharing of personal data with third parties and third-party applications. The details of the claim are here.

Importantly, however, the claim is against Facebook Ireland Ltd, a subsidiary of the California-based Facebook Inc. The class action has been brought in Austria.

Facebook challenged the Austrian court’s jurisdiction. Last week, it received a judgment in its favour from the Viennese Regional Civil Court. The Court held that it lacked jurisdiction in part because Mr Schrems was not deemed to be a ‘consumer’ of Facebook’s services, and in part because Austria was not the right place to bring the claim. Facebook argued that the claim should be brought either in Ireland or in California, and the Court agreed.

Mr Schrems has announced his intention to appeal. In the meantime, the Austrian decision will continue to raise both eyebrows and questions, particularly given that a number of other judgments in recent years have seen European courts accepting jurisdiction to hear claims against internet companies (such as Google: see Vidal-Hall, for example) based elsewhere.

The Austrian decision also highlights the difficulties of the ‘one-stop shop’ principle which remains part of the draft Data Protection Regulation (albeit in a more nuanced and complicated formulation than had earlier been proposed). In short, why should an Austrian user have to sue in Ireland?

Panopticon will report on any developments in this case in due course. It will also report on the other strand of Mr Schrems’ privacy campaign, namely his challenge to the lawfulness of the Safe Harbour regime for the transferring of personal data to the USA. That challenge has been heard by the CJEU, and the Advocate General’s opinion is imminent. The case will have major implications for those whose business involves transatlantic data transfers.

Robin Hopkins @hopkinsrobin

Google Spain, freedom of expression and security: the Dutch fight back

March 13th, 2015 by Robin Hopkins

The Dutch fighting back against the Spanish, battling to cast off the control exerted by Spanish decisions over Dutch ideologies and value judgments. I refer of course to the Eighty Years’ War (1568-1648), which in my view is a sadly neglected topic on Panopticon.

The reference could also be applied, without too much of a stretch, to data protection and privacy rights in 2015.

The relevant Spanish decision in this instance is of course Google Spain, which entrenched what has come to be called the ‘right to be forgotten’. The CJEU’s judgment on the facts of that case saw privacy rights trump most other interests. The judgment has come in for criticism from advocates of free expression.

The fight-back by free expression (and Google) has found the Netherlands to be its most fruitful battleground. In 2014, a convicted criminal’s legal battle to have certain links about his past ‘forgotten’ (in the Google Spain sense) failed.

This week, a similar challenge was also dismissed. This time, a KPMG partner sought the removal of links to stories about him allegedly having to live in a container on his own estate (because a disgruntled builder, unhappy over allegedly unpaid fees, changed the locks on the house!).

In a judgment concerned with preliminary relief, the Court of Amsterdam rejected his application, finding in Google’s favour. There is an excellent summary on the Dutch website Media Report here.

The Court found that the news stories to which the complained-of Google links related remained relevant in light of public debates on this story.

Importantly, the Court said of Google Spain that the right to be forgotten “is not meant to remove articles which may be unpleasant, but not unlawful, from the eyes of the public via the detour of a request for removal to the operator of a search machine.”

The Court gave very substantial weight to the importance of freedom of expression, something which Google Spain’s critics say was seriously underestimated in the latter judgment. If this judgment is anything to go by, there is plenty of scope for lawyers and parties to help Courts properly to balance privacy and free expression.

Privacy rights wrestle not only against freedom of expression, but also against national security and policing concerns.

In The Hague, privacy has recently grabbed the upper hand over security concerns. The District Court of The Hague has this week found that Dutch law on the retention of telecommunications data should be struck down due to its incompatibility with privacy and data protection rights. This is the latest in a line of cases challenging such data retention laws, the most notable of which was the CJEU’s judgment in Digital Rights Ireland, on which see my post here. For a report on this week’s Dutch judgment, see this article by Maarten van Tartwijk in The Wall Street Journal.

As that article suggests, the case illustrates the ongoing tension between security and privacy. In the UK, security initially held sway as regards the retention of telecoms data: see the DRIP Regulations 2014 (and Panopticon passim). That side of the argument has gathered some momentum of late, in light of (for example) the Paris massacres and revelations about ‘Jihadi John’.

Just this week, however, the adequacy of UK law on security agencies has been called into question: see the Intelligence and Security Committee’s report entitled “Privacy and Security: a modern and transparent legal framework”. There are also ongoing challenges in the Investigatory Powers Tribunal – for example this one concerning Abdul Hakim Belhaj.

So, vital ideological debates continue to rage. Perhaps we really should be writing more about 17th century history on this blog.

Robin Hopkins @hopkinsrobin

Googling Orgies – Thrashing out the Liability of Search Engines

January 30th, 2015 by Christopher Knight

Back in 2008, the late lamented News of the World published an article under the headline “F1 boss has sick Nazi orgy with 5 hookers”. It had obtained footage of an orgy involving Max Mosley and five ladies of dubious virtue, all of whom were undoubtedly (despite the News of the World having blocked out their faces) not Mrs Mosley. The breach of privacy proceedings before Eady J (Mosley v News Group Newspapers Ltd [2008] EWHC 687 (QB)) established that the ‘Nazi’ allegation was unfounded and unfair, that the footage was filmed by a camera secreted in “such clothing as [one of the prostitutes] was wearing” (at [5]), and also the more genteel fact that even S&M ‘prison-themed’ orgies stop for a tea break (at [4]), rather like a pleasant afternoon’s cricket, but with a rather different thwack of willow on leather.

Since that time, Mr Mosley’s desire to protect his privacy and allow the public to forget his penchant for themed tea breaks has led him to bring or fund ever more litigation, whilst simultaneously managing to remind as many people as possible of the original incident. His latest trip to the High Court concerns the inevitable fact of the internet age that the photographs and footage obtained and published by the News of the World remain readily available for those in possession of a keyboard and a strong enough constitution. They may not rival last year’s iCloud hacks in popularity, but they can be found.

Alighting upon the ruling of the CJEU in Google Spain that a search engine is a data controller for the purposes of the Data Protection Directive (95/46/EC) (on which see the analysis here), Mr Mosley claimed that Google was obliged, under section 10 of the Data Protection Act 1998, to prevent processing of his personal data where he served a notice requesting it to do so, in particular by not blocking access to the images and footage which constitute his personal data. He also alleged misuse of private information. Google denied both claims and sought to strike them out. The misuse of private information claim being (or soon to be) withdrawn, Mitting J declined to strike out the DPA claim: Mosley v Google Inc [2015] EWHC 59 (QB). He has, however, stayed the claim for damages under section 13 pending the Court of Appeal’s decision in Vidal-Hall v Google (on which see the analysis here).

Google ran a cunning defence to what, post-Google Spain, might be said to be a strong claim on the part of a data subject. It relied on Directive 2000/31/EC, the E-Commerce Directive. Article 13 protects internet service providers from liability for the cached storage of information, providing they do not modify the information. Mitting J was content to find that by storing the images as thumbnails, Google was not thereby modifying the information in any relevant sense: at [41]. Article 15 of the E-Commerce Directive also prohibits the imposition of a general obligation on internet service providers to monitor the information they transmit or store.

The problem for Mitting J was how to resolve the interaction between the E-Commerce Directive and the Data Protection Directive: the latter gives a data subject rights which apparently extend to cached information held by internet service providers, while the former apparently absolves those providers of legal responsibility for that very information. It was pointed out that recital (14) and article 1.5(b) of the E-Commerce Directive appeared to make that instrument subject to the Data Protection Directive. It was also noted that Google’s argument did not sit very comfortably with the judgment (or at least the effect of the judgment) of the CJEU in Google Spain.

Mitting J indicated that there were only two possible answers: either the Data Protection Directive formed a comprehensive code, or the two must be read in harmony and given full effect to: at [45]. His “provisional preference is for the second one”: at [46]. Unfortunately, the judgment does not then go on to consider why that is so, or more importantly, how both Directives can be read in harmony and given full effect to. Of course, on a strike out application provisional views are inevitable, but it leaves rather a lot of legal work for the trial judge, and one might think that it would be difficult to resolve the interaction without a reference to the CJEU. What, for example, is the point of absolving Google of liability for cached information if that does not apply to any personal data claims, which will be a good way of re-framing libel/privacy claims to get around Article 13?

The Court also doubted that Google’s technology really meant that it would have to engage in active monitoring, contrary to Article 15, because it may be able to comply without “disproportionate effort or expense”: at [54]. That too was something for the trial judge to consider.

So, while the judgment of Mitting J is an interesting interlude in the ongoing Mosley litigation saga, the final word certainly awaits a full trial (and/or any appeal by Google), and possibly a reference. All the judgment decides is that Mr Mosley’s claim is not so hopeless it should not go to trial. Headlines reading ‘Google Takes a Beating (with a break for tea)’ would be premature. But the indications given by Mitting J are not favourable to Google, and it may well be that the footage of Mr Mosley will not be long for the internet.

Christopher Knight