Interfering with the fundamental rights of practically the entire European population

April 10th, 2014 by Robin Hopkins

In the Digital Rights Ireland case, the Grand Chamber of the CJEU has this week declared invalid the 2006 Directive which provides for the mass retention – and disclosure to policing and security authorities – of individuals’ online traffic data. It found this regime to be a disproportionate interference with privacy rights. Depending on your perspective, this is a major step forward for digital privacy, or a major step backwards in countering terrorism and serious crime. It probably introduces even more uncertainty in terms of the wider project of data protection reform at the EU level. Here is my synopsis of this week’s Grand Chamber judgment.

Digital privacy vs national security: a brief history

There is an overlapping mesh of rights under European law which aims to protect citizens’ rights with respect to their personal data – an increasingly important strand of the broader right to privacy. The Data Protection Directive (95/46/EC) was passed in 1995, when the internet was in its infancy. It provides that personal data must be processed (obtained, held, used, disclosed) fairly and lawfully, securely, for legitimate purposes and so on.

Then, as the web began to mature into a fundamental aspect of everyday life, a supplementary Directive was passed in 2002 (2002/58/EC) on privacy and electronic communications. It is about privacy, confidentiality and the free movement of electronic personal data in particular.

In the first decade of the 21st century, however, security objectives became increasingly urgent. Following the London bombings of 2005 in particular, the monitoring of would-be criminals’ web activity was felt to be vital to effective counter-terrorism and law enforcement. The digital confidentiality agenda needed to make space for a measure of state surveillance.

This is how Directive 2006/24 came to be. In a nutshell, it provides for traffic and location data (rather than content-related information) about individuals’ online activity to be retained by communications providers and made available to policing and security bodies. This data is to be held for a minimum of six months and a maximum of 24 months.

That Directive – like all others – is however subject to the EU’s Charter of Fundamental Rights. Article 7 of that Charter enshrines the right to respect for one’s private and family life, home and communications. Article 8 is about the right to the protection and fair processing of one’s personal data.

Privacy and Digital Rights Ireland prevail

Digital Rights Ireland took the view that the 2006 Directive was not compatible with those fundamental rights. It asked the Irish Courts to refer the question to the CJEU. Similar references were made in separate litigation before the Austrian Courts.

The CJEU gave its answer this week. In Digital Rights Ireland Ltd v Minister for Communications, Marine and Natural Resources and Others (C‑293/12) joined with Kärntner Landesregierung and Others (C‑594/12), the Grand Chamber held the 2006 Directive to be invalid on the grounds of its incompatibility with fundamental privacy rights.

The Grand Chamber accepted that, while privacy rights were interfered with, this was in pursuit of compelling social objectives (the combatting of terrorism and serious crime). The question was one of proportionality. Given that fundamental rights were being interfered with, the Courts would allow the European legislature little leeway: anxious scrutiny would be applied.

Here, in no particular order, are some of the reasons why the 2006 Directive failed its anxious scrutiny test (quotations are all from the Grand Chamber’s judgment). Unsurprisingly, this reads rather like a privacy impact assessment which data controllers are habitually called upon to conduct.

The seriousness of the privacy impact

First, consider the nature of the data which, under Articles 3 and 5 of the 2006 Directive, must be retained and made available. “Those data make it possible, in particular, to know the identity of the person with whom a subscriber or registered user has communicated and by what means, and to identify the time of the communication as well as the place from which that communication took place. They also make it possible to know the frequency of the communications of the subscriber or registered user with certain persons during a given period.”

This makes for a serious incursion into privacy: “Those data, taken as a whole, may allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained, such as the habits of everyday life, permanent or temporary places of residence, daily or other movements, the activities carried out, the social relationships of those persons and the social environments frequented by them.”

Second, consider the volume of data gathered and the number of people affected. Given the ubiquity of internet communications, the 2006 Directive “entails an interference with the fundamental rights of practically the entire European population”.

Admittedly, the 2006 regime does not undermine “the essence” of data protection rights (because it is confined to traffic data – the contents of communications are not retained), and is still subject to data security rules (see the seventh data protection principle under the UK’s DPA 1998).

Nonetheless, this is a serious interference with privacy rights. It has objective and subjective impact: “it is wide-ranging, and it must be considered to be particularly serious… the fact that data are retained and subsequently used without the subscriber or registered user being informed is likely to generate in the minds of the persons concerned the feeling that their private lives are the subject of constant surveillance.”

Such a law, said the Grand Chamber, can only be proportionate if it includes clear and precise laws governing the scope of the measures and providing minimum safeguards for individual rights. The 2006 Directive fell short of those tests.

Inadequate rules, boundaries and safeguards

The regime has no boundaries, in terms of affected individuals: it “applies even to persons for whom there is no evidence capable of suggesting that their conduct might have a link, even an indirect or remote one, with serious crime”.

It also makes no exception for “persons whose communications are subject, according to rules of national law, to the obligation of professional secrecy”.

There are no sufficiently specific limits on the circumstances in which the retained data can be accessed by security bodies, on the purposes to which that data can be put by those bodies, or on the persons with whom those bodies may share the data.

There are no adequate procedural safeguards: no court or administrative authority is required to sign off access to the data.

There are also no objective criteria for justifying the retention period of 6-24 months.

The Grand Chamber’s conclusion

In summary, the Grand Chamber found that “in the first place, Article 7 of Directive 2006/24 does not lay down rules which are specific and adapted to (i) the vast quantity of data whose retention is required by that directive, (ii) the sensitive nature of that data and (iii) the risk of unlawful access to that data, rules which would serve, in particular, to govern the protection and security of the data in question in a clear and strict manner in order to ensure their full integrity and confidentiality. Furthermore, a specific obligation on Member States to establish such rules has also not been laid down…”

There was also an international transfer aspect to its concern: “in the second place, it should be added that that directive does not require the data in question to be retained within the European Union…”

This last point is of course highly relevant to another of the stand-offs between digital privacy and national security which looms in UK litigation, namely the post-Snowden litigation against security bodies.

Robin Hopkins @hopkinsrobin

The Google/Safari users case: a potential revolution in DPA litigation?

January 16th, 2014 by Robin Hopkins

I posted earlier on Tugendhat J’s judgment this morning in Vidal-Hall and Others v Google Inc [2014] EWHC 13 (QB). The judgment is now available here – thanks as ever to Bailii.

This is what the case is about: a group of claimants say that, by tracking and collating information relating to their internet usage on the Apple Safari browser without their consent, Google (a) misused their private information (b) breached their confidences, and (c) breached its duties under the Data Protection Act 1998 – in particular, under the first, second, sixth and seventh data protection principles. They sought damages and injunctive relief.

As regards damages, “what they claim damages for is the damage they suffered by reason of the fact that the information collected from their devices was used to generate advertisements which were displayed on their screens. These were targeted to their apparent interests (as deduced from the information collected from the devices they used). The advertisements that they saw disclosed information about themselves. This was, or might have been, disclosed also to other persons who either had viewed, or might have viewed, these same advertisements on the screen of each Claimant’s device” (paragraph 24).

It is important to note that “what each of the Claimants claims in the present case is that they have suffered acute distress and anxiety. None of them claims any financial or special damage. And none of them claims that any third party, who may have had sight of the screen of a device used by them, in fact thereby discovered information about that Claimant which was detrimental” (paragraph 25).

The Claimants needed permission to serve proceedings on the US-based Google. They got permission and served their claim forms. Google then sought to have that service nullified, by seeking an order declaring that the English court has no jurisdiction to try these particular claims (i.e. it was not saying that it could never be sued in the English courts).

Tugendhat J disagreed – as things stand, the claims will now progress before the High Court (although Google says it intends to appeal).

Today’s judgment focused in part on construction of the CPR rules about service outside of this jurisdiction. I wanted to highlight some of the other points.

One of the issues was whether the breach of confidence and misuse of private information claims were “torts”. Tugendhat J said this of the approach: “Judges commonly adopt one or both of two approaches to resolving issues as to the meaning of a legal term, in this case the word “tort”. One approach is to look back to the history or evolution of the disputed term. The other is to look forward to the legislative purpose of the rule in which the disputed word appears”. Having looked to the history, he observed that “history does not determine identity. The fact that dogs evolved from wolves does not mean that dogs are wolves”.

The outcome (paragraphs 68-71): misuse of private information is a tort (and the oft-cited proposition that “the tort of invasion of privacy is unknown in English law” needs revisiting) but breach of confidence is not (given Kitetechnology BV v Unicor GmbH Plastmaschinen [1995] FSR 765).

Google also objected to the DPA claims being heard. This was partly because they were raised late; this objection was dismissed.

Google also said that, based on Johnson v MDU [2007] EWCA Civ 262; (2007) 96 BMLR 99, financial loss was required before damages under section 13 of the DPA could be awarded. Here, the Claimants alleged no financial loss. The Claimants argued against the Johnson proposition: they relied on Copland v UK 62617/00 [2007] ECHR 253, argued for a construction of the DPA that accords with Directive 95/46/EC as regards relief, and argued that – unlike in Johnson – this was a case in which their Article 8 ECHR rights were engaged. Tugendhat J has allowed this to proceed to trial, where it will be determined: “This is a controversial question of law in a developing area, and it is desirable that the facts should be found”.

If the Johnson approach is overturned – i.e. if the requirement for financial loss is dispensed with, at least for some types of DPA claim – then this could revolutionise data protection litigation in the UK. Claims under section 13 could be brought without claimants having suffered financially due to the alleged DPA breaches they have suffered.

Tugendhat J went on to find that there were sufficiently serious issues to be tried here so as to justify service out of the jurisdiction – it could not be said that they were “not worth the candle”.

Further, there was an arguable case that the underlying information was, contrary to Google’s case, “private” and that it constituted “personal data” for DPA purposes (Google say the ‘identification’ limb of that definition is not met here).

Tugendhat J was also satisfied that this jurisdiction was “clearly the appropriate one” (paragraph 134). He accepted the argument of Hugh Tomlinson QC (for the Claimants) that “in the world in which Google Inc operates, the location of documents is likely to be insignificant, since they are likely to be in electronic form, accessible from anywhere in the world”.

Subject to an appeal from Google, the claims will proceed in the UK. Allegations about Google’s conduct in other countries are unlikely to feature. Tugendhat J indicated a focus on what Google has done in the UK, to these individuals: “I think it very unlikely that a court would permit the Claimants in this case to adduce evidence of what Mr Tench refers to as alleged wrongdoing by Google Inc against other individuals, in particular given that it occurred in other parts of the world, governed by laws other than the law of England” (paragraph 47).

Robin Hopkins @hopkinsrobin

High Court to hear Safari users’ privacy claim against Google

January 16th, 2014 by Robin Hopkins

Panopticon has from time to time reported on Google’s jurisdictional argument when faced with privacy/data protection actions in European countries: it tends to argue that such claims should be dismissed and must be brought in California instead. This argument is not always successful.

The same jurisdictional argument was advanced before Mr Justice Tugendhat in response to a claim brought by a group calling itself ‘Safari Users Against Google’s Secret Tracking’ who, as their name suggests, complain that Google unlawfully gathers data from Safari browser usage.

This morning, Mr Justice Tugendhat dismissed that jurisdictional argument. The case can be heard in the UK. Matthew Sparkes reports in the Daily Telegraph that the judge said “I am satisfied that there is a serious issue to be tried in each of the claimant’s claims for misuse of private information” and that “the claimants have clearly established that this jurisdiction is the appropriate one in which to try each of the above claims”.

The same article says that Google will appeal. This follows Google’s announcement yesterday that it will appeal a substantial fine issued by the French data protection authority for unlawful processing (gathering and storing) of user data.

Panopticon will continue to gather data on these and other Google-related matters.

Robin Hopkins @hopkinsrobin

Facebook fan pages: data protection buck stops with Facebook, not page owners

October 22nd, 2013 by Robin Hopkins

In Re Facebook, VG, Nos. 8 A 37/12, 8 A 14/12, 8 A 218/11, 10/9/13, the Schleswig-Holstein Administrative Court has allowed Facebook’s appeals against rulings of the regional data protection authority (the ULD), Thilo Weichert.

The case involved a number of companies’ use of Facebook fan pages. The ULD’s view was that Facebook breached German privacy law, including through its use of cookies, facial recognition and other data processing. He considered that, by using Facebook fan pages, the companies were facilitating Facebook’s violations by processing users’ personal data on those pages. He ordered them to shut down the fan pages or face fines of up to €50,000.

The appellant companies argued that they could not be held responsible for data protection violations (if any) allegedly committed by Facebook, as they had no control over how that data on the pages was processed and used by the social networking site. The Administrative Court agreed.

The case raises interesting questions about where the buck stops in terms of data processing – both in terms of who controls the processing, and in terms of where they are based. Facebook is based in Ireland, without a substantive operational presence in Germany. Earlier this year, the Administrative Court found – again against the Schleswig-Holstein ULD’s ruling – that Facebook’s ‘real names’ policy (i.e. a ban on pseudonymised profiles) was a matter for Irish rather than German law.

The ULD is unlikely to be impressed by the latest judgment, given that he is reported as having said in 2011 that:

“We see a much bigger privacy issue behind the Facebook case: the main business model of Google, Apple, Amazon and others is based on privacy law infringements. This is the reason why Facebook and all the other global internet players are so reluctant in complying with privacy law: they would lose their main profit resource.”

For more on this story, see links here and here.

Robin Hopkins

Fingerprints requirement for passport does not infringe data protection rights

October 22nd, 2013 by Robin Hopkins

Mr Schwarz applied to his regional authority, the city of Bochum, for a passport. He was required to submit a photograph and fingerprints. He did not like the fingerprint part. He considered it unduly invasive. He refused. So Bochum refused to give him a passport. He asked the court to order it to give him one. The court referred questions to the Court of Justice of the European Union about whether the requirement to submit fingerprints in addition to photographs complied with the Data Protection Directive 95/46/EC.

Last week, the Fourth Chamber of the CJEU gave its judgment: the requirement is data protection-compliant.

The requirement had a legal basis, namely Article 1(2) of Council Regulation 2252/2004, which set down minimum security standards for identity-confirmation purposes in passports.

This pursued a legitimate aim, namely preventing illegal entry into the EU.

Moreover, while the requirements entailed the processing of personal data and an interference with privacy rights, the ‘minimum security standards’ rules continued to “respect the essence” of the individual’s right to privacy.

The fingerprint requirement was proportionate because while the underlying technology is not 100% successful in fraud-detection terms, it works well enough. The only real alternative as an identity-verifier is an iris scan, which is no less intrusive and is technologically less robust. The taking of fingerprints is not very intrusive or intimate – it is comparable to having a photograph taken for official purposes, which people don’t tend to complain about when it comes to passports.

Importantly, the underlying Regulation provided that the fingerprints could only be used for identity-verification purposes and that there would be no central database of fingerprints (instead, each set is stored only in the passport).

This is all common-sense stuff in terms of data protection compliance. Data controllers take heart!

Robin Hopkins

Refusal to destroy part of a ‘life story’ justified under Article 8(2) ECHR

October 4th, 2013 by Robin Hopkins

The High Court of Justice (Northern Ireland) has today given judgment In the matter of JR60’s application for judicial review [2013] NIQB 93. The applicant sought to challenge the right of the two Social Care Trusts to keep and use various records generated when she was a resident of children’s homes and a training school between the years 1978-1991.

In most cases of challenges to the retention of records, the applicant seeks to expunge information which suggests they have done wrong. This application is interesting because it focused (though not exclusively) on what the applicant had suffered, as opposed to what she had done. In short, she wished to erase from the record a part of her life story which was painful for her to recall. The application failed: there were weightier reasons for retaining those records, and in any event whatever her current wish to forget matters of such import, she might come to change her mind.

The applicant was described as having had a very difficult childhood, to which those records relate. It was not known who her father was. She had grown up to achieve impressive qualifications. Horner J described her as having “survived the most adverse conditions imaginable and triumphed through the force of her will. By any objective measurement she is a success”.

She wished to move on, and to have the records about her childhood expunged. The Trusts refused; their policy was to retain such information for a 75-year period. The applicant challenged this refusal on Article 8 ECHR grounds. Horner J readily agreed that the retention of such information interfered with her rights under Article 8, but dismissed her application on the grounds that the interference was justified.

The applicant had argued that (i) she did not intend to make any claim for ill-treatment or abuse while she was in care, (ii) she did not want to retrieve information about her life story, (iii) she did not want the records to be used to carry out checks on her, as persons who were not in care would not be burdened by such records in respect of their early lives, and (iv) she did not want others, including her own child, to be able to access these records.

In response to the applicant’s assertion that she did not want and did not envisage wanting access to her records, Horner J said this at paragraph 19:

“Even if the applicant does not want to know at present what is in her records, it does not follow that she may not want to find out in the future what they contain for all sorts of reasons. She may, following the birth of a grandchild, be interested in her personal history for that grandchild’s sake. She may want to find out about her genetic inheritance because she may discover, for example, that she, or her off-spring, is genetically predisposed to a certain illness whether mental or physical. She may want to know whether or not this has been passed down through her mother’s side or her father’s side. There may be other reasons about which it is unnecessary to speculate that will make her want to seek out her lost siblings. There are any number of reasons why she may change her mind in the future about accessing her care records. Of course, if the records are destroyed then the opportunity to consider them is lost forever.”

The Trusts argued that they needed to retain such records for the purposes of their own accountability, any background checks on the applicant or related individuals which may become necessary, for the purposes of (hypothetical) public interest issues such as inquiries, and for responding to subject access requests under the Data Protection Act 1998. Horner J observed that the “right for an individual to be able to establish details of his or her identity applies not just to the Looked After Child but also, inter alia, to that child’s offspring”.

In the circumstances, the application failed; the Trusts’ interference with the applicant’s Article 8 rights was justified.

Horner J added a short concluding observation about the DPA (paragraph 29):

“It is significant that no challenge has been made to the Trust’s storage of personal information of the applicant on the basis that such storage constitutes a breach of the Data Protection Act 1998. This act strengthens the safeguards under the 1984 Act which it replaced. The Act protects “personal data which is data relating to a living individual who can be identified from data whether taken alone or read with other information which is in the possession (or is likely to come into possession) of the data controller”: see 12-63 of Clayton and Tomlinson on The Law of Human Rights (2nd Edition). It will be noted that “personal” has been interpreted as almost meaning the same as “private”: see Durant v Financial Services Authority [2004] FSR 28 at paragraph [4].”

Robin Hopkins

One hundred years of solicitude

July 29th, 2013 by Robin Hopkins

In 2004, a man known as TD was arrested for an alleged sexual assault. He was interviewed twice. No further action was taken. The biometric data was in due course destroyed, as will be the case with others in such positions, thanks to provisions of the Protection of Freedoms Act 2012. But 40 pages of information about his arrest and the allegation are to be retained by the Metropolitan Police in the form of crime reports, and a record shall be retained on the Police National Computer until 2104, when the claimant would be 128 years old. The Metropolitan Police’s policy (of August 2012) concerning Serious Specified Offences provides for retention of such information – without review – for a century. It contends that such long-term policing solicitude as regards these types of allegations is supported by research conducted by University College London in 2009.

TD sought judicial review of this retention decision (i.e. the refusal to delete this information). Last week, in R (TD) v Commissioner of Police for the Metropolis and Secretary of State for the Home Department [2013] EWHC 2231 (Admin), Moses LJ and Burnett J dismissed his application.

The Court surveyed the relevant line of domestic and Strasbourg authorities which have abounded in recent years: R(L), R (C) and (J), S v UK, Catt, MM v UK (the majority of which are covered in Panopticon’s archive).

The Police said its policy will need to be reviewed, but that it was too early to say that the records about TD are of no use.

Moses LJ said this (paragraph 14):

“It is necessary to be cautious as to how far the considerations of the use to which the records may be put take the Commissioner.  Every record of an allegation of crime may be of use for the indefinite future, as the research to which the Commissioner refers demonstrates.  This was the very argument on which the United Kingdom Government relied in Strasbourg in S, relying on the “inestimable value” of the data [91].  But S shows that the fact that material is of potential use, and, certainly, of greater use than in Catt, is not dispositive.  Weighed against that there remains the discomfort or worse that any citizen must feel when the state retains personal information about him, particularly when it relates to an allegation, however unfounded, of a sexual nature.  In S, it was recognised that the mere storage and retention of the data amounted to an interference within the meaning of Article 8 (para 67).”

He concluded, however (and Burnett J agreed) that (paragraph 16):

“In my view, now that only nine years have elapsed and in the knowledge that access to the information is restricted to those who seek to investigate a crime it seems to me, like Richards LJ in J, that the Commissioner has demonstrated that the use to which the records of the allegation may be put justifies their retention, at least for the time being.”

The important qualifier was that the Police’s policy should provide for a review of the retention decision, but again, it was considered too early to order any such review in this case.

This will not be the last in this line of cases. The jurisprudential debate about balancing policing utility with the privacy rights of suspects – particularly concerning the question ‘how long is too long?’  – continues.

Robin Hopkins (@hopkinsrobin)

Anonymity: publication and open justice

July 11th, 2013 by Robin Hopkins

The tension between transparency and individual privacy is part of what makes information rights such a fascinating and important area. When it comes to high-public interest issues involving particular individuals, prevailing wisdom has tended to be something like this: say as much as possible on an open basis, but redact and anonymise so as to protect the identity of the individuals involved. Increasingly, however, transparency is outmuscling privacy. See for example my post about the Tribunal’s order of disclosure, in the FOIA context, of the details of the compensation package of a Chief Executive of an NHS Trust (the case of Dicker v IC (EA/2012/0250)).

The recent Care Quality Commission debate is the highest-profile recent illustration: the health regulator published a consultant’s report into failings regarding the deaths of babies at Furness General Hospital, but withheld the names of the individuals being criticised (including for alleged ‘cover-ups’), relying on the Data Protection Act 1998. The anonymisation was not endorsed by the Information Commissioner, and attracted widespread criticism in media and political circles. Transparency pressures held sway.

In a similar vein, the BBC has come under great pressure over the past week – particularly from Parliament’s Public Accounts Committee – to reveal the names of approximately 150 departing senior managers who received pay-offs averaging £164,000 in the past three years. As the Telegraph reports, the Committee is threatening to use parliamentary privilege to publish those names. The BBC admits that it “got things wrong” by overpaying in many cases (as confirmed by the National Audit Office), but is concerned to protect the DPA and privacy rights of the affected individuals, as well as to safeguard its own independence. The Committee says the public interest in transparency is compelling; Lord Patten, chair of the BBC Trust, says there will be “one hell of an argument” about this.

Such arguments become all the more thorny in the context of open justice disputes, of which there have been a number in recent weeks.

In the matter of Global Torch Ltd/Apex Global Management Ltd (The Guardian, The Financial Times and others intervening) [2013] EWCA Civ 819 involved competing petitions of unfair prejudice alleging misconduct in the affairs of a particular company. Two Saudi Arabian princes and one of their private advisers applied to have the interlocutory hearings held in private under CPR rule 39.2(3). The Court of Appeal agreed with the judge who dismissed those applications. It rejected the contention that the judge had elevated open justice above Article 8 ECHR rights as a matter of law. Rather, he noted that some general presumptions were valid (for example, open justice is likely to trump reputational damage) and applied those in the factual context of this case. Maurice Kay LJ said (paragraph 34) that there was sometimes a “need for a degree of protection so as to avoid the full application of the open justice principle exposing a victim to the very detriment which his cause of action is designed to prevent… If such an approach were to be extended to a case such as the present one, it could equally be applied to countless commercial and other cases in which allegations of serious misconduct are made. That would result in a significant erosion of the open justice principle. It cannot be justified where adequate protection exists in the form of vindication of the innocent through the judicial process to trial”.

Open justice is of course fundamental not only to freedom of expression, but is also the default setting for fair trials. This is illustrated in the regulatory/disciplinary context by Miller v General Medical Council [2013] EWHC 1934 (Admin). The case involved a challenge to a decision by a Fitness to Practise Panel of the Council’s Medical Practitioners Tribunal Service that a fitness to practise hearing should take place in private because it considered that the complainant, a former patient of the claimant, was otherwise unlikely to give evidence. HHJ Pelling quashed the decision; there was insufficient evidence for the Panel’s conclusion about witness participation, and in any event the Panel “fell into error at the outset by not reminding itself sufficiently strongly or at all that the clear default position under Article 6 is that the hearing should be in public. It failed to remind itself that Article 6 creates or declares rights that are the rights of the Claimant and that it was for the GMC to prove both the need for any derogation from those rights and for a need to derogate to the extent claimed” (paragraph 20).

Robin Hopkins

Prism and Tempora: Privacy International commences legal action

July 10th, 2013 by Robin Hopkins

Panopticon has reported in recent weeks that, following the Edward Snowden/Prism disclosures, Liberty has brought legal proceedings against the UK’s security bodies. This week, Privacy International has announced that it too is bringing a claim in the Investigatory Powers Tribunal – concerning both the Prism and Tempora programmes. It summarises its claim in these terms:

“Firstly, for the failure to have a publicly accessible legal framework in which communications data of those located in the UK is accessed after obtained and passed on by the US National Security Agency through the Prism programme.  Secondly, for the indiscriminate interception and storing of huge amounts of data via tapping undersea fibre optic cables through the Tempora programme.”

Legal complaints concerning Prism-related data transfers have also been made elsewhere, on data protection grounds. A group of students belonging to a campaign called Europe vs. Facebook have filed complaints with the data protection authorities in Ireland (against Facebook and Apple), Luxembourg (against Skype and Microsoft) and Germany (against Yahoo).

European authorities have expressed concerns on these issues in their own right. For example, the Vice President of the European Commission, Viviane Reding, has written to the British Foreign Secretary, William Hague, about the Tempora programme, and has directed similar concerns at the US (including in a piece in the New York Times). The European Parliament has also announced that a panel of its Committee on Civil Liberties, Justice and Home Affairs will be convened to investigate the Prism-related surveillance of EU citizens. It says the panel will report by the end of 2013.

In terms of push-back within the US, it has been reported that a bill has been introduced in Texas strengthening the requirements for warrants to be obtained before any emails (as opposed to merely unread ones) can be disclosed to state and local law enforcement agencies.

Further complaints, litigation and potential legal challenges will doubtless arise concerning Prism, Tempora and the like.

Robin Hopkins

Google and data protection: no such thing as the ‘right to be forgotten’

June 28th, 2013 by Robin Hopkins

Chris Knight has blogged recently about enforcement action against Google by European data protection authorities (but not yet the UK’s ICO). I blogged last month about a German case (BGH, VI ZR 269/12 of 14th May 2013) concerning Google’s ‘autocomplete’ function, and earlier this year about the Google Spain case (Case C‑131/12). The latter arises out of complaints made to the Spanish data protection authority by a number of Spanish citizens whose names, when Googled, generated results linking them to allegedly false, inaccurate or out-of-date information (contrary to the data protection principles) – for example, an old story reporting that a surgeon had been charged with criminal negligence, without mentioning that he had subsequently been acquitted. The Spanish authority ordered Google to remove the offending entries. Google challenged that order, arguing that it was for the authors or publishers of the relevant websites to remedy such matters. The Spanish courts referred the case to the CJEU.

Advocate General Jääskinen this week issued his opinion in this case.

The first point concerns territorial jurisdiction. Google claims that no processing of personal data relating to its search engine takes place in Spain: Google Spain acts merely as Google’s commercial representative for its advertising functions, and in that capacity it has taken responsibility only for the processing of personal data relating to its Spanish advertising customers. The Advocate General has disagreed with Google on this point. His view is that national data protection legislation applies to a search engine provider when it sets up, in a member state, an office for the promotion and sale of advertising space on the search engine which orientates its activity towards the inhabitants of that state.

The second point is substantive, and is good news for Google. The Advocate General says that Google is not generally to be considered – either in law or in fact – as a ‘data controller’ of the personal data appearing on web pages it processes. It has no control over the content included on third party web pages and cannot even distinguish between personal data and other data on those pages.

Thirdly, the Advocate General tells us that there is no such thing as the so-called “right to be forgotten” (a favourite theme of debates on the work-in-progress new Data Protection Regulation) under the current Directive. The Directive offers safeguards as to accuracy and the like, but Google had not itself said anything inaccurate here. At paragraph 108 of his opinion, the Advocate General says this:

“… I consider that the Directive does not provide for a general right to be forgotten in the sense that a data subject is entitled to restrict or terminate dissemination of personal data that he considers to be harmful or contrary to his interests. The purpose of processing and the interests served by it, when compared to those of the data subject, are the criteria to be applied when data is processed without the subject’s consent, and not the subjective preferences of the latter. A subjective preference alone does not amount to a compelling legitimate ground within the meaning of Article 14(a) of the Directive.”

It remains to be seen of course whether the Court agrees with the Advocate General. The territorial issue and the ‘data controller’ question are of great significance to Google’s business model – and to those whose businesses face similar issues. The point about objectivity rather than subjectivity being the essential yardstick for compliance with data protection standards is potentially of even wider application.

“This is a good opinion for free expression,” Bill Echikson, a spokesman for Google, said in an e-mailed statement reported by Bloomberg.

Robin Hopkins