Interfering with the fundamental rights of practically the entire European population

April 10th, 2014 by Robin Hopkins

In the Digital Rights Ireland case, the Grand Chamber of the CJEU has this week declared invalid the 2006 Directive which provides for the mass retention – and disclosure to policing and security authorities – of individuals’ online traffic data. It found this regime to be a disproportionate interference with privacy rights. Depending on your perspective, this is a major step forward for digital privacy, or a major step backwards in countering terrorism and serious crime. It probably introduces even more uncertainty in terms of the wider project of data protection reform at the EU level. Here is my synopsis of this week’s Grand Chamber judgment.

Digital privacy vs national security: a brief history

There is an overlapping mesh of rights under European law which aims to protect citizens’ rights with respect to their personal data – an increasingly important strand of the broader right to privacy. The Data Protection Directive (95/46/EC) was passed in 1995, when the internet was in its infancy. It provides that personal data must be processed (obtained, held, used, disclosed) fairly and lawfully, securely, for legitimate purposes and so on.

Then, as the web began to mature into a fundamental aspect of everyday life, a supplementary Directive was passed in 2002 (2002/58/EC) on privacy and electronic communications. It is about privacy, confidentiality and the free movement of electronic personal data in particular.

In the first decade of the 21st century, however, security objectives became increasingly urgent. Following the London bombings of 2005 in particular, the monitoring of would-be criminals’ web activity was felt to be vital to effective counter-terrorism and law enforcement. The digital confidentiality agenda needed to make space for a measure of state surveillance.

This is how Directive 2006/24 came to be. In a nutshell, it provides for traffic and location data (rather than content-related information) about individuals’ online activity to be retained by communications providers and made available to policing and security bodies. This data was to be held for a minimum of six months and a maximum of 24 months.

That Directive – like all others – is however subject to the EU’s Charter of Fundamental Rights. Article 7 of that Charter enshrines the right to respect for one’s private and family life, home and communications. Article 8 is about the right to the protection and fair processing of one’s personal data.

Privacy and Digital Rights Ireland prevail

Digital Rights Ireland took the view that the 2006 Directive was not compatible with those fundamental rights. It asked the Irish courts to refer the question to the CJEU. Similar references were made in separate litigation before the Austrian courts.

The CJEU gave its answer this week. In Digital Rights Ireland Ltd v Minister for Communications, Marine and Natural Resources and Others (C‑293/12) joined with Kärntner Landesregierung and Others (C‑594/12), the Grand Chamber held the 2006 Directive to be invalid on the grounds of its incompatibility with fundamental privacy rights.

The Grand Chamber accepted that, while privacy rights were interfered with, this was in pursuit of compelling social objectives (the combatting of terrorism and serious crime). The question was one of proportionality. Given that fundamental rights were being interfered with, the Courts would allow the European legislature little leeway: anxious scrutiny would be applied.

Here, in no particular order, are some of the reasons why the 2006 Directive failed its anxious scrutiny test (quotations are all from the Grand Chamber’s judgment). Unsurprisingly, this reads rather like a privacy impact assessment which data controllers are habitually called upon to conduct.

The seriousness of the privacy impact

First, consider the nature of the data which, under Articles 3 and 5 of the 2006 Directive, must be retained and made available. “Those data make it possible, in particular, to know the identity of the person with whom a subscriber or registered user has communicated and by what means, and to identify the time of the communication as well as the place from which that communication took place. They also make it possible to know the frequency of the communications of the subscriber or registered user with certain persons during a given period.”

This makes for a serious incursion into privacy: “Those data, taken as a whole, may allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained, such as the habits of everyday life, permanent or temporary places of residence, daily or other movements, the activities carried out, the social relationships of those persons and the social environments frequented by them.”

Second, consider the volume of data gathered and the number of people affected. Given the ubiquity of internet communications, the 2006 Directive “entails an interference with the fundamental rights of practically the entire European population”.

Admittedly, the 2006 regime does not undermine “the essence” of data protection rights (because it is confined to traffic data – the contents of communications are not retained), and is still subject to data security rules (see the seventh data protection principle under the UK’s DPA 1998).

Nonetheless, this is a serious interference with privacy rights. It has objective and subjective impact: “it is wide-ranging, and it must be considered to be particularly serious… the fact that data are retained and subsequently used without the subscriber or registered user being informed is likely to generate in the minds of the persons concerned the feeling that their private lives are the subject of constant surveillance.”

Such a law, said the Grand Chamber, can only be proportionate if it includes clear and precise laws governing the scope of the measures and providing minimum safeguards for individual rights. The 2006 Directive fell short of those tests.

Inadequate rules, boundaries and safeguards

The regime has no boundaries, in terms of affected individuals: it “applies even to persons for whom there is no evidence capable of suggesting that their conduct might have a link, even an indirect or remote one, with serious crime”.

It also makes no exception for “persons whose communications are subject, according to rules of national law, to the obligation of professional secrecy”.

There are no sufficiently specific limits on the circumstances in which this data can be accessed by security bodies, on the purposes to which that data can be put by those bodies, or on the persons with whom those bodies may share the data.

There are no adequate procedural safeguards: no court or administrative authority is required to sign off the transfers.

There are also no objective criteria for justifying the retention period of 6-24 months.

The Grand Chamber’s conclusion

In summary, the Grand Chamber found that “in the first place, Article 7 of Directive 2006/24 does not lay down rules which are specific and adapted to (i) the vast quantity of data whose retention is required by that directive, (ii) the sensitive nature of that data and (iii) the risk of unlawful access to that data, rules which would serve, in particular, to govern the protection and security of the data in question in a clear and strict manner in order to ensure their full integrity and confidentiality. Furthermore, a specific obligation on Member States to establish such rules has also not been laid down…”

There was also an international transfer aspect to its concern: “in the second place, it should be added that that directive does not require the data in question to be retained within the European Union…”

This last point is of course highly relevant to another of the stand-offs between digital privacy and national security which looms in UK litigation, namely the post-Snowden litigation against security bodies.

Robin Hopkins @hopkinsrobin

Steinmetz and Others v Global Witness: latest developments

April 2nd, 2014 by Robin Hopkins

Panopticon devotees will have noted that important DPA litigation is afoot between a group of businessmen (Beny Steinmetz and others) and the NGO Global Witness. The Economist has recently reported on the latest developments in the case: see here.

I particularly like the article’s subtitle: “Libel laws have become laxer. Try invoking data protection instead”. This is an observation I (and others) have made in the past: see here for example. The point appears to be gathering momentum.

Robin Hopkins @hopkinsrobin

Data protection and compensation: the “irreversible march” towards revolutionary change

March 21st, 2014 by Robin Hopkins

At 11KBW’s Information Law conference this past Tuesday, I talked a bit about the progress of the draft EU Data Protection Regulation. I omitted to mention last week’s development (my reason: I was on holiday in Venice, where data protection seemed less pressing). In a plenary session on 12 March, the European Parliament voted overwhelmingly in support of the Commission’s current draft of the Regulation. This is all explained in this Memo from the European Commission. Here are some key points.

One is the apparently “irreversible” progress towards getting the Regulation onto the EU statute books. “The position of the Parliament is now set in stone and will not change even if the composition of the Parliament changes following the European elections in May.” As a reminder, the remaining stage is for the European Council to agree to the proposal. Its ministers are meeting again in early June. So far, they have been broadly supportive.

Another point is about business size and data protection risk: SMEs will not need to notify (so where will the ICO get its funding?), nor will they need to appoint data protection officers or carry out privacy impact assessments as a default rule. “We want to make sure that obligations are not imposed except where they are necessary to protect personal data: the baker on the corner will not be subject to the same rules as a (multinational) data processing specialist.”

A third point has great consequences for international transfers: “Non-European companies, when offering services to European consumers, will have to apply the same rules and adhere to the same levels of protection of personal data. The reasoning is simple: if companies outside Europe want to take advantage of the European market with more than 500 million potential customers, then they have to play by the European rules”.

Fourth, the “right to be forgotten” is still very much on the agenda. “If an individual no longer wants his or her personal data to be processed or stored by a data controller, and if there is no legitimate reason for keeping it, the data should be removed from their system” (subject to freedom of expression). This “citizen in the driving seat” principle, like the consistency aim (the same rules applied the same way across the whole EU) and the “one-stop shop” regulatory model, has been part of the reform package from the outset.

A final point is that the Parliament wants regulators to be able to impose big fines: “It has proposed strengthening the Commission’s proposal by making sure that fines can go up to 5% of the annual worldwide turnover of a company (up from 2% in the Commission’s proposal)”. Monetary penalties will not be mandatory, but they will potentially be huge.

On this last point about money: as under the current law, a regulatory fine is one thing and the individual’s right to be compensated another. At our seminar on Tuesday, we discussed whether there would soon be a sweeping away (see for example the Vidal-Hall v Google litigation) of the long-established Johnson v MDU principle that in order to be compensated for distress under section 13 of the DPA, you need first to prove that you suffered financial loss. That may well be so for the DPA, in which case the short- and medium-term consequences for data protection litigation in the UK will be huge.

But it is important to be clear about the longer term: this is going to happen anyway, regardless of any case-law development in UK jurisprudence. Article 77 of the current draft of the Regulation begins like this: “Any person who has suffered damage, including non-pecuniary damage, as a result of an unlawful processing operation or of an action incompatible with this Regulation shall have the right to claim compensation from the controller or the processor for the damage suffered”.

If we are indeed irreversibly on track towards a new Regulation, then data protection litigation – notably, though not only about compensating data subjects – is guaranteed to be revolutionised.

Robin Hopkins @hopkinsrobin

The EU’s Data Protection Regulation: where are we?

January 20th, 2014 by Robin Hopkins

The replacement of Directive 95/46/EC – the bedrock of data protection in Europe – with a new Regulation is intended as a radical overhaul, making protections for personal data fit for the digital world. It has now been over two years since the first substantive draft of that Regulation was made public. I dimly recall Tim Pitt-Payne and I summarising it – see here.

The Regulation is yet to emerge. As a number of Panopticon readers have asked: where have we got to? Here are five points by way of summary.

1. Two members of the trinity are on board

Following seemingly interminable negotiations, the European Parliament’s civil liberties committee (LIBE) now endorses the European Commission’s position on the modified draft. This means that two of the three key bodies at the EU level appear to be of one mind. The next step is for the third body, the European Council, to be persuaded during negotiations. See this blog post by the ICO’s Deputy Commissioner, David Smith.

2. In search of the cardinal virtues – consent, consistency, proportionality

In a very illuminating summary of the major principles at issue, the ICO tells us that it welcomes the following features of the current draft: a stringent approach to consent (or, in low-risk situations, a ‘legitimate interests’ condition justifying the processing of personal data); consistency and an EU-wide ‘one-stop shop’ model; ensuring that processing conditions are proportionate to risk (by, for example, requiring data subjects to be notified ‘without delay’ rather than within 24 hours, as was originally proposed).

The ICO remains concerned, however, that the draft Regulation continues to suffer from some vices: its use of the ‘pseudonymisation’ concept muddies the distinction between personal and non-personal data; the approach to profiling is insufficiently nuanced, and the international transfer rules may be unrealistically stringent.

3. The Regulation is dead!

Peter Fleischer, Google’s global privacy counsel, considers that the stalled progress of 2013 effectively means that “the old draft is dead”. His view, however, is that this delay will provide an opportunity for a more realistic re-think: “Whatever comes next will be the most important privacy legislation in the world, setting the global standards. I’m hopeful that this pause will give lawmakers time to write a better, more modern and more balanced law.”

4. Long live the Regulation!

EU officials are, however, optimistic about the current draft being spurred on to finality in 2014. Peter Hustinx, the outgoing European Data Protection Supervisor (curiously, no successor has yet been appointed), hopes that Greece’s imminent turn in the presidency seat will provide a fresh impetus for productive negotiation. Importantly, he sees Germany (often characterised as setting very stringent standards for data protection) as being in the driving seat: “The new German government can tackle this subject with the necessary drive and energy and thereby gain acceptance of the German position at European level and lead Europe to a higher level of data protection.”

5. Are the Americans Safe?

The processing of EU citizens’ data by US-based companies sits outside the direct reach of the envisaged Regulation, as with the current Directive. Since 2000, transfers of personal data to the US have been governed by the Safe Harbour Agreement, under which approximately 3,300 companies have been certified as safe (in the sense of being EU compliant in their data protection standards).

The European Council and Parliament have, however, expressed concern about the fitness for purpose of the Safe Harbour scheme. They have observed that “Web companies such as Google, Facebook, Microsoft, Apple, Yahoo have hundreds of millions of clients in Europe and transfer personal data for processing to the US on a scale inconceivable in the year 2000 when the Safe Harbour was created”. They are also concerned about the ongoing revelations about surveillance: “divergent responses of data protection authorities to the surveillance revelations demonstrate the real risk of the fragmentation of the Safe Harbour scheme and raise questions as to the extent to which it is enforced”.

Progress by the US Department of Commerce is now sought – by March 2014 – on improving transparency, the application of EU principles and enforcement. The arrangements will be further reviewed in 2014.

Robin Hopkins @hopkinsrobin

The Google/Safari users case: a potential revolution in DPA litigation?

January 16th, 2014 by Robin Hopkins

I posted earlier on Tugendhat J’s judgment this morning in Vidal-Hall and Others v Google Inc [2014] EWHC 13 (QB). The judgment is now available here – thanks as ever to Bailii.

This is what the case is about: a group of claimants say that, by tracking and collating information relating to their internet usage on the Apple Safari browser without their consent, Google (a) misused their private information (b) breached their confidences, and (c) breached its duties under the Data Protection Act 1998 – in particular, under the first, second, sixth and seventh data protection principles. They sought damages and injunctive relief.

As regards damages, “what they claim damages for is the damage they suffered by reason of the fact that the information collected from their devices was used to generate advertisements which were displayed on their screens. These were targeted to their apparent interests (as deduced from the information collected from the devices they used). The advertisements that they saw disclosed information about themselves. This was, or might have been, disclosed also to other persons who either had viewed, or might have viewed, these same advertisements on the screen of each Claimant’s device” (paragraph 24).

It is important to note that “what each of the Claimants claims in the present case is that they have suffered acute distress and anxiety. None of them claims any financial or special damage. And none of them claims that any third party, who may have had sight of the screen of a device used by them, in fact thereby discovered information about that Claimant which was detrimental” (paragraph 25).

The Claimants needed permission to serve proceedings on the US-based Google. They got permission and served their claim forms. Google then sought to have that service nullified, by seeking an order declaring that the English court has no jurisdiction to try these particular claims (i.e. it was not saying that it could never be sued in the English courts).

Tugendhat J disagreed – as things stand, the claims will now progress before the High Court (although Google says it intends to appeal).

Today’s judgment focused in part on construction of the CPR rules about service outside of this jurisdiction. I wanted to highlight some of the other points.

One of the issues was whether the breach of confidence and misuse of private information claims were “torts”. Tugendhat J said this of the approach: “Judges commonly adopt one or both of two approaches to resolving issues as to the meaning of a legal term, in this case the word “tort”. One approach is to look back to the history or evolution of the disputed term. The other is to look forward to the legislative purpose of the rule in which the disputed word appears”. Having looked to the history, he observed that “history does not determine identity. The fact that dogs evolved from wolves does not mean that dogs are wolves”.

The outcome (paragraphs 68-71): misuse of private information is a tort (and the oft-cited proposition that “the tort of invasion of privacy is unknown in English law” needs revisiting) but breach of confidence is not (given Kitetechnology BV v Unicor GmbH Plastmaschinen [1995] FSR 765).

Google also objected to the DPA claims being heard. This was partly because they were raised late; this objection was dismissed.

Google also said that, based on Johnson v MDU [2007] EWCA Civ 262; (2007) 96 BMLR 99, financial loss was required before damages under section 13 of the DPA could be awarded. Here, the Claimants alleged no financial loss. The Claimants argued against the Johnson proposition: they relied on Copland v UK 62617/00 [2007] ECHR 253, argued for a construction of the DPA that accords with Directive 95/46/EC as regards relief, and argued that – unlike in Johnson – this was a case in which their Article 8 ECHR rights were engaged. Tugendhat J has allowed this to proceed to trial, where it will be determined: “This is a controversial question of law in a developing area, and it is desirable that the facts should be found”.

If the Johnson approach is overturned – i.e. if the requirement for financial loss is dispensed with, at least for some types of DPA claim – then this could revolutionise data protection litigation in the UK. Claims under section 13 could be brought without claimants having suffered financially due to the alleged DPA breaches they have suffered.

Tugendhat J went on to find that there were sufficiently serious issues to be tried here so as to justify service out of the jurisdiction – it could not be said that they were “not worth the candle”.

Further, there was an arguable case that the underlying information was, contrary to Google’s case, “private” and that it constituted “personal data” for DPA purposes (Google say the ‘identification’ limb of that definition is not met here).

Tugendhat J was also satisfied that this jurisdiction was “clearly the appropriate one” (paragraph 134). He accepted the argument of Hugh Tomlinson QC (for the Claimants) that “in the world in which Google Inc operates, the location of documents is likely to be insignificant, since they are likely to be in electronic form, accessible from anywhere in the world”.

Subject to an appeal from Google, the claims will proceed in the UK. Allegations about Google’s conduct in other countries are unlikely to feature. Tugendhat J indicated a focus on what Google has done in the UK, to these individuals: “I think it very unlikely that a court would permit the Claimants in this case to adduce evidence of what Mr Tench refers to as alleged wrongdoing by Google Inc against other individuals, in particular given that it occurred in other parts of the world, governed by laws other than the law of England” (paragraph 47).

Robin Hopkins @hopkinsrobin

High Court to hear Safari users’ privacy claim against Google

January 16th, 2014 by Robin Hopkins

Panopticon has from time to time reported on Google’s jurisdictional argument when faced with privacy/data protection actions in European countries: it tends to argue that such claims should be dismissed and must be brought in California instead. This argument is not always successful.

The same jurisdictional argument was advanced before Mr Justice Tugendhat in response to a claim brought by a group calling itself ‘Safari Users Against Google’s Secret Tracking’ who, as their name suggests, complain that Google unlawfully gathers data from Safari browser usage.

This morning, Mr Justice Tugendhat dismissed that jurisdictional argument. The case can be heard in the UK. Matthew Sparkes reports in the Daily Telegraph that the judge said “I am satisfied that there is a serious issue to be tried in each of the claimant’s claims for misuse of private information” and that “the claimants have clearly established that this jurisdiction is the appropriate one in which to try each of the above claims”.

The same article says that Google will appeal. This follows Google’s announcement yesterday that it will appeal a substantial fine issued by the French data protection authority for unlawful processing (gathering and storing) of user data.

Panopticon will continue to gather data on these and other Google-related matters.

Robin Hopkins @hopkinsrobin

Legal analysis of individual’s situation is not their personal data, says Advocate General

December 18th, 2013 by Robin Hopkins

YS, M and S were three people who applied for lawful residence in the Netherlands. The latter two had their applications granted, but YS’ was refused. All three wanted to see a minute drafted by an official of the relevant authority in the Netherlands containing internal legal analysis on whether to grant them residence status. They made subject access requests under Dutch data protection law, the relevant provisions of which implement Article 12 of Directive 95/46/EC. They were given some of the contents of the minutes, but the legal analysis was withheld. This was challenged before the Dutch courts. Questions were referred to the CJEU on the application of data protection law to such information. In Joined Cases C‑141/12 and C‑372/12, Advocate General Sharpston has given her opinion, which the CJEU will consider before giving its judgment next year. Here are some important points from the AG’s opinion.

The definition of personal data

The minutes in question contained inter alia: the name, date of birth, nationality, sex, ethnicity, religion and language of the applicant; information about the procedural history; information about declarations made by the applicant and documents submitted; the applicable legal provisions and an assessment of the relevant information in the light of the applicable law.

Apart from the latter – the legal advice – the AG’s view is that this information does come within the meaning of personal data under the Directive. She said this:

“44. In general, ‘personal data’ is a broad concept. The Court has held that the term covers, for example, ‘the name of a person in conjunction with his telephone coordinates or information about his working conditions or hobbies’, his address, his daily work periods, rest periods and corresponding breaks and intervals, monies paid by certain bodies and the recipients, amounts of earned or unearned incomes and assets of natural persons.

45. The actual content of that information appears to be of no consequence as long as it relates to an identified or identifiable natural person. It can be understood to relate to any facts regarding that person’s private life and possibly, where relevant, his professional life (which might involve a more public aspect of that private life). It may be available in written form or be contained in, for example, a sound or image.”

The suggestion in the final paragraph is that the information need not have a substantial bearing on the individual’s privacy in order to constitute their personal data.

The AG also observed that “Directive 95/46 does not establish a right of access to any or every document or file in which personal data are listed or used” (paragraph 71). This resonates with the UK’s long-established Durant ‘notions of assistance’.

Legal analysis is not personal data

AG Sharpston’s view, however, was that the legal analysis of the individuals’ situations did not constitute their personal data. Her reasoning – complete with illustrative examples – is as follows:

“55. I am not convinced that the phrase ‘any information relating to an identified or identifiable natural person’ in Directive 95/46 should be read so widely as to cover all of the communicable content in which factual elements relating to a data subject are embedded.

56. In my opinion, only information relating to facts about an individual can be personal data. Except for the fact that it exists, a legal analysis is not such a fact. Thus, for example, a person’s address is personal data but an analysis of his domicile for legal purposes is not.

57. In that context, I do not find it helpful to distinguish between ‘objective’ facts and ‘subjective’ analysis. Facts can be expressed in different forms, some of which will result from assessing whatever is identifiable. For example, a person’s weight might be expressed objectively in kilos or in subjective terms such as ‘underweight’ or ‘obese’. Thus, I do not exclude the possibility that assessments and opinions may sometimes fall to be classified as data.

58. However, the steps of reasoning by which the conclusion is reached that a person is ‘underweight’ or ‘obese’ are not facts, any more than legal analysis is.”

Interestingly, her conclusion did touch upon the underlying connection between personal data and privacy. At paragraph 60, she observed that “… legal analysis as such does not fall within the sphere of an individual’s right to privacy. There is therefore no reason to assume that that individual is himself uniquely qualified to verify and rectify it and ask that it be erased or blocked. Rather, it is for an independent judicial authority to review the decision for which that legal analysis was prepared.”

In any event, legal analysis does not amount to “processing” for data protection purposes

The AG considered that legal analysis such as this was neither ‘automatic’ nor part of a ‘relevant filing system’. “Rather, it is a process controlled entirely by individual human intervention through which personal data (in so far as they are relevant to the legal analysis) are assessed, classified in legal terms and subjected to the application of the law, and by which a decision is taken on a question of law. Furthermore, that process is neither automatic nor directed at filing data” (paragraph 63).

Entitlement to data, but not in a set form

The AG also says that what matters is that individuals are provided with their data – data controllers are not, under the Directive, required to provide it in any particular form. For example, they can extract or transcribe rather than photocopy the relevant minute:

“74. Directive 95/46 does not require personal data covered by the right of access to be made available in the material form in which they exist or were initially recorded. In that regard, I consider that a Member State has a considerable margin of discretion to determine, based on the individual circumstances of the case, the form in which to make personal data accessible.

75. In making that assessment, a Member State should take account of, in particular: (i) the material form(s) in which that information exists and can be made available to the data subject, (ii) the type of personal data and (iii) the objectives of the right of access.”

If the legal analysis is personal data, then the exemptions do not apply

Under the Directive, Article 12 provides the subject access right. Article 13 provides exemptions. The AG’s view was that if, contrary to her opinion, the legal analysis is found to be personal data, then exemptions from the duty to communicate that data would not be available. Of particular interest was her view concerning the exemption under Article 13(1)(g) for the “protection of the data subject or of the rights and freedoms of others”. Her view is that (paragraph 84):

“the protection of rights and freedoms of others (that is, other than the data subject) cannot be read as including rights and freedoms of the authority processing personal data. If a legal analysis is to be categorised as personal data, that must be because it is related to the private interests of an identified or identifiable person. Whilst the public interest in protecting internal advice in order to safeguard the administration’s ability to exercise its functions may indeed compete with the public interest in transparency, access to such advice cannot be restricted on the basis of the first of those two interests, because access covers only what falls within the private interest.”

If the Court agrees with the AG’s view, the case will be an important addition to case law offering guidance on the limits of personal data. It would also appear to limit, at least as regards the exemption outlined above, the data controller’s ability to rely on its own interests or on public interests to refuse subject access requests. That said, there is of course the exemption under Article 9 of the Directive for freedom of expression.

Robin Hopkins @hopkinsrobin

Facebook fan pages: data protection buck stops with Facebook, not page owners

October 22nd, 2013 by Robin Hopkins

In Re Facebook, VG, Nos. 8 A 37/12, 8 A 14/12, 8 A 218/11, 10/9/13 the Schleswig-Holstein Administrative Court has allowed Facebook’s appeals against rulings of the regional data protection authority (the ULD), headed by Thilo Weichert.

The case involved a number of companies’ use of Facebook fan pages. The ULD’s view was that Facebook breached German privacy law, including through its use of cookies, facial recognition and other data processing. It considered that, by using Facebook fan pages, the companies were facilitating Facebook’s violations by processing users’ personal data on those pages. It ordered them to shut down the fan pages or face fines of up to €50,000.

The appellant companies argued that they could not be held responsible for data protection violations (if any) allegedly committed by Facebook, as they had no control over how that data on the pages was processed and used by the social networking site. The Administrative Court agreed.

The case raises interesting questions about where the buck stops in terms of data processing – both in terms of who controls the processing, and in terms of where they are based. Facebook is based in Ireland, without a substantive operational presence in Germany. Earlier this year, the Administrative Court found – again against the Schleswig-Holstein ULD’s ruling – that Facebook’s ‘real names’ policy (i.e. a ban on pseudonymised profiles) was a matter for Irish rather than German law.

The ULD is unlikely to be impressed by the latest judgment, given that Weichert is reported as having said in 2011 that:

“We see a much bigger privacy issue behind the Facebook case: the main business model of Google, Apple, Amazon and others is based on privacy law infringements. This is the reason why Facebook and all the other global internet players are so reluctant in complying with privacy law: they would lose their main profit resource.”

For more on this story, see links here and here.

Robin Hopkins

Fingerprints requirement for passport does not infringe data protection rights

October 22nd, 2013 by Robin Hopkins

Mr Schwarz applied to his regional authority, the city of Bochum, for a passport. He was required to submit a photograph and fingerprints. He did not like the fingerprint part. He considered it unduly invasive. He refused. So Bochum refused to give him a passport. He asked the court to order it to give him one. The court referred questions to the Court of Justice of the European Union about whether the requirement to submit fingerprints in addition to photographs complied with the Data Protection Directive 95/46/EC.

Last week, the Fourth Chamber of the CJEU gave its judgment: the requirement is data protection-compliant.

The requirement had a legal basis, namely Article 1(2) of Council Regulation 2252/2004, which set down minimum security standards for identity-confirmation purposes in passports.

This pursued a legitimate aim, namely preventing illegal entry into the EU.

Moreover, while the requirements entailed the processing of personal data and an interference with privacy rights, the ‘minimum security standards’ rules continued to “respect the essence” of the individual’s right to privacy.

The fingerprint requirement was proportionate because while the underlying technology is not 100% successful in fraud-detection terms, it works well enough. The only real alternative as an identity-verifier is an iris scan, which is no less intrusive and is technologically less robust. The taking of fingerprints is not very intrusive or intimate – it is comparable to having a photograph taken for official purposes, which people don’t tend to complain about when it comes to passports.

Importantly, the underlying Regulation provided that the fingerprints could only be used for identity-verification purposes and that there would be no central database of fingerprints (instead, each set is stored only in the passport).

This is all common-sense stuff in terms of data protection compliance. Data controllers, take heart!

Robin Hopkins

Refusal to destroy part of a ‘life story’ justified under Article 8(2) ECHR

October 4th, 2013 by Robin Hopkins

The High Court of Justice (Northern Ireland) has today given judgment In the matter of JR60’s application for judicial review [2013] NIQB 93. The applicant sought to challenge the right of the two Social Care Trusts to keep and use various records generated when she was a resident of children’s homes and a training school between 1978 and 1991.

In most cases of challenges to the retention of records, the applicant seeks to expunge information which suggests they have done wrong. This application is interesting because it focused (though not exclusively) on what the applicant had suffered, as opposed to what she had done. In short, she wished to erase from the record a part of her life story which was painful for her to recall. The application failed: there were weightier reasons for retaining those records, and in any event whatever her current wish to forget matters of such import, she might come to change her mind.

The applicant was described as having had a very difficult childhood, to which those records relate. It was not known who her father was. She had grown up to achieve impressive qualifications. Horner J described her as having “survived the most adverse conditions imaginable and triumphed through the force of her will. By any objective measurement she is a success”.

She wished to move on, and to have the records about her childhood expunged. The Trusts refused; their policy was to retain such information for a 75-year period. The applicant challenged this refusal on Article 8 ECHR grounds. Horner J readily agreed that the retention of such information interfered with her rights under Article 8, but dismissed her application on the grounds that the interference was justified.

The applicant had argued that (i) she did not intend to make any claim for ill-treatment or abuse while she was in care, (ii) she did not want to retrieve information about her life story, (iii) she did not want the records to be used to carry out checks on her, as persons who were not in care would not be burdened by such records in respect of their early lives, and (iv) she did not want others, including her own child, to be able to access these records.

In response to the applicant’s assertion that she did not want and did not envisage wanting access to her records, Horner J said this at paragraph 19:

“Even if the applicant does not want to know at present what is in her records, it does not follow that she may not want to find out in the future what they contain for all sorts of reasons. She may, following the birth of a grandchild, be interested in her personal history for that grandchild’s sake. She may want to find out about her genetic inheritance because she may discover, for example, that she, or her off-spring, is genetically predisposed to a certain illness whether mental or physical. She may want to know whether or not this has been passed down through her mother’s side or her father’s side. There may be other reasons about which it is unnecessary to speculate that will make her want to seek out her lost siblings. There are any number of reasons why she may change her mind in the future about accessing her care records. Of course, if the records are destroyed then the opportunity to consider them is lost forever.”

The Trusts argued that they needed to retain such records for the purposes of their own accountability, any background checks on the applicant or related individuals which may become necessary, for the purposes of (hypothetical) public interest issues such as inquiries, and for responding to subject access requests under the Data Protection Act 1998. Horner J observed that the “right for an individual to be able to establish details of his or her identity applies not just to the Looked After Child but also, inter alia, to that child’s offspring”.

In the circumstances, the application failed; the Trusts’ interference with the applicant’s Article 8 rights was justified.

Horner J added a short concluding observation about the DPA (paragraph 29):

“It is significant that no challenge has been made to the Trust’s storage of personal information of the applicant on the basis that such storage constitutes a breach of the Data Protection Act 1998. This act strengthens the safeguards under the 1984 Act which it replaced. The Act protects “personal data which is data relating to a living individual who can be identified from data whether taken alone or read with other information which is the possession (or is likely to come into possession) of the data controller”: see 12-63 of Clayton and Tomlinson on The Law of Human Rights (2nd Edition). It will be noted that “personal” has been interpreted as almost meaning the same as “private”: see Durant v Financial Services Authority [2004] FSR 28 at paragraph [4].”

Robin Hopkins