Tentative Trilogu-ery

December 1st, 2015 by Christopher Knight

Those of you (all of you, surely?) who are keenly following the nail-biting, cheek-clenching progress of the Trilogue’s negotiations over the General Data Protection Regulation will be overjoyed to read this 370-page official EU document, dated 20 November 2015, summarising the original Commission proposal, the Parliament’s position, the approach of the Council and the “tentative” agreement reached thus far in Trilogue (or, where there is no tentative agreement, the suggestions of the Council’s Presidency).

There is limited purpose in analysing in detail all of the changes and proposals at this stage – enough ink has already been wasted on overtaken drafts – but what the tentative agreements do indicate is that a final text is getting closer. Will it beat Christmas? Who knows. Somehow, it is unlikely that Santa is keen on having to lug a new Regulation around to try and squeeze it into your stockings, but progress is progress.

Christopher Knight

The Independent Commission on FOI – Update

November 23rd, 2015 by Christopher Knight

Did we all make submissions to the Independent Commission on Freedom of Information last week? It sounds as though many of you did. Lord Burns, Chair of the Commission, has announced that they received some 30,000 responses to their consultation. Not surprisingly, reading those and thinking about them is something the Commission does not now feel it can do before Christmas. Indeed, Lord Burns has announced he will call oral evidence from some respondents on 20 and 25 January 2016, and the Commission will write their report after that. Hopefully this is a sign that the Commission wants its work to be evidence-based. We wait to see who the lucky individuals are who have been invited to the oral evidence party.

The announcement is here.

Christopher Knight

Expectations of privacy abroad

November 23rd, 2015 by Paul Greatorex

As all celebrities know, to get the High Court to stop paparazzi pictures of you from being published, the first thing you have to do is show you had a reasonable expectation of privacy. But what if you were snapped outside the jurisdiction and, whilst English law principles suggest that you did have such an expectation, the local law where the photographs were taken says you did not?

The answer given by the Court of Appeal in Weller v Associated Newspapers [2015] EWCA Civ 1176 is that the local law is not determinative and the weight to be given to it is a matter for the judge.

Readers of Panopticon may recall a similar issue arose in Douglas v Hello [2005] EWCA Civ 595 where the Court of Appeal said that the provisions of New York law, which had entitled Michael Douglas and Catherine Zeta-Jones to arrange their wedding there in private, had no direct application since the question of whether the information was private was one of English law.  However, it had also expressed the view that the reverse was not necessarily true, saying that if New York law had permitted any member of the public to be present at a hotel wedding and to take and publish photographs of that wedding, then the photographs “would have been unlikely to have satisfied the test of privacy”.

Ten years later, the decision in Weller suggests the position is not necessarily that simple.  The case concerned a claim by the children of Paul Weller for an injunction and damages for misuse of private information and/or breach of the Data Protection Act 1998, arising out of the publication by the Mail Online of unpixellated photographs of them taken on a street and in a cafe in California.

The Court of Appeal agreed with the judge below that, applying ordinary principles of English law, the children did have a reasonable expectation of privacy, and the fact (found by the judge and unchallenged on appeal) that under Californian law there was no such expectation did not mean the claim must fail. The Court of Appeal said the position under local law was not determinative and the weight to give to it had been for the judge to determine: see [67-71].

On the facts it was held that there was no error by the judge in giving it the very little weight that he did: the connection of the two youngest children (aged 10 months) with California was slight, certainly when compared with their parents’ connection with England, where the photographs were unlawfully published, and the Court of Appeal had heard “very little, if any, argument” about the impact of the fact that the eldest child was living in California at the time. Challenging matters of weight on appeal is always very difficult, although the brief reasoning at [70] suggests a particular reluctance to interfere with the decision below.

There are three other points of interest in the judgment.

The first is the summary provided by the Court of Appeal at [29-30] of the case law governing children and privacy:

  • a child does not have a separate right to privacy merely by virtue of being a child;
  • the broad approach to reasonable expectation of privacy is the same for children and adults but as there are several considerations relevant to children but not to adults, a child may in a particular case have such an expectation where an adult does not;
  • in the case of children (as in the case of adults) all the circumstances of the case should be taken into account in deciding whether there is a reasonable expectation of privacy, which should include those listed in Murray v MGN [2008] EWCA Civ 446 at [36] (the attributes of the claimant, the nature of the activity in which the claimant was engaged, the place at which it was happening, the nature and purpose of the intrusion, the absence of consent and whether it was known or could be inferred, the effect on the claimant, and the circumstances in which and purposes for which the information came into the hands of the publisher).

The second is that at [81-88] the Court of Appeal upheld the grant of an injunction restraining further publication of the photographs, even though the judge had originally found there was no evidence that this would happen, simply on the basis that the Mail subsequently refused to give an undertaking to that effect. This was said to satisfy the requirement that there be reason to apprehend further publication, and complaints about the adverse consequences for freedom of expression were dismissed, although again the terms of the judgment suggest a real reluctance to interfere with the judge’s discretion.

The third is to note that the judgment does not record any appeal against the awards of damages (£5,000 for the eldest child and £2,500 for each of the twins).  Since the claim under the Data Protection Act 1998 was said to stand or fall with the claim for misuse of private information, it remains to be seen whether these awards are used as guidance in nascent case law concerning damages in “pure” DPA claims.

Paul Greatorex


Tweet Tweet? #silencingFOIontwitter

November 17th, 2015 by Christopher Knight

Is a request for information made in a tweet a valid request within the meaning of sections 1 and 8 FOIA? Not in Ghafoor v Information Commissioner (EA/2015/0140). The FTT held that section 8(1) requires the request for information to be made using the “real name” of the person making it, and that the address for correspondence provided must be one which is “suitable for correspondence” between the requestor and the public authority about the request. In Mr Ghafoor’s case, his Twitter handle does not contain his real name (it is the well-known @FOIkid account, tweeting about all matters information rights), and the public authority should not, in the view of the FTT, be obliged to look anywhere else for it (even in the Twitter profile itself below the handle). Moreover, a 140-character tweet is not a suitable method of correspondence concerning the request. The FTT did agree that if a request has been validly made through one address, section 11 obliges the public authority to respond to that address and not insist on doing so via some other sort of address (posting a letter when the request was in an email, for example).
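
For those who like their statutory construction in executable form, the FTT’s approach can be caricatured as a two-limb validity test. The sketch below is illustrative only – the type, field names and crude handle heuristic are inventions for the purpose of the example, not anything found in FOIA or the decision itself:

```python
# Illustrative caricature of the FTT's reading of s.8(1) FOIA in Ghafoor.
# The field names and heuristics are invented; the statute says nothing
# about Twitter.
from dataclasses import dataclass

@dataclass
class FOIARequest:
    name: str         # s.8(1)(b): the name of the applicant
    address: str      # s.8(1)(b): an address for correspondence
    description: str  # s.8(1)(c): a description of the information requested

def valid_on_ghafoor_reading(req: FOIARequest) -> bool:
    # Limb 1: the FTT read "name" as the requestor's *real* name; a
    # pseudonymous handle such as @FOIkid does not count.
    real_name = bool(req.name) and not req.name.startswith("@")
    # Limb 2: the address must be "suitable for correspondence" about the
    # request; a 140-character tweet was held not to be.
    suitable_address = bool(req.address) and "twitter" not in req.address.lower()
    return real_name and suitable_address and bool(req.description)

print(valid_on_ghafoor_reading(
    FOIARequest(name="@FOIkid", address="twitter.com/FOIkid",
                description="copies of internal FOIA guidance")))  # False
```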

The case is fact-specific, and does not necessarily preclude a request being made from a Twitter account with a ‘real name’ in the handle, at least where the information can be properly responded to in tweet form. However, the emphasis on provision of the requestor’s real name – to enable the proper consideration of the use of sections 12 and 14, the FTT held – is problematic. What if an email request is made from an email address which does not clearly show a name, or shows a name but the public authority has no way of checking whether dave.smith@email.com is really a Mr David Smith or is in fact someone called David Jones? What proof of the real name is required? What if a request is made by a company which does not provide its full registered company name? The judgment might suggest public authorities can too readily answer that the request is invalid, and reading “real” into section 8(1)(b) may be a word too far. There is an argument that the FTT has switched the focus too much onto the requestor rather than the request. Whether Twitter is a suitable method of communication may also be open to argument in some requests, although it plainly would be difficult to respond properly in others and there is no legal obligation on the public authority to publish its answer and link to it (not least because that would reveal the ‘real name’ of the requestor). It will be interesting to see if the issue is re-litigated in other circumstances.

In the meantime, it appears the FTT is fighting back against the social media age. No #ff for the First-tier Tribunal.

Christopher Knight

GDPR & the media – words of warning

November 12th, 2015 by Anya Proops

Since the CJEU gave judgment in Google Spain, there has been much discussion on the conference circuit about whether the judgment rides roughshod over free speech rights. Certainly the lack of any procedural protections for the media within the right to be forgotten regime has been the subject of much heated debate. For those of you wishing to understand how Article 10 rights are likely to fare under the new General Data Protection Regulation, you would do well to start with this excellent article by Daphne Keller, Director for Intermediary Liability at Stanford Law’s Center for Internet and Society (and notably former Assistant General Counsel to Google).

As Daphne makes clear, the GDPR does not offer the media much by way of solace. Quite the contrary: what we see with the new Regulation is a continuing failure on the part of European legislators to accommodate free speech rights within the data protection regime in a structured and systematic manner. To a large extent this lack of protection for Article 10 rights is a product of the fact that historically data protection and the media have rarely crossed swords. Certainly within our own jurisdiction, it is only over the last 18 months or so that an awareness of the potentially very substantial areas of tension has begun to surface (see further not least the discussion of the Steinmetz case on this blog). However, the reality is that the European quest to place data privacy rights centre-stage, in the online world and beyond, now poses serious challenges for the media. This is something which will hopefully start to register at least with those EU regulators who will in due course be charged with applying the GDPR.

Anya Proops

Navigating the Harbours: The Commission Awakens

November 7th, 2015 by Christopher Knight

Like everyone else who operates in the field, this blog may have touched once or twice on the issues arising out of Schrems. Both Robin (here) and Tim (here) have provided some summaries of the sorts of alternatives data controllers will need to think about, and the guidance issued by the Article 29 Working Party as a result. But what, everyone has been asking, does the European Commission have to say about all this?

Happily, the heavy lids of ignorance may be lifted as the Commission has awoken. (Whether it more closely resembles the Force or a Kraken is perhaps a matter of personal preference.) It has produced a lengthy document which is actually both helpful and readily understandable. Not adding umpteen recitals probably helps. It draws together a lot of the practical issues and much of the existing guidance from the Article 29 WP already discussed into a sort of cheat-sheet to help you navigate the ongoing choppy waters. You can find and download it here.

By way of precis, it informs us that the Commission has now “intensified” discussions with the US about a new Safe Harbour agreement, and that it hopes to have an outcome in three months. That would indeed require a considerable intensification, but there is nothing like ongoing illegality to concentrate the mind.

In the meantime, the Commission reminds us that Binding Corporate Rules are an option only for internal group company data transfers (something often overlooked), summarises what the Article 29 WP have suggested needs to be included and, rather optimistically, notes that the process has been facilitated and sped up by inter-Data Protection Authority liaison. Unfortunately, the reality is that in the UK, the ICO has always warned that BCR approval can take 12 months, and many readers will have had the experience of it taking considerably longer. The ICO has a lot of balls to juggle and not many hands, and there has been a deafening silence from the multinationals who want BCRs when it comes to suggestions of paying for the resources to get them more quickly.

Outside of the BCR context, the Commission stresses its own approved contractual solution: the Standard Contractual Clauses. There are currently four approved sets: two as between controllers and two as between controller and processor. They include obligations as regards security measures; information to the data subject in case of transfer of sensitive data; notification to the data exporter of access requests by the third countries’ law enforcement authorities or of any accidental or unauthorised access; the rights of data subjects to access, rectification and erasure of their personal data; and rules on compensation for the data subject in case of damage arising from a breach by either party to the SCCs. The model clauses also require EU data subjects to have the possibility of invoking, before a DPA and/or a court of the Member State in which the data exporter is established, the rights they derive from the contractual clauses as a third-party beneficiary. What the Commission adds is to point out that Commission decisions are binding in Member States, and the SCCs are the result of Commission decisions. The presumption is, therefore, that the SCCs provide adequate protection (although they can be challenged in a court and referred to the CJEU if necessary). DPAs will want to check any boutique amendments to the SCCs for compliance.

The Commission points out that under the new Regulation the proposal is that neither SCCs nor BCRs will require further authorisation by a national authority.

The third option is, of course, the derogations in Article 26(1). The Commission goes through each, highlighting the existing guidance on them and attempting to balance making them look like workable solutions with stressing the need to construe them strictly. It may well be that many of the routine transfers businesses have relied on – for banking transfers or international travel – will be covered by the contractual derogations provided, of course, that the transfer is necessary. The Article 29 Working Party considers that there has to be a “close and substantial connection”, a “direct and objective link” between the data subject and the purposes of the contract or the pre-contractual measure as an aspect of the necessity test. The derogation cannot be applied to transfers of additional information not necessary for the purpose of the transfer, or transfers for a purpose other than the performance of the contract (for example, follow-up marketing). If consent is relied upon it must be “unambiguous”, and so cannot be implied.

What the Commission does not really discuss is the ability of controllers to carry out their own adequacy assessment and rely on that. It is theoretically possible, but inevitably it is a risky route to adopt in this new-found atmosphere of data protection litigation.

The Commission also accepts that all of its other adequacy decisions are open to challenge in courts, but does not consider any to be at immediate risk.

By way of update on global reactions, readers may be aware that the German DPAs have taken the most restrictive post-Schrems line; they have declined to approve any new BCRs or amended SCCs for the time being, although they have not said they will invalidate existing agreements. They have also taken a very restrictive line on consent. In Ireland, the remittal by the CJEU to the Irish courts has led to the start of the domestic process of investigation into adequacy, but those proceedings are still at a very early stage. The passing of the Judicial Redress Bill by the US House of Representatives is being seen as one step closer to the possibility of remedying one hole in the Safe Harbour scheme: the difficulty of EU citizens vindicating their rights in the US. Under the new Bill they could, in theory, be designated so that vindication was more plausible, but that is a long way from resolving all of the issues. There are also likely to be implications for the TTIP negotiations, although the sense is that data protection will be carved out of TTIP altogether and left to the new Regulation. However, it is also of interest that the impact has been wider than just the EU-US relationship. Israel – currently subject to an adequacy decision itself – has revoked its own decision giving prior authorisation for the transfer of data from Israel to US companies signed up to the Safe Harbor, doubtless to ensure that the EU-Israel adequacy decision is not undermined by proxy.

None of this is likely to be the last word, or post, on the subject. January 2016, by which time a solution has to have been found or the DPAs will start enforcing, seems awfully close…

Christopher Knight

Crime and Justice and Data Protection. Oh My.

October 29th, 2015 by Christopher Knight

This is not a lengthy analytical post; it is by way of quick update on the much overlooked younger sibling of the proposed General Data Protection Regulation: the Data Protection Directive for the police and criminal justice sector. Most practitioners are understandably focussing on the Regulation: that is the instrument which will affect most of us most of the time. But the EU is proposing to harmonise the rules across sectors and, at the same time, implement a new Directive applicable to the police and criminal justice sectors. The existing Directive does not, of course, apply to that arena by virtue of article 3(2) (although the DPA 1998 is unlimited in its scope, so the point has rarely been of much relevance domestically).

A couple of weeks ago the Commission proudly announced that the Directive was full steam ahead following agreement in the Council, and that the drafting was moving into trilogue (dread term). The EDPS has now issued Opinion 6/2015, which is not quite so enthusiastic and sets out a number of areas that it believes should be taken into account, not least ensuring compliance with the CJEU’s evident dislike of blanket surveillance in Schrems (Panopticon passim) and the Charter rights of data subjects. It also stresses the need for there to be ‘joined up regulation’ between the Directive and the new Regulation to ensure a coherent and consistent system of data protection across the EU and across all fields of public and private sector data handling.

The final text of both legislative measures will inevitably give rise to plenty of questions (Who you gonna call? Counsel.) but it will be important for those working in the data protection and privacy field not to overlook the Directive amid the headline-grabbing of its more high-profile sibling. We shall see what trilogue brings us.

Christopher Knight

It’s Good to TalkTalk About Increased Fines

October 27th, 2015 by Christopher Knight

As if TalkTalk don’t have enough to think about at the moment, the House of Commons yesterday discussed the sanctions available to the Information Commissioner for significant data breaches. Responding to an urgent question on the TalkTalk incident, the Minister for Culture and the Digital Economy (wasn’t that one of Gladstone’s titles once?), Ed Vaizey, made a number of interesting comments. He mentioned that he understood TalkTalk had reported the breach to the ICO on Thursday 22 October and he expressed delight that the Culture Select Committee would be inquiring into the incident. In response to an SNP question that a maximum £500,000 fine was too small to be “terrifying“, the Minister indicated that the existing monetary penalty regime was significant but that he would discuss with the ICO whether more could be done. Oddly, he did not mention the genuinely terrifyingly large maximum fine proposals under the General Data Protection Regulation (ranging from 5% to 2% of global annual turnover, depending on which draft you read), although he did later state that the Regulation negotiations were “almost at the point of being completed“. He completed the urgent question procedure by suggesting that some sort of kitemark for cyber-security was something he would look into.

Whether or not it is really worth increasing the maximum levels of the monetary penalty notice regime before the new Regulation increases them anyway is a matter for debate. Given that the ICO has only rarely imposed fines at the top of the range, there probably has not been much internal appetite for pushing it higher. But, as we all know, there is nothing like shutting the stable door after the unencrypted horse has been ridden away by a 15-year-old from County Antrim (allegedly).

Anyone wishing to read the debate (which does not contain very much by way of careful consideration of data protection law but a good deal by way of assumption that TalkTalk should be hanged, drawn and quartered) can do so on Hansard here.

Christopher Knight

Safe Harbour and the European regulators

October 26th, 2015 by Timothy Pitt-Payne QC

On 6th October 2015 the CJEU declared the Commission’s Safe Harbor Decision invalid, in Case C-362/14 Schrems.  Since then, data protection specialists have discussed little else; and Panopticon has hosted comments by Chris Knight, Anya Proops, and Robin Hopkins.

How have EU data protection regulators responded to the judgment?

The ICO’s immediate response came in a statement from Deputy Commissioner David Smith.  This struck a careful and measured tone, emphasising that the Safe Harbour is not the only basis on which transfers to the US can be made, and referring to the ICO’s earlier guidance on the range of ways in which overseas transfers can be made.

On 16th October the Article 29 Working Party issued a statement taking a rather more combative line.  Here are the main points.

  1. The question of massive and indiscriminate surveillance (i.e. in the US) was a key element of the CJEU’s analysis. On the Court’s judgment, any adequacy analysis implies a broad assessment of the third country’s domestic laws and international commitments.
  2. The Working Party urgently called on Member States and European institutions to open discussions with the US authorities to find suitable solutions. The current negotiations around a new Safe Harbour could be part of the solution.
  3. Meanwhile the Working Party would continue its analysis of how the CJEU judgment affected other transfer tools. During this period Standard Contractual Clauses and Binding Corporate Rules could still be used. If by the end of January 2016 no appropriate solution with the US had been found, the EU regulators would take “appropriate actions”.
  4. Transfers still taking place based on the Safe Harbour decision were unlawful.

There are a couple of key messages here. One is that it seems doubtful that the Article 29 Working Party would regard an adequacy assessment by a data controller as being a proper basis for transfer to the US: see point 1. A second is that there is a hint that even standard clauses and BCRs might not be regarded as a safe basis for transfer (see point 3): the answer will depend on the outcome of the Working Party’s further analysis of the implications of Schrems.

The rise of the Ubermensch

October 23rd, 2015 by Timothy Pitt-Payne QC


In May 2012, Transport for London licensed Uber London Limited as an operator of private hire vehicles in London.

Uber is controversial.  It’s a good example of how new technology can disrupt existing business models in unexpected ways.  One controversy is addressed by Ouseley J in Transport for London v Uber London Limited and others [2015] EWHC 2918 (Admin):  whether the way in which the Uber fare is calculated infringes the criminal prohibition on the use of a taximeter in a London private hire vehicle. Answer – it doesn’t.

What does any of this have to do with Panopticon?  Our usual concerns, broadly speaking, are with access to public sector information, and with information privacy (including its interaction with freedom of expression).  But these fields are fundamentally shaped by developments in the technology that is used for collecting, sharing and using information.  A wider understanding of the legal issues to which those developments can give rise is valuable, even if it takes us a little outside the usual ambit of this blog.

So:  in London there are black cabs, and there are private hire vehicles (PHVs).  PHVs are subject to three-fold licensing:  the operator, the vehicle, and the driver must all be licensed.  One of the restrictions under which PHVs operate is that it is a criminal offence for the vehicle to be equipped with a taximeter: see section 11(1) of the Private Hire Vehicles (London) Act 1998.  A taximeter is defined by section 11(3) as “a device for calculating the fare to be charged in respect of any journey by reference to the distance travelled or time elapsed since the start of the journey (or a combination of both)”.

Uber operates in London as a licensed PHV operator (though the vehicles in its network include both PHVs and black cabs).  It uses technology that – as Ouseley J points out – was not envisaged when the relevant legislation was introduced in 1998.  “As was agreed, the changes brought about by the arrival of Google, the Smartphone equipped with accurate civilian use GPS, mobile internet access and in-car navigation systems, would not have been within the contemplation of Parliament in 1998.” (Google was in fact incorporated in 1998, and what it has to do with the case is obscure, but let that pass).

In order for the Uber system to operate, both the driver and the customer must have a smartphone, and must download the Uber Driver App and Customer App respectively.  The customer makes a booking using the Customer App.  The booking is transmitted to Uber’s servers in the US, and thence to the smartphone of the driver of the nearest vehicle in London – if that driver does not accept the booking, it is sent to the next nearest vehicle.  When the driver picks up the customer, the driver presses the “begin trip” icon on the Driver App.  At the end of the journey he presses “end trip”.  Signals are then sent to Uber’s servers in the US by the driver’s smartphone, providing them with GPS data from the driver’s smartphone and time details.  One of the servers (“Server 2”) obtains information from another server about the relevant fare structure, and then calculates the fare and transmits information to the Driver App and the Customer App about the amount charged.  The customer’s credit or debit card is charged for the journey.

Does all this mean that the vehicle is equipped with a taximeter?

No, said Ouseley J, in proceedings brought by Transport for London seeking a declaration that PHVs in the Uber network are not equipped with a taximeter.

The argument before Ouseley J was that the driver’s smartphone, operating using the Driver App, was a taximeter.  But the fatal objection to this argument was that the fare was calculated by Server 2 not by the smartphone, and hence the calculation was done remotely and not in the vehicle itself.  To contravene section 11, it was not sufficient that the calculation was done using information uploaded from the smartphone, and that the calculation was then transmitted to and received on the smartphone.  Hence the smartphone was not a device falling within section 11(3). Moreover, even if the smartphone was a relevant device, the vehicle was not equipped with it; it was the driver who was equipped, and so the prohibition in section 11(1) was not infringed in any event.
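
For the avoidance of doubt about where the work is done, the flow can be sketched in a few lines of code. Everything below is invented for illustration – the judgment does not disclose Uber’s actual implementation, and the function name, fields and fare parameters are assumptions – but it captures the point on which section 11 turned:

```python
# A minimal sketch of the flow described in the judgment, with invented
# names and fare parameters.
from dataclasses import dataclass

@dataclass
class TripSignal:
    distance_miles: float    # GPS data uploaded from the driver's smartphone
    duration_minutes: float  # time between the "begin trip" and "end trip" presses

def server2_fare(signal: TripSignal, base: float = 2.00,
                 per_mile: float = 1.25, per_minute: float = 0.15) -> float:
    # The fare is computed HERE, on a remote server in the US -- not on the
    # smartphone. That was fatal to the taximeter argument: the phone in the
    # vehicle merely uploads data and displays the result sent back.
    return base + per_mile * signal.distance_miles + per_minute * signal.duration_minutes

# The Driver App's role, reduced to its essentials: send signals, show the answer.
signal = TripSignal(distance_miles=4.2, duration_minutes=18.0)
print(f"Fare received back from Server 2: ${server2_fare(signal):.2f}")
```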

Ouseley J considered the case-law about the need to adopt an updating or “always speaking” construction of legislation, to take account of technological or scientific developments: see R (Quintavalle) v Secretary of State for Health [2003] UKHL 13, [2003] 2 AC 687.  This case-law had no bearing, since section 11 was in general terms and entirely capable of being applied to modern technology; there was no need to adopt any updating construction of the section.

The Uber case is a useful reminder that controversies about the implications of developments such as big data, cloud computing, and mobile internet access are not just about privacy and data protection.  Rather, the issues are pervasive and can be expected to affect every corner of the law (and of politics, the economy, and society).

The mobile data devices that we use are constantly interacting with other devices and information storage facilities, including servers.  For the purpose of our daily lives, usually all we are interested in is specific transactions (like booking and paying for a PHV): we do not need to think about the different stages of information processing that underpin the transaction.  But for regulatory purposes, breaking down a transaction into those stages, and understanding when and how each stage takes place, can be essential.  Uber drivers and customers don’t need to think about Server 2:  but if you want to know whether Uber breaks the law, Server 2 is crucial.