Facebook, child protection and outsourced monitoring

July 22nd, 2015 by Robin Hopkins

Facebook is no stranger to complaints about the content of posts. Usually, one user complains to Facebook about what other users’ posts say about him. By making the offending posts available, Facebook is processing the complainant’s personal data, and must do so in compliance with data protection law.

More unusually, a user could also complain about their own Facebook posts. Surely a complainant cannot make data protection criticisms about information they deliberately posted about themselves? After all, Facebook processes those posts with the author’s consent, doesn’t it?

Generally, yes – but that will not necessarily be true in every instance, especially when it comes to Facebook posts by children. This is the nature of the complaint in striking litigation currently afoot before the High Court in Northern Ireland.

The case is HL v Facebook Inc, Facebook Ireland Ltd, the Northern Health & Social Care Trust and DCMS [2015] NIQB 61. It is currently only in its preliminary stages, but it raises very interesting and important issues about Facebook’s procedures for preventing underage users from utilising the social network. Those issues are illuminated in the recent judgment of Stephens J, who is no stranger to claims against Facebook – he heard the recent case of CG v Facebook [2015] NIQB 11, concerning posts about a convicted paedophile.

From the age of 11 onwards, HL maintained a Facebook page on which she made posts of an inappropriate sexual nature. She was exposed to responses from sexual predators. She says that Facebook is liable for its failure to prevent her from making these posts. She alleges that Facebook (i) unlawfully processed her sensitive personal data, (ii) facilitated her harassment by others, and (iii) was negligent in failing to have proper systems in place to minimise the risks of children setting up Facebook accounts by lying about their age.

The data protection claim raises a number of issues of great importance to the business of Facebook and others with comparable business models. One is the extent to which a child can validly consent to the processing of their personal data – especially sensitive personal data. Minors are (legitimately or not) increasingly active online, and consent is a cornerstone of online business. The consent issue is one of wide application beyond the HL litigation.

A second issue is whether, in its processing of personal data, Facebook does enough to stop minors using their own personal data in ways which could harm them. In her claim, for example, HL refers to evidence given to a committee of the Australian Parliament – apparently by a senior privacy advisor to Facebook (though Facebook was unable to tell Stephens J who he was) – to the effect that Facebook removes 20,000 underage user profiles a day.

Stephens J was also referred to comments apparently made by a US Senator to Mark Zuckerberg about the vulnerability of underage Facebook users.

Another element of HL’s case concerns Facebook’s use of an outsourcing company called oDesk, which operates from locations including Morocco, to moderate complaints about Facebook posts. She calls into question the adequacy of these oversight measures: ‘where then is the oversight body for these underpaid global police?’ (to quote from a Telegraph article referred to in the recent HL judgment). Facebook says that – given its number of users posting in multiple languages across the globe – effective policing is a tall order (an argument Stephens J summed up at paragraph 22 as ‘the needle in a haystack argument, there is just too much to monitor, the task of dealing with underage users is impossible’).

In short, HL says that Facebook seems to be aware of the scale and seriousness of the problem of underage use of its network and has not done enough to tackle that problem.

Again, the issue is one of wider import for online multinationals for whom personal data is stock-in-trade.

The same goes for the third important data protection issue surfacing in the HL litigation. This concerns jurisdiction, cross-border data controllers and section 5 of the Data Protection Act 1998. For example, is Facebook Ireland established in the UK by having an office, branch or agency, and does it process the personal data in Facebook posts in the context of that establishment?

These issues are all still to be decided. Stephens J’s recent judgment in HL was not about the substantive issues, but about HL’s applications for specific discovery and interrogatories. He granted those applications. In addition to details of HL’s Facebook account usage, he ordered the Facebook defendants to disclose agreements between them and Facebook (UK) Ltd and between them and oDesk (to which some moderation processes were outsourced). He also ordered the Facebook defendants to answer interrogatory questions about their procedures for preventing underage Facebook use.

In short, the HL litigation has – thus far – raised difficult data protection and privacy issues which are fundamental to Facebook’s business, and it has required Facebook to lay bare internal details of its safeguarding practices. The case is only just beginning. The substantive hearing, which is listed for next term, could be groundbreaking.

Robin Hopkins @hopkinsrobin

DRIPA 2014 declared unlawful

July 17th, 2015 by Robin Hopkins

In a judgment of the Divisional Court handed down this morning, Bean LJ and Collins J have declared section 1 of the Data Retention and Investigatory Powers Act 2014 (DRIPA) to be unlawful.

For the background to that legislation, see our posts on Digital Rights Ireland and then on the UK’s response, i.e. passing DRIPA in an attempt to preserve data retention powers.

That attempt has today suffered a serious setback via the successful challenges brought by the MPs David Davis and Tom Watson, as well as Messrs Brice and Lewis. The Divisional Court did, however, suspend the effect of its order until after 31 March 2016, so as to give Parliament time to consider how to put things right.

Analysis to follow in due course, but for now, here is the judgment: Davis Watson Judgment.

Robin Hopkins @hopkinsrobin

Google and the ordinary person’s right to be forgotten

July 15th, 2015 by Robin Hopkins

The Guardian has reported today on data emerging from Google about how it has implemented the Google Spain ‘right to be forgotten’ principle over the past year or so: see this very interesting article by Julia Powles.

While the data is rough-and-ready, it appears to indicate that the vast majority of RTBF requests actioned by Google have concerned ‘ordinary people’. By that I mean people who are neither famous nor infamous, and who seek not to have high-public-interest stories erased from history, but to have low-public-interest personal information removed from the fingertips of anyone who cares to Google their name. Okay, that explanation is itself rough-and-ready, but you get the point: most RTBF requests come not from Max Mosley types, but from Mario Costeja González types (he being the man who brought the Google Spain complaint in the first place).

As Julia Powles points out, today’s rough-and-ready data is thus far the best we have to go on in terms of understanding how the RTBF is actually working in practice. There is very little transparency on this. Blame for that opaqueness cannot fairly be levelled only at Google and its ilk – though, as the Powles article argues, they may have a vested interest in maintaining it. Opaqueness was inevitable following a judgment like Google Spain, and European regulators have, perhaps forgivably, not yet produced detailed guidance at a European level on how the public can expect such requests to be dealt with. In the UK, the ICO has given guidance (see here) and initiated a complaints process (see here).

Today’s data suggests to me that a further reason for this opaqueness is the ‘ordinary person’ factor: the Max Mosleys of the world tend to litigate (and then settle) when they are dissatisfied, but the ordinary person tends not to (Mr González being an exception). We remain largely in the dark about how this web-shaping issue works.

So: the ordinary person is most in need of transparent RTBF rules, but least equipped to fight for them.

How might that be resolved? Options seem to me to include some combination of (a) clear regulatory guidance, tested in the courts, (b) litigation by a Max Mosley-type figure which runs its course, (c) litigation by more Mr González figures (i.e. ordinary individuals), (d) litigation by groups of ordinary people (as in Vidal Hall, for example) – or perhaps (e) litigation by members of the media who object to their stories disappearing from Google searches.

The RTBF is still in its infancy. Google may be its own judge for now, but one imagines not for long.

Robin Hopkins @hopkinsrobin

Austria will not host Europe vs Facebook showdown

July 6th, 2015 by Robin Hopkins

As illustrated by Anya Proops’ recent post on a Hungarian case currently before the CJEU, the territorial jurisdiction of European data protection law can raise difficult questions.

Such questions have bitten hard in the Europe vs Facebook litigation. Max Schrems, an Austrian law graduate, is spearheading a massive class action in which some 25,000 Facebook users allege numerous data protection violations by the social media giant. Those include: unlawful obtaining of personal data (including via plug-ins and “like” buttons); invalid consent to Facebook’s processing of users’ personal data; use of personal data for impermissible purposes, including the unlawful analysing of data/profiling of users (“the Defendant analyses the data available on every user and tries to explore users’ interests, preferences and circumstances…”); unlawful sharing of personal data with third parties and third-party applications. The details of the claim are here.

Importantly, however, the claim is against Facebook Ireland Ltd, a subsidiary of the Californian-based Facebook Inc. The class action has been brought in Austria.

Facebook challenged the Austrian court’s jurisdiction. Last week, it received a judgment in its favour from the Viennese Regional Civil Court. The Court held that it lacked jurisdiction partly because Mr Schrems was not deemed to be a ‘consumer’ of Facebook’s services, and partly because Austria was not the right place to bring the claim. Facebook argued that the claim should be brought either in Ireland or in California, and the Court agreed.

Mr Schrems has announced his intention to appeal. In the meantime, the Austrian decision will continue to raise both eyebrows and questions, particularly given that a number of other judgments in recent years have seen European courts accepting jurisdiction to hear claims against social media companies (such as Google: see Vidal-Hall, for example) based elsewhere.

The Austrian decision also highlights the difficulties of the ‘one-stop shop’ principle which remains part of the draft Data Protection Regulation (albeit in a more nuanced and complicated formulation than had earlier been proposed). In short, why should an Austrian user have to sue in Ireland?

Panopticon will report on any developments in this case in due course. It will also report on the other strand of Mr Schrems’ privacy campaign, namely his challenge to the lawfulness of the Safe Harbour regime for the transferring of personal data to the USA. That challenge has been heard by the CJEU, and the Advocate General’s opinion is imminent. The case will have major implications for those whose business involves transatlantic data transfers.

Robin Hopkins @hopkinsrobin

Google Spain, freedom of expression and security: the Dutch fight back

March 13th, 2015 by Robin Hopkins

The Dutch fighting back against the Spanish, battling to cast off the control exerted by Spanish decisions over Dutch ideologies and value judgments. I refer of course to the Eighty Years’ War (1568-1648), which in my view is a sadly neglected topic on Panopticon.

The reference could also be applied, without too much of a stretch, to data protection and privacy rights in 2015.

The relevant Spanish decision in this instance is of course Google Spain, which entrenched what has come to be called the ‘right to be forgotten’. The CJEU’s judgment on the facts of that case saw privacy rights trump most other interests. The judgment has come in for criticism from advocates of free expression.

The fight-back by free expression (and Google) has found the Netherlands to be its most fruitful battleground. In 2014, a convicted criminal’s legal battle to have certain links about his past ‘forgotten’ (in the Google Spain sense) failed.

This week, a similar challenge was also dismissed. This time, a KPMG partner sought the removal of links to stories about him allegedly having to live in a container on his own estate (because a disgruntled builder, unhappy over allegedly unpaid fees, changed the locks on the house!).

In a judgment concerned with preliminary relief, the Court of Amsterdam rejected his application, finding in Google’s favour. There is an excellent summary on the Dutch website Media Report here.

The Court found that the news stories to which the complained-of Google links related remained relevant in light of ongoing public debate about the story.

Importantly, the Court said of Google Spain that the right to be forgotten “is not meant to remove articles which may be unpleasant, but not unlawful, from the eyes of the public via the detour of a request for removal to the operator of a search machine.”

The Court gave very substantial weight to the importance of freedom of expression, something which Google Spain’s critics say was seriously underestimated in the latter judgment. If this judgment is anything to go by, there is plenty of scope for lawyers and parties to help Courts properly to balance privacy and free expression.

Privacy rights wrestle not only against freedom of expression, but also against national security and policing concerns.

In The Hague, privacy has recently grabbed the upper hand over security concerns. The District Court of The Hague has this week found that Dutch law on the retention of telecommunications data should be struck down due to its incompatibility with privacy and data protection rights. This is the latest in a line of cases challenging such data retention laws, the most notable of which was the ECJ’s judgment in Digital Rights Ireland, on which see my post here. For a report on this week’s Dutch judgment, see this article by Maarten van Tartwijk in The Wall Street Journal.

As that article suggests, the case illustrates the ongoing tension between security and privacy. In the UK, security initially held sway as regards the retention of telecoms data: see the DRIP Regulations 2014 (and Panopticon passim). That side of the argument has gathered some momentum of late, in light of (for example) the Paris massacres and revelations about ‘Jihadi John’.

Just this week, however, the adequacy of UK law on security agencies has been called into question: see the Intelligence and Security Committee’s report entitled “Privacy and Security: a modern and transparent legal framework”. There are also ongoing challenges in the Investigatory Powers Tribunal – for example this one concerning Abdul Hakim Belhaj.

So, vital ideological debates continue to rage. Perhaps we really should be writing more about 17th century history on this blog.

Robin Hopkins @hopkinsrobin

Googling Orgies – Thrashing out the Liability of Search Engines

January 30th, 2015 by Christopher Knight

Back in 2008, the late lamented News of the World published an article under the headline “F1 boss has sick Nazi orgy with 5 hookers”. It had obtained footage of an orgy involving Max Mosley and five ladies of dubious virtue, all of whom were undoubtedly (despite the News of the World having blocked out their faces) not Mrs Mosley. The breach of privacy proceedings before Eady J (Mosley v News Group Newspapers Ltd [2008] EWHC 687 (QB)) established that the ‘Nazi’ allegation was unfounded and unfair, that the footage was filmed by a camera secreted in “such clothing as [one of the prostitutes] was wearing” (at [5]), and also the more genteel fact that even S&M ‘prison-themed’ orgies stop for a tea break (at [4]), rather like a pleasant afternoon’s cricket, but with a rather different thwack of willow on leather.

Since that time, Mr Mosley’s desire to protect his privacy and allow the public to forget his penchant for themed tea breaks has led him to bring or fund ever more litigation, whilst simultaneously managing to remind as many people as possible of the original incident. His latest trip to the High Court concerns the inevitable fact of the internet age that the photographs and footage obtained and published by the News of the World remain readily available for those in possession of a keyboard and a strong enough constitution. They may not be as popular as last year’s iCloud hacks, but they can be found.

Alighting upon the ruling of the CJEU in Google Spain that a search engine is a data controller for the purposes of the Data Protection Directive (95/46/EC) (on which see the analysis here), Mr Mosley claimed that Google was obliged, under section 10 of the Data Protection Act 1998, to prevent processing of his personal data where he served a notice requesting it to do so – in particular, by blocking access to the images and footage which constitute his personal data. He also alleged misuse of private information. Google denied both claims and sought to strike them out. The misuse of private information claim being (or soon to be) withdrawn, Mitting J declined to strike out the DPA claim: Mosley v Google Inc [2015] EWHC 59 (QB). He has, however, stayed the claim for damages under section 13 pending the Court of Appeal’s decision in Vidal-Hall v Google (on which see the analysis here).

Google ran a cunning defence to what, post-Google Spain, might be said to be a strong claim on the part of a data subject. It relied on Directive 2000/31/EC, the E-Commerce Directive. Article 13 protects internet service providers from liability for the cached storage of information, providing they do not modify the information. Mitting J was content to find that by storing the images as thumbnails, Google was not thereby modifying the information in any relevant sense: at [41]. Article 15 of the E-Commerce Directive also prohibits the imposition of a general obligation on internet service providers to monitor the information they transmit or store.

The problem for Mitting J was how to resolve the interaction between the E-Commerce Directive and the Data Protection Directive: the latter gives a data subject rights which apparently extend to cached information held by internet service providers, while the former apparently absolves those providers of legal responsibility for that information. It was pointed out that recital (14) and article 1.5(b) of the E-Commerce Directive appeared to make that instrument subject to the Data Protection Directive. It was also noted that Google’s argument did not sit very comfortably with the judgment (or at least the effect of the judgment) of the CJEU in Google Spain.

Mitting J indicated that there were only two possible answers: either the Data Protection Directive formed a comprehensive code, or the two must be read in harmony and given full effect to: at [45]. His “provisional preference is for the second one”: at [46]. Unfortunately, the judgment does not then go on to consider why that is so, or more importantly, how both Directives can be read in harmony and given full effect to. Of course, on a strike out application provisional views are inevitable, but it leaves rather a lot of legal work for the trial judge, and one might think that it would be difficult to resolve the interaction without a reference to the CJEU. What, for example, is the point of absolving Google of liability for cached information if that does not apply to any personal data claims, which will be a good way of re-framing libel/privacy claims to get around Article 13?

The Court also doubted that Google would really have to engage in active monitoring, contrary to Article 15, because it might be able to block the material without “disproportionate effort or expense”: at [54]. That too was something for the trial judge to consider.

So, while the judgment of Mitting J is an interesting interlude in the ongoing Mosley litigation saga, the final word certainly awaits a full trial (and/or any appeal by Google), and possibly a reference. All the judgment decides is that Mr Mosley’s claim is not so hopeless it should not go to trial. Headlines reading ‘Google Takes a Beating (with a break for tea)’ would be premature. But the indications given by Mitting J are not favourable to Google, and it may well be that the footage of Mr Mosley will not be long for the internet.

Christopher Knight

Data protection: three developments to watch

January 15th, 2015 by Robin Hopkins

Panopticon likes data protection, and it likes to keep its eye on things. Here are three key developments in the evolution of data protection law which, in Panopticon’s eyes, are particularly worth watching.

The right to be forgotten: battle lines drawn

First, the major data protection development of 2014 was the CJEU’s ‘right to be forgotten’ judgment in the Google Spain case. Late last year, we received detailed guidance from the EU’s authoritative Article 29 Working Party on how that judgment should be implemented: see here.

In the view of many commentators, the Google Spain judgment was imbalanced. It gave privacy rights (in their data protection guise) undue dominance over other rights, such as rights to freedom of expression. It was clear, however, that not all requests to be ‘forgotten’ would be complied with (as envisaged by the IC, Chris Graham, in an interview last summer) and that complaints would ensue.

Step up Max Mosley. The BBC reported yesterday that he has commenced High Court litigation against Google. He wants certain infamous photographs from his past to be made entirely unavailable through Google. Google says it will remove specified URLs, but won’t act so as to ensure that those photographs are entirely unobtainable through Google – principally, according to the BBC article, because in Google’s view Mr Mosley no longer has a reasonable expectation of privacy with respect to those photographs.

The case has the potential to be a very interesting test of the boundaries of privacy rights under the DPA in a post-Google Spain world.

Damages under the DPA

Second, staying with Google, the Court of Appeal will continue its consideration of the appeal in Vidal-Hall and Others v Google Inc [2014] EWHC 13 (QB) in February. The case concerns complaints about personal data gathered through Apple’s Safari browser. Among the important issues raised by this case is whether, in order to be awarded compensation for a DPA breach, one has to establish financial loss (as has commonly been assumed). If the answer is no, this could potentially lead to a surge in DPA litigation.

The General Data Protection Regulation: where are we?

I did a blog post last January with this title. A year on, the answer still seems to be that we are some way off agreement on what the new data protection law will be.

The latest text of the draft Regulation is available here – with thanks to Chris Pounder at Amberhawk. As Chris notes in this blog post, the remaining disagreements about the final text are legion.

Also, Jan Philipp Albrecht, the vice-chairman of the Parliament’s civil liberties committee, has reportedly suggested that the process of reaching agreement may even drag on into 2016.

Perhaps I will do another blog post in January 2016 asking the same ‘where are we?’ question.

Robin Hopkins @hopkinsrobin

How to apply the DPA

January 15th, 2015 by Robin Hopkins

Section 40 of FOIA is where the Freedom of Information Act (mantra: disclose, please) intersects with the Data Protection Act 1998 (mantra: be careful how you process/disclose, please).

When it comes to requests for the disclosure of personal data under FOIA, the DPA condition most commonly relied upon to justify showing the world the personal data of a living individual is condition 6(1) from Schedule 2:

The processing is necessary for the purposes of legitimate interests pursued by the data controller or by the third party or parties to whom the data are disclosed, except where the processing is unwarranted in any particular case by reason of prejudice to the rights and freedoms or legitimate interests of the data subject.

That condition has multiple elements. What do they mean, and how do they mesh together? In Goldsmith International Business School v IC and Home Office (GIA/1643/2014), the Upper Tribunal (Judge Wikeley) has given its view. The decision is here: Goldsmiths. It comes in the form of the Tribunal’s endorsement of the following propositions, the first seven of which were submitted by the ICO (represented by 11KBW’s Chris Knight).

Proposition 1: Condition 6(1) of Schedule 2 to the DPA requires three questions to be asked:

(i) Is the data controller or the third party or parties to whom the data are disclosed pursuing a legitimate interest or interests?

(ii) Is the processing involved necessary for the purposes of those interests?

(iii) Is the processing unwarranted in this case by reason of prejudice to the rights and freedoms or legitimate interests of the data subject?

Proposition 2: The test of “necessity” under stage (ii) must be met before the balancing test under stage (iii) is applied.

Proposition 3: “Necessity” carries its ordinary English meaning, being more than desirable but less than indispensable or absolute necessity.

Proposition 4: Accordingly the test is one of “reasonable necessity”, reflecting the European jurisprudence on proportionality, although this may not add much to the ordinary English meaning of the term.

Proposition 5: The test of reasonable necessity itself involves the consideration of alternative measures, and so “a measure would not be necessary if the legitimate aim could be achieved by something less”; accordingly, the measure must be the “least restrictive” means of achieving the legitimate aim in question.

Proposition 6: Where no Article 8 privacy rights are in issue, the question posed under Proposition 1 can be resolved at the necessity stage, i.e. at stage (ii) of the three-part test.

Proposition 7: Where Article 8 privacy rights are in issue, the question posed under Proposition 1 can only be resolved after considering the excessive interference question posed by stage (iii).

The UT itself added Proposition 8, confirming that the oft-cited cases on condition 6(1) are consistent with each other: The Supreme Court in South Lanarkshire did not purport to suggest a test which is any different to that adopted by the Information Tribunal in Corporate Officer.

Those who are called upon to apply condition 6(1) will no doubt take helpful practical guidance from that checklist of propositions.

Robin Hopkins @hopkinsrobin

Campaigning journalism is still journalism: Global Witness and s.32 DPA

December 23rd, 2014 by Peter Lockley

In an important development in the on-going saga of Steinmetz and others v Global Witness, the ICO has decided that the campaigning NGO is able to rely on the ‘journalism’ exemption under s.32 of the Data Protection Act 1998 (DPA).

The decision has major implications for journalists working both within and outside the mainstream media, not least because it makes clear that those engaged in campaigning journalism can potentially pray in aid the s. 32 exemption. Importantly, it also confirms that the Article 10 right to freedom of expression remains a significant right within the data protection field, notwithstanding recent developments, including Leveson and Google Spain, which have tended to place privacy rights centre-stage (Panopticons passim, maybe even ad nauseam).

Loyal readers will be familiar with the background to the Global Witness case, for which see the original post by Jason Coppel QC.

In brief: Global Witness is an NGO which reports and campaigns on natural resource related corruption around the world. Global Witness is one of a number of organisations which has been reporting on allegations that a particular company, BSG Resources Ltd (“BSGR”), secured a major mining concession in Guinea through corrupt means. A number of individuals who are all in some way connected with BSGR (including Benny Steinmetz, reported to be its founder) brought claims against Global Witness under the DPA. The claims included a claim under s. 7 (failure to respond to subject access requests); s. 10 (obligation to cease processing in response to a damage and distress notification); s. 13 (claim for compensation for breach of the data protection principles) and s. 14 (claim for rectification of inaccurate data). Significantly, Mr Steinmetz alleged, amongst other things, that because he was personally so closely connected to BSGR, any information about BSGR amounted to his own personal data. If successful, the claims would have the effect of preventing Global Witness from investigating or publishing further reports on the Guinea corruption controversy.

Global Witness’s primary line of defence in the High Court proceedings was that all of the claims were misconceived because it was protected by the ‘journalism’ exemption provided for by s. 32 of the DPA. After a procedural spat in March (Panopticon report here), Global Witness’s application for a stay of the claims under s. 32(4) DPA was allowed by the High Court. The matter was then passed to the ICO for a possible determination under s.45 DPA. (In summary, such a determination will be made if the ICO concludes, against the data controller, either: (a) that the data controller is not processing the personal data only for the purposes of journalism or (b) it is not processing the data with a view to future publication of journalistic material).

In fact, the ICO declined to make a determination under s. 45. Moreover, he decided that, with respect to the subject access requests made by the claimants, Global Witness had been entitled to rely on the exemption afforded under s. 32. In reaching the latter conclusion, the ICO held that there were four questions which fell to be considered:

(1) whether the personal data is processed only for journalism, art or literature (s.32(1))

When dealing with this question, the ICO referred to his recent guidance Data Protection and journalism: a guide for the media, in which he accepted that non-media organisations could rely on the s.32 exemption, provided that the specific data in question were processed solely with a view to publishing information, opinions or ideas for general public consumption (p.30). He went on to conclude that this requirement could be met even where the publication is part of a wider campaign, provided that the data is not also used directly for the organisation’s other purposes (e.g. research or selling services). The ICO was satisfied that this condition was met for the data in question.

(2) whether that processing is taking place with a view to publication of some material (s.32(1)(a))

It is apparent from the decision letter that Global Witness was able to point to articles it had already published on the Simandou controversy and, since the controversy was on-going, to show that it intended to publish more such articles. The ICO was satisfied that, in the circumstances, this second question should be answered in the affirmative.

(3) whether the data controller has a reasonable belief that publication is in the public interest (s.32(1)(b))

The ICO emphasised that the question he had to ask himself was not whether, judged objectively, the publication was in the public interest, but rather whether Global Witness reasonably believed publication was in the public interest. In the circumstances of this case – small NGO shines a spotlight on activities of large multinational in one of the world’s poorest countries amid allegations of serious corruption – he readily accepted that Global Witness held such a belief, particularly as the data related to the data subjects’ professional activities, for which they in any event had a lower expectation of privacy than in relation to their private lives.

(4) whether the data controller has a reasonable belief that compliance is incompatible with journalism. (s.32(1)(c))

Again, the focus here was on Global Witness’ reasonable beliefs. The ICO accepted that Global Witness had reasonable concerns that complying with the subject access requests which had been made by the claimants would prejudice its journalistic activity in two ways: first, by giving the data subjects advance warning of the nature and direction of Global Witness’ investigations, which could be used to thwarting effect; and second, by creating an environment in which the organisation’s sources might lose confidence in Global Witness’ ability to protect their identities.

The decision will no doubt substantially reassure campaigning and investigative journalists everywhere. Unsurprisingly, it has been widely reported in the media (see e.g. Guardian article, Times article and FT article here). Notably, the FT reports that the claimants intend to challenge the decision. We will have to wait until the New Year to discover whether that intention translates into action and, if it does, what form that action will take.

Anya Proops of 11KBW acts for Global Witness.

Peter Lockley

Monetary penalty for marketing phonecalls: Tribunal upholds ‘lenient’ penalty

December 16th, 2014 by Robin Hopkins

A telephone call made for direct marketing purposes is against the law when it is made to the number of a telephone subscriber who has registered with the Telephone Preference Service (‘TPS’) as not wishing to receive such calls on that number, unless the subscriber has notified the caller that he does not, for the time being, object to such calls being made on that line by that caller: see regulation 21 of the Privacy and Electronic Communications (EC Directive) Regulations 2003, as amended (‘PECR’).

The appellant in Amber UPVC Fabrications v IC (EA/2014/0112) sells UPVC windows and the like. It relies heavily on telephone calls to market its products and services. It made nearly four million telephone calls in the period May 2011 to April 2013, of which approximately 80% to 90% were marketing calls.

Some people complained to the Information Commissioner about these calls. The Commissioner found that the appellant had committed serious PECR contraventions – he relied on 524 unsolicited calls made in contravention of PECR. The appellant admitted that it made 360 of the calls. The appellant was issued with a monetary penalty under section 55A of the Data Protection Act 1998, as incorporated into PECR.

The penalty was set at £50,000. The appellant appealed to the Tribunal. Its appeal did not go very well.

The Tribunal found the appellant’s evidence to be “rather unsatisfactory in a number of different ways. They took refuge in broad assertions about the appellant’s approach to compliance with the regulations, without being able to demonstrate that they were genuinely familiar with the relevant facts. They were able to speak only in general terms about the changes to the appellant’s telephone systems that had been made from time to time, and appeared unfamiliar with the detail. They had no convincing explanations for the numerous occasions when the appellant had failed to respond to complaints and correspondence from TPS or from the Commissioner. The general picture which we got was of a company which did as little as possible as late as possible to comply with the regulations, and only took reluctant and belated action in response to clear threats of legal enforcement.”

The Tribunal set out in detail the flaws with the appellant’s evidence. It concluded that “the penalty was appropriate (or, indeed, lenient) in the circumstances, and the appellant has no legitimate complaint concerning its size”.

This decision is notable not only for its detailed critique (in terms of PECR compliance) of the appellant’s business practices and evidence on appeal, but also more widely for its contribution to the developing jurisprudence on monetary penalties and the application of the conditions under section 55A DPA. Thus far, the cases have been Scottish Borders (DPA appeal allowed, in a decision largely confined to the facts), Central London Community Healthcare NHS Trust (appeal dismissed at both First-Tier and Upper Tribunal levels) and Niebel (PECR appeal allowed and upheld on appeal).

The Amber case is most closely linked to Niebel, which concerned marketing text messages. The Amber decision includes commentary on and interpretation of the binding Upper Tribunal decision in Niebel on how the section 55A conditions for issuing a monetary penalty should be applied. For example:

PECR should be construed so as to give proper effect to the Directive which it implements – see the Tribunal’s discussion of the Marleasing principle.

The impact of the ‘contravention’ can be assessed cumulatively, i.e. as the aggregate effect of the contraventions asserted in the penalty notice. In Niebel, the asserted contravention was a specified number of text messages which had been complained about, but the Tribunal in Amber took the view that, in other cases, the ICO need not frame the relevant contravention solely by reference to complaints – it could extrapolate, where the evidence supported this, to form a wider conclusion on contraventions.

Section 55A requires an assessment of the “likely” consequences of the “kind” of contravention. “Likely” has traditionally been taken to mean “a significant and weighty chance”, but the Tribunal in Amber considered that, in this context, it might mean “more than fanciful”, ie, “a real, a substantial rather than merely speculative, possibility, a possibility that cannot sensibly be ignored”.

The “kind” of contravention includes the method of contravention, the general content and tenor of the communication, and the number or scale of the contravention.

“Substantial” (as in “substantial damage or substantial distress”) probably means “more than trivial, ie, real or of substance”. Damage or distress can be substantial on a cumulative basis, i.e. even if the individual incidents do not themselves cause substantial damage or substantial distress.

“Damage” is different to “distress” but is not confined to financial loss – for example, personal injury or property interference could suffice.

“Distress” means something more than irritation.

The significant and weighty chance of causing substantial distress to one person is sufficient for the threshold test to be satisfied.

Where the number of contraventions is large, there is a higher inherent chance of affecting somebody who, because of their particular unusual circumstances, is likely to suffer substantial damage or substantial distress due to the PECR breach.

The Amber decision is, to date, the most developed analysis at First-Tier Tribunal level of the monetary penalty conditions. The decision will no doubt be cited and discussed in future cases.

11KBW’s James Cornwall appeared for the ICO in both Amber and Niebel.

Robin Hopkins @hopkinsrobin