Going Unnoticed

Last week, I came across an interview with Elizabeth Denham on a Canadian website called The Walrus that was published in April. There are some interesting nuggets – Denham seems to out herself as a Remainer in the third paragraph (a tad awkward given that she has only enforced on the other side) and also it turns out that the Commissioner has framed pictures of herself taking on Facebook in her office. More important is the comparison she draws between her Canadian jobs and her current role: “That’s why I like being where I am now,” she says, settling herself at a boardroom table. “To actually see people prosecuted.”

Denham probably wasn’t thinking of the run of legitimate but low-key prosecutions of nosy admin staff and practice managers which her office has carried out in recent months, which means she was up to her old tricks of inaccurately using the language of crime and prosecution to describe powers that are civil (or more properly, administrative). Since GDPR came in, she’s even less likely to prosecute than before, given that she no longer has the power to do so for an ignored enforcement or information notice. I don’t know whether she genuinely doesn’t understand how her powers work or is just using the wrong words because she thinks it makes for a better quote.

Publicity certainly plays a far greater part in the ICO’s enforcement approach than it should. A few months back, I made an FOI request to the ICO asking about a variety of enforcement issues and the information I received was fascinating. The response was late (because of course it was), but it was very thorough and detailed, and what it reveals is significant.

ICO enforcement breaks down into two main types. Enforcement notices are used where the ICO wants to stop unlawful practices or otherwise put things right. Monetary penalties are a punishment for serious breaches. Occasionally, they are used together, but often the bruised organisation is willing to go along with whatever the ICO wants, or has already put things right, so an enforcement notice is superfluous. The ICO is obliged to serve a notice of intent (NOI) in advance of a final penalty notice, giving the controller the opportunity to make representations. There is no equivalent requirement for preliminary enforcement notices, but in virtually every case, the ICO serves a preliminary notice anyway, also allowing for representations.

According to my FOI response, in 2017, the ICO issued 8 preliminary enforcement notices (PENs), but only 4 were followed up by a final enforcement notice; in 2018, 5 PENs were issued, and only 3 resulted in a final notice. The ratio of NOIs to final penalties is much closer; in 2017, there were 19 NOIs, and only one was not followed up with a penalty. In 2018, 21 NOIs were issued, 20 of which resulted in a penalty. Nevertheless, the PEN / NOI stage is clearly meaningful. In multiple cases, whatever the controller said stopped the intended enforcement in its tracks. Given how many GDPR ‘experts’ are confused about whether a fine is real or merely proposed, the fact that not every NOI results in a fine is worth noting.

The response shows the risks of neglecting to issue a PEN. In July 2018, the ICO issued Aggregate IQ (AKA AIQ) with the first GDPR enforcement notice (indeed, it was the first GDPR enforcement action altogether). My FOI reveals that it was one of only a few cases where a preliminary notice was not issued. The AIQ EN was unenforceable, ordering them to cease processing any personal data about any UK or EU “citizens” obtained from UK political organisations “or otherwise for the purposes of data analytics, political campaigning or any other advertising purposes”. AIQ was forbidden from ever holding personal data about any EU citizen for any advertising purpose, even if that purpose was entirely lawful, and despite the fact that the GDPR applies to residents, not citizens. AIQ appealed, but before that appeal could be heard, the ICO capitulated and replaced the notice with one that required AIQ to delete a specific dataset, and only after the conclusion of an investigation in Canada. It cannot be a coincidence that this badly written notice was published as part of the launch of the ICO’s first report into Data Analytics. It seems that ICO rushed it, ignoring the normal procedure, so that the Commissioner had things to announce.

The ICO confirmed to me that it hasn’t served a penalty without an NOI, which is as it should be, but the importance of the NOI stage is underlined by another case announced with the first AIQ EN. The ICO issued a £500,000 penalty against Facebook, except that what was announced in July 2018 was the NOI, rather than the final penalty. Between July and October, the ICO would have received representations from Facebook, and as a result, the story in the final penalty was changed. The NOI claims that a million UK Facebook users’ data was passed to Cambridge Analytica and SCL among others for political purposes, but the final notice acknowledges that the ICO has no evidence that any UK users’ data was used for campaigning. As an aside, this means that the ICO has no evidence Cambridge Analytica used Facebook data in the Brexit referendum. The final notice is based on a hypothetical yarn about the risk of a US visitor’s data being processed while passing through the UK, and an assertion that even though UK Facebook users’ data wasn’t abused for political purposes (the risk did not “eventuate“), it could have been, so there. I’ve spent years emphasising that an incident isn’t the same as a breach, but going for the maximum penalty on something that didn’t happen, having said previously that it did, is perhaps the wrong time to listen to me.

If you haven’t read the final Facebook notice, you really should. ICO’s argument is that UK users’ data could have been abused for political purposes even though it wasn’t, and the mere possibility would cause people substantial distress. I find this hard to swallow. I suspect ICO felt they had effectively announced the £500,000 penalty; most journalists reported the NOI as such. Despite Facebook’s representations pulling the rug out from under the NOI, I guess that the ICO couldn’t back down. There had to be a £500,000 penalty, so they worked backwards from there. The Commissioner now faces an appeal on a thin premise, as well as accusations from Facebook that Denham was biased when making her decision.

Had the NOI not been published (like virtually every other NOI for the past ten years), the pressure of headlines would have been absent. Facebook have already made the not unreasonable point in the Tribunal that as the final penalty has a different premise than the NOI, the process is unfair. Without a public NOI, Facebook could have put this to the ICO behind closed doors, and an amended NOI could have been issued with no loss of face. If Facebook’s representations were sufficiently robust, the case could have been dropped altogether, as happened in other cases in both 2017 and 2018. For the sake of a few days’ headlines, Denham would not be facing the possibility of a career-defining humiliation at the hands of Facebook of all people, maybe even having to pay their costs. It’s not like there aren’t a dozen legitimate cases to be made against Facebook’s handling of personal data, but this is the hill the ICO has chosen to die on. Maybe I’m wrong and Facebook will lose their appeal, but imagine if they win and this farrago helps them to get there.

The other revelation in my FOI response is an area of enforcement that the ICO does not want to publicise at all. In 2016, the ICO issued a penalty on an unnamed historical society, and in 2017, another was served on an unnamed barrister. I know this because the ICO published the details, publicly confirming the nature of the breach, amount of the penalty as well as the type of organisation. One might argue that they set a precedent in doing so. What I didn’t know until this FOI request is that there have been a further 3 secret monetary penalties, 1 in 2017 and 2 in 2018. The details have not been published, and the ICO refused to give me any information about them now.

The exemptions set out the ICO’s concerns. They claim that it might be possible for me to identify individual data subjects, even though both the barrister and historical society breaches involved very limited numbers of people and were still published. They also claim that disclosure will prejudice their ability to enforce Data Protection law, using this justification:

“We are relying on this exemption to withhold information from you where the disclosure of that information is held for an ongoing regulatory process (so, we are yet to complete our regulatory process and our intentions could still be affected by the actions of a data controller) or the information is held in relation to sensitive matters and its disclosure would adversely affect relationships which we need to maintain with the organisations involved. It is essential that organisations continue to engage with us in a constructive and collaborative way without fear that the information they provide to us will be made public prematurely, or at a later date, if it is inappropriate to do so. Disclosure of the withheld information at this time would therefore be likely to prejudice our ability to effectively carry out our regulatory function”

The ICO routinely releases the names of data controllers she has served monetary penalties and enforcement notices on without any fears about the damage to their relationship. Just last week, she was expressing how “deeply concerned” she is about the use of facial recognition by the private sector, despite being at the very beginning of her enquiries into one such company. And if maintaining working relationships at the expense of transparency is such a vital principle, how can they justify the publication of the Facebook NOI for no more lofty reason than to sex up the release of the analytics report? They say “It is essential that organisations continue to engage with us in a constructive and collaborative way without fear that the information they provide to us will be made public prematurely”, and yet the Facebook NOI was published prematurely despite the fact that it was a dud. What will that have done to the ICO’s relationship with a controller as influential and significant as Facebook? What incentive do FB have to work with Wilmslow in a constructive and collaborative way now? And if identifying the subjects is an issue, what is to stop the ICO from saying ‘we fined X organisation £100,000’ but refusing to say why, or alternatively, describing the incident but anonymising the controller?

It doesn’t make sense to publicise enforcement when it’s not finished, and it doesn’t make sense to keep it secret when it’s done. Every controller that has been named and shamed by the ICO should be demanding to know why these penalties have been kept secret, while Facebook have every right to demand that the Commissioner account for the perverse and ill-judged way in which she took action against them. Meanwhile, we should all ask why the information rights regulator is in such a mess.

And one final question: did she bring the framed pictures with her or did we pay to get them done?

Lateral Thinking

Last week, I wrote a blog about the ‘personal data agency’ Yo-Da, outlining my concerns about their grandiose claims, the lack of detail about how their service works and their hypocritical decision to ignore a subject access request I made to them. Predictably, this led to further online tussles between myself and Benjamin Falk, the company’s founder and ‘chief talker’. As a result of our final conversation, Yo-Da has effectively disappeared from the internet. Clearly, I touched a nerve.

Yo-Da’s website made concrete claims about what their service did, and in fact had done. There were testimonials from satisfied users, and three case studies. Although it was clear that the service wasn’t operating yet, the testimonials were unambiguous: here is what Yo-Da has done for me. There was no hint that they were fictional, nothing to suggest that the service couldn’t do what the site said.

“Yo-Da systematically and automatically exercises your data rights”

“Use Yo-Da to ask any company in Europe to delete your personal information”

User ‘Samuel’ claimed “Now I go to Yo-Da, search for the company whose (sic) been breached, and with 1-click find out what is happening with my personal information”, while ‘Nathan’ said “Yo-Da was simple to use and helped me understand just how many businesses in Europe have my data.”

None of this is true. Yo-Da do not have a working product that does these things. As Falk put it to me “Our technology is still under development” and “We have some ideas that are working. They aren’t perfect.” I am not saying that Yo-Da aren’t developing an automated data rights service; I’m certain that they are. I’m not saying a product will never launch; I expect that it will and I am looking forward to it, though perhaps not for the same reason as Samuel and Nathan. The point is, it doesn’t exist now and the website said that it did.

Originally, Falk claimed that he had deliberately ignored my subject access request because it was unfounded. ‘Unpleasant’ people like me don’t have data rights, he claimed. This didn’t sound right, especially as after I published my blog, Yo-Da’s DPO (Trilateral Research) suddenly woke up and tried to process my request, as if this was the first they’d heard of it. During our correspondence, they made it clear that they agreed with Falk’s decision that my request was unfounded, but were silent on the decision to ignore it.

But in my argument with Falk, he admitted the truth: “We have an outsourced DPO for a reason; we can’t afford a full time one. That’s why the SAR went ignored; our service isn’t live yet and so we didn’t expect to receive any requests, because we aren’t collecting any personal data on anyone”

In a single tweet, Falk said a lot. He was admitting that all of the testimonials and case studies were fake (he ultimately said to me that they were “obviously fake”). At the same time, he was also not telling the truth. Falk said that the website was a “dummy” to “gauge interest”. In other words, the site exists as an advert for a theoretical service, but its other purpose is to persuade people to sign up to Yo-Da’s mailing list. It was designed to collect personal data. Yo-Da were saying ‘sign up with us to use this service that actually works’. I believe that this is a direct breach of the first GDPR principle on fairness and transparency. I want to know why Trilateral Research acted as a DPO for an organisation that did this.

Falk said that he was joking when he said that he ignored my request on purpose, but Trilateral didn’t acknowledge that. They wrote of a ‘delay’ in acknowledging my request, but concurred with Falk’s decision that my request was unfounded. That decision was never made; my SAR was just missed. Nobody was checking the ‘dpo@yo-da.co’ email account – Falk wasn’t, and neither were they, despite being the putative DPO. Either they didn’t know what had happened, or they didn’t care. They definitely backed up their client rather than digging into why a SAR had been received and ignored on spurious grounds without their involvement. Let’s be generous and assume that they didn’t know that Falk was bullshitting. Their client had taken a controversial and disputable decision in a SAR case without consulting them first, but they didn’t acknowledge that. They backed the ‘unfounded’ refusal.

Even if Yo-Da one day launches a product that successfully facilitates automated data rights requests to every company in Europe (prediction: this will never happen), they definitely don’t have that product now, and their website claimed that they did. Either Trilateral didn’t know that this was the case, which means that they failed to do basic due diligence on their client, or they knew that the Yo-Da website was soliciting personal data on the basis of false claims.

When I pointed out to Falk that all of the sign-up data had been collected unlawfully (it’s not fair and transparent to gather data about a service that doesn’t exist), the conversation ended. The Yo-Da website instantly vanished, and their Twitter account was deactivated minutes later. I’m certain that Falk will be back, his little spat with me considered to be no more than a bump in the road to world domination. But forget him; what does this say about Trilateral? The best defence I can think of is that they took Falk’s money to be in-name-only DPO but didn’t scrutinise the company or their claims. This is bad. If they had any idea that Yo-Da doesn’t currently do what the website claimed, it’s worse.

According to the European Data Protection Board, the professional qualities that must be demonstrated by a Data Protection Officer include “integrity and high professional ethics”. I seriously question whether Trilateral have demonstrated integrity and high professional ethics in this case. It’s plainly unethical to be named as DPO for an organisation, and then ignore what comes into the DPO email address. Article 38(4) of the GDPR states “Data subjects may contact the data protection officer with regard to all issues related to processing of their personal data and to the exercise of their rights under this Regulation” but Trilateral weren’t even listening. It’s unethical to take on a client without knowing in detail how their services work (or even whether their services work), and that’s the only defence I can see in this case. It’s unethical to be DPO for an organisation that is making false or exaggerated claims to obtain personal data.

I regularly get asked by clients if I can recommend an outsourced DPO or a company who can do the kind of sustained consultancy work that a solo operator like me doesn’t have the capacity for. There are a few names I’m happy to give. I have no hesitation in saying that on the basis of this shoddy episode, I wouldn’t touch Trilateral Research with a bargepole.

The Curse of the Padlock

One of the dangers of working in Data Protection is the risk of becoming a pedant. Precision matters; court cases have turned on the meaning of individual words like ‘likely’ and ‘distress’. The legislation is a maze of definitions and concepts that the competent practitioner needs to get to grips with. Lazy thinking can be revealed by an inability to get the details right, so it’s possible to become obsessed with the detail. Even the BCS Data Protection exam has a question which requires you to list the elements of the definition of consent in the right order. It’s easy to lapse into pedantry, to point out every wrongly quoted article, every jumbled phrase.

Nevertheless, getting a simple thing right is often important. GDPR does not cover ‘personal identifiable information’; it covers ‘personal data’ and the definition of the two is not the same. A person who talks about PII in the context of European Data Protection is starting in the wrong place (the US), and can make mistakes as a result. Another error that seems to be creeping in all over the place is more profound, and risks entrenching one of the biggest misconceptions about how data protection works, a misconception many of us have spent years trying to break down.

The problem is the phrase ‘data privacy’.

I see it everywhere – on LinkedIn naturally, in news coverage of the sector, and predictably, the ICO has fallen for it. They describe themselves as “The UK’s independent authority set up to uphold information rights in the public interest, promoting openness by public bodies and data privacy for individuals.” Look at the Data Privacy Advisory Service, who summarise their services as “At DPAS we help organisations safeguard the fundamental human right to have data kept private by putting in place the best possible protection to keep it secure. This is delivered in line with the General Data Protection Regulation (GDPR) and The Data Protection Act 2018.”

The idea is nonsense. It doesn’t exist. There is no right to data privacy – there is certainly no fundamental right ‘to have data kept private’. This isn’t a snide dig at someone quoting the wrong article. The concept of ‘data privacy’ is a complete misunderstanding of what Data Protection is for, and everyone who promotes it is actively thwarting the efforts of the rest of us to implement data protection in a practical way.

Article 8 of the European Convention on Human Rights says: “Everyone has the right to respect for his private and family life, his home and his correspondence”. This right is not absolute; it can be interfered with (only when necessary) in the interests of “national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others”. The right is not just about data – it certainly can be, as is evidenced by cases where celebrities and others use the privacy right to prevent the use of images that breach their right to privacy. But the right to privacy doesn’t have to be about data at all – you can breach a person’s right to privacy by simply observing them, by being in a place where they expect privacy, or by denying them the opportunity to do something privately. Data doesn’t have to come into it.

Clearly, if you did a Venn diagram, there would be circumstances where privacy and data protection overlap. By following the Data Protection principles when you handle a person’s private correspondence for example, you probably also do what’s necessary to protect their privacy. The same is true for confidentiality – not all confidential data is personal data, but a decent stab at the principles will probably respect both. There is, however, a significant portion of the Venn diagram where Data Protection and Privacy do not meet, and the DP part of that is important.

The notion of ‘Data Privacy’ obscures two vital elements of Data Protection. First, data protection is not only about private data. It covers all personal data, private, secret, and public. For years, I have been banging my head against the brick wall of ‘it’s not personal data, it’s in the public domain’. It has long been a struggle to explain to people that photographs, email addresses and other publicly available data are still personal data, just more available and easier to use than some other data. There was a chink of light in Article 14 of the GDPR, which clearly states that a person should be informed even when their data is accessed from ‘publicly accessible sources’. This explicit recognition that public data is still personal data is very helpful, but the notion that ‘data protection’ and ‘data privacy’ are interchangeable muddies the waters again.

Second, in related news, GDPR is not about keeping data private; it is about ensuring that personal data processing is properly regulated. For years, Data Protection has been plagued by the padlock. The Information Commissioner used it as a logo (‘but the padlock is unlocked’ is a defence that umpteen different ICO folk have used when I complained about it), and when I did a Google image search for ‘Data Protection’ today, this is the top set of results:

[Screenshot: Google image search results for ‘Data Protection’, 26 May 2019]

The problem with the Data Protection Padlock is that it presents the legislation as something that locks data up, keeps it away from people. This understanding of data protection leads directly to the belief that disclosure of personal data is inherently problematic and exceptional, and that belief is toxic. I’m not persuaded that Victoria Climbie or Peter Connelly died solely because data about them wasn’t shared, but the pervasive fear of data sharing didn’t help. The GDPR says that ‘the protection of natural persons in relation to the processing of personal data is a fundamental right‘. The word ‘privacy‘ isn’t mentioned anywhere beyond a reference in a footnote to the ePrivacy Directive, and the processing of personal data is firmly put in the context of operating the EU’s internal market: “This regulation is intended to contribute to the accomplishment of an area of freedom, security and justice, and of an economic union“.

You can’t achieve the economic union by locking all the data away, by keeping it private. To characterise data protection law as being about ‘data privacy’ is to misrepresent its purpose completely. European Data Protection is a compromise – trade is underpinned by the use, even the exploitation of personal data, but people have rights, they have control over their data in some (but not all) circumstances, and the legislation is built on foundations of transparency and fairness, not privacy. Arguably, the GDPR tries to even up the power imbalance in some circumstances, but it is not designed to lock up data and keep it private.

Of course, some people might be using ‘privacy’ as a synonym for ‘secure’ – the DPAS statement above seems to elide the two. Only a fool would want to play down the importance of security in the context of using any personal data, but the reduction of Data Protection solely to security is as destructive to a proper understanding of it as the privacy / protection mess. We’ve managed to drag Data Protection out of the IT department, and we need to stamp on this idea that security is the exemplar of good DP practice. Your data can be private and secure, but kept for no good reason, for too long, in an inaccurate state, and there could be too much of it.

Some personal data is private and should remain so. In many situations, the processing of personal data without an eye on people’s legitimate expectations of privacy, especially when monitoring, watching or listening to them, is likely to be unfair and so unlawful. There is a strong link between Data Protection and Privacy, and any attempt to divorce them would be stupid. But the use of ‘data privacy’ as a synonym for data protection is misleading and dangerous – it perpetuates a fundamental misreading of what the legislation is for, and makes the lives of everyone trying to make GDPR work effectively a thousand times harder. It’s time to take this nonsense, lock it up and throw away the key.

Home, James

A few months ago, I wrote a blog about data protection and nonsense, highlighting inaccurate claims made by training companies, marketers and pressure groups. A bad-tempered spat ensued in comments on LinkedIn between myself and Russell James, the marketer behind the lobbying attempt to change the ICO’s funding model to include cost recovery. James insisted that it didn’t matter that a letter sent by four MPs to the DCMS asking for the change, apparently at his instigation, contained inaccurate claims (the description of DP breaches as ‘crimes’) and embarrassingly got the name of the Information Commissioner wrong (it’s the Independent Commissioner of Information, according to the distinguished Parliamentarians, or whoever actually wrote it).

I asked James what the Information Commissioner’s Office themselves thought of his plan to allow the ICO to recoup the costs of investigations from those “found guilty of data crimes” (which I think means those who are on the receiving end of enforcement from Wilmslow, although it’s hard to be 100% certain). The idea that someone would persuade MPs to lobby the ICO’s sponsor department to change their funding mechanism without at least the tacit approval of the Commissioner or her staff seemed ridiculous, but the normally prolix Mr James was silent on the matter. So I decided to ask the Information Commissioner.

I made an FOI request for all of the following information:
1) Any recorded information about approaches made by Russell James or others to the ICO about the idea of the ICO adopting a cost-recovery model, including any correspondence with Mr James or his associates.
2) Any responses provided to James or others about the ICO adopting a cost-recovery model.
3) Any correspondence with Tom Tugendhat, Yvette Cooper, Dominic Grieve or Damian Collins, or their staff about the idea of a cost-recovery model, or the letter sent to the DCMS
4) Any internal discussion of the cost-recovery model.
5) Any correspondence, notes of meetings or other records of meetings between Mr James and any ICO member of staff, including the names of the staff. (this was subsequently clarified to cover only the cost recovery model, and not any other correspondence Mr James might have had with the ICO.)

Whatever the ICO made of Mr James’ ambitious plan, I was certain that this request would capture their thoughts. At worst, the ICO might refuse to disclose their internal discussions of the idea, but at least I might get some sense of the extent of them.

The ICO provided me with three paragraphs from a letter sent to them by Mr James around the time the MPs wrote to the DCMS. James told me that the ICI letter was written by the office of Tom Tugendhat, but this one was remarkably similar in tone, and had the same lack of understanding of how the Data Protection enforcement regime works. James told the ICO that they were about to “leverage significant revenue“. Greatly increased income for the DCMS, via the huge sums paid to them in GDPR fines, would, James asserted, result in much more cash for Wilmslow. This would sound great, were it not for the fact that the ICO hasn’t issued a single penalty under the GDPR yet. More importantly, he is confused about what happens to the penalties, and how the ICO is funded. DP penalties have always been paid into the Treasury’s consolidated fund, bypassing the DCMS altogether. Moreover, the ICO doesn’t receive any funding from the DCMS for its Data Protection work. As this document (freely available on the ICO’s website) states, all the ICO’s DP work is paid for by DP fees collected from Data Controllers, as has been the case for many years. The ICO could issue a CNIL-style €50 million penalty every week, and neither they nor the DCMS would see a cent of it.

James also claims in his letter that his campaign has “ministerial support from government officials“; I don’t know whether he’s claiming the support of ministers, or the support of government officials, but the phrase itself sounds like it was written by someone who doesn’t know the difference between the two. I’d ask him which it was, but I sent him a single direct message asking for comments before publishing the last blog I wrote about this issue. He ignored me, but later pretended that I had deluged him with many such messages. If Tugendhat hadn’t tweeted the ICI letter, I’d think it was fake.

Whatever the shortcomings of Mr James’ insights into Data Protection (when I told him I was making an FOI request about his plan, he thought it was the same as a SAR), his confidence in the success of the James Tax is hard to fault. According to him, it is now “a short time before your department (ICO) will have a more resilient financial footing“. Given this thrilling news, one can only speculate as to how excited the fine folk of the ICO would be at the impending cash bonanza.

Alas, apart from a copy of the ICI letter, which the ICO sensibly chose not to provide to me as it was plainly in the public domain, they held no data about the James Tax. None. Nothing. Nada. Indeed, they made a point of telling me: “For clarity, I can confirm that we do not hold any information which falls within the scope of the other parts of your request“.  This means that they did not have any recorded discussions about it, share the letter internally, or even reply to that part of Mr James’ letter. If anyone had anything to say about the James Tax, they didn’t want to write it down.

Mr James has set himself up as the doughty defender of “Liz and the crew” as he once described his surprisingly reticent friends in Wilmslow to me. He has launched a campaign to change the law and roped four highly respectable MPs in to support it. I think it is reasonable to ask whether someone with such a misbegotten understanding of how Data Protection works is the right person to change it. Given that the ICO has seemingly offered no support, not even a comment on his plan, I assume that they do not welcome the idea. It’s not hard to imagine why – calculating the costs of an investigation is extra work and bureaucracy. Moreover, if the ICO is entitled to claim the costs of victory, surely it should be forced to foot the bill for defeat – every time the ICO’s enforcement team’s investigation results in no action, the ICO should contribute to the cost of the time the controller spent answering the many letters and information notices for which the office is celebrated.

If a case goes to appeal, while the James Tax would presumably allow the costs of going to the Tribunal to be recouped if successful, for fairness’ sake, the same logic must apply the other way around. If the Tribunal vindicates the ICO’s target (and losses at the Tribunal are not unknown, especially in recent times), presumably the ICO would have to pay the legal bills too. There are already financial incentives and advantages for the Commissioner. If the ICO issues a financial penalty, the controller gets a 20% discount if they choose not to appeal. If a controller pursues a truly hopeless appeal, the Tribunal and the courts above can award costs against the recalcitrant data controller. Any further change to the relationship should not run solely in the ICO’s favour.

If the James Tax includes recouping the costs of dealing with appeals (and my arguments with him on LinkedIn suggest that it does), this will also have a negative effect on one of the most important parts of the DP enforcement system. Any controller who has been fined will, under the James Tax, already face the added cost of the ICO’s investigation. Appealing – already a roll of the dice in many cases – will be that much more of a risk. As well as their own costs, controllers will have to factor in the additional ICO tally.

We already have Denham grumbling about appeals, even using a speech by Mark Zuckerberg about possible regulation in the US as an excuse to demand that he drop his appeal against the Facebook fine in the UK. James’ ideas might further suppress the possibility of appealing against ICO decisions. For everyone involved in the sector, this would be a disaster. To borrow James’ inaccurate criminal characterisation of DP enforcement, the ICO is already the investigator, prosecutor and judge – I don’t want to strengthen that hand any more. Moreover, in the interview above, Denham signalled disdain for the concerns of ordinary people, stating that they don’t complain about the right things. As part of its analytics investigation, the ICO has enforced on cases where there have been no complaints. Denham’s ICO needs to be challenged, and challenged regularly. The tribunals and the courts frequently give detailed and helpful explanations of how the law works – the ICO never produced guidance on consent as useful as the Tribunal’s decision in Optical Express, and whether the ICO wins or loses, all sorts of insights are available in Tribunal decisions.

Nobody appeals lightly. Combine Denham’s hostility to challenge with the James Tax, and we might lose vital opportunities for debate and caselaw. You can dismiss this blog as just an opportunity for me to take the piss out of another GDPR certified professional, but James has set himself up as a public campaigner. He wants to change how the ICO is funded and how all controllers are potentially treated. This cannot just pass without scrutiny, especially as he appears to lack both an understanding of the system he wants to change, and the support of the regulator whose powers he wants to alter. If the people arguing for changes don’t even think it’s important what the ICO is called or whether it’s a ‘department’ or not, we should wonder what other important details they have missed.

Head in the Sandbox

The Information Commissioner’s Office recently held a workshop about their proposed Regulatory Sandbox. The idea of the sandbox is that organisations can come to the ICO with new proposals in order to test out their lawfulness in a safe environment. The hoped-for outcome is that products and services emerge that are both innovative and compliant.

There is no mention of a sandbox process in the GDPR or the DPA 2018. There is a formal mechanism for controllers to consult the ICO about new ideas that carry high risk (prior consultation), but the circumstances where that happens are prescribed. It’s more about managing risk than getting headlines. Unlike Data Protection Impact Assessments, prior consultation or certification, the design and operation of the sandbox is entirely within the ICO’s control. It is important to know who is influencing its development, especially as the sandbox approach is not without risk.

Although Mrs Denham is not above eye-catching enforcement when it suits her, the ICO is often risk averse, and has shown little appetite for challenging business models. For example, the UK’s vibrant data broking market – which is fundamentally opaque and therefore unlawful – has rarely been challenged by Wilmslow, especially not the bigger players. They often get treated as stakeholders. The sandbox could make this worse: big organisations will arrive with their money-making wheezes, and it’s hard to imagine ICO staff telling them that they can’t do what they want. The sandbox could leave the ICO implicated in dodgy practices that it approved, or at least failed to prevent, to avoid the awkwardness of saying no.

Even if you disagree with me about these risks, it’s surely a good thing that the ICO is transparent about who is having an influence on the process. So I made an FOI request to the ICO, requesting the names and companies or organisations of those who attended the meeting. As is tradition, they replied on the 20th working day to refuse to tell me. According to Wilmslow, disclosure of the attendees’ identities is exempt for four different reasons: transparency would prejudice the ICO’s ability to carry out its regulatory functions; disclosure of the attendees’ names would be a breach of data protection; revealing the names of the organisations would cause them commercial damage; and finally, the information was supplied with an expectation of confidentiality, so disclosure would breach that duty.

These claims are outrageous. DPIAs and prior consultation exist, underpinned both by the law and by European Data Protection Board guidance. Despite the obvious benefits of developing a formal GDPR certification process (both allowing controllers to have their processing assessed, and the creation of a new industry at a time when the UK needs all the economic activity it can get), the ICO’s position on certification is supremely arrogant: “The ICO has no plans to accredit certification bodies or carry out certification at this time“. A process set out in detail in the GDPR is shunned, with the ICO choosing instead to spend huge amounts of time and money on a pet project which has no legal basis. Certification could spread expertise across the UK; the sandbox will inevitably be limited to preferred stakeholders. If they’re hiding the identities of those who show up to the workshop, it’s hard to imagine that the actual process will be any more transparent.

The ICO’s arguments about commercial prejudice under S43 of FOI are amateurish: “To disclose that a company has sent delegates to the event may in itself indicate to the wider sector and therefore potential competitors that they are in development of, or in the planning stages of a new innovative product which involves personal data“. A vital principle of FOI is that when using a prejudice-based exemption, you need to show cause and effect: disclosure must lead, or be likely to lead, to the harm described. How on earth could a company lose money, or become less competitive, purely because it was revealed that they attended an ICO event (which is what using S43 means)?

The ICO’s personal data and confidentiality arguments are equally weak – everyone who attended the meeting would know the identities of everyone else, and all were acting in an official or commercial capacity. This was not a secret or private meeting about a specific project; anyone with an interest was able to apply to attend. Revealing their attendance is not unfair, and there is plainly a legitimate interest in knowing who the ICO is talking to about a project into which the office is putting significant resources, and which will have an impact on products or services that may affect millions of people. The determination to hide this basic information and avoid scrutiny of the sandbox process undermines the credibility of the project itself, and makes the ICO’s claim to be an effective defender of public sector transparency ever more hypocritical.

Worst of all, if disclosure of the attendees’ identities was the calamity for commercial sensitivity and personal data that the ICO claims it to be, there should be an immediate and thorough investigation of how the information I requested came to be revealed on the ICO’s website and Twitter account. The entire event was recorded and a promotional video was released. Several attendees (whose names and companies I cannot be given because of confidentiality, data protection and commercial prejudice) are identified and interviewed on camera, while there are numerous shots of other attendees who are clearly identifiable. Either the ICO has betrayed the confidentiality and personal data rights of these people, putting their companies at direct commercial risk, or their FOI response is a cack-handed attempt to avoid legitimate scrutiny. Either way, I strongly recommend that the left hand and the right hand in Wilmslow make some rudimentary attempts to get to know one another.

Long ago, I was one of a number of online commentators described by the ICO’s comms people as a ‘driver of negative sentiment’. More recently, one of Denham’s more dedicated apologists accused me of being one of the regulator’s “adversaries”. I’m not a fan of the ICO, and I never have been. But this stinks. The determination to throw every conceivable exemption at a simple request to know who the ICO is talking to suggests that the office is afraid of scrutiny, afraid of having to justify what they’re doing and how they’re doing it. The incompetence of refusing to give me information that is on display on their website and Twitter account shows contempt for their obligations as an FOI regulator. The ICO has its head in the sand; as we drift out of the European mainstream into a lonely future on the fringes, their secrecy and incompetence should be matters of concern for anyone who cares about Data Protection.