A Boy’s Best Friend is his Data

Just over a month ago, I enjoyed a series of bad-tempered Twitter exchanges with Benjamin Falk, Founder and “Chief Talker” of the personal data outfit Yo-Da. Falk has an interesting perspective on Data Protection. Instead of coming to DP through the traditional routes of information management, security, governance or the law, Falk is an ‘information economist’. He doesn’t see the subject as an issue of human rights, instead looking at it through the prism of economics. Because Data Protection is concerned with information, and there are other contexts where information is a commodity traded in a market, Falk has had the revelation that the processing of personal data is just another market, and this is the only way to understand it. Falk perceives this market as a ‘dumpster fire’, and he alone has the solution. He has founded what he calls the “world’s first Personal Data agency” and hopes to lure people into signing up for an ill-defined service that he asserts will put them in control of their information. Somewhere along the way, money will be made.

Falk has some eye-catching ways to explain the ‘market’ he seeks to disrupt:

“personal data is best understood as a newspaper that we publish about ourselves, whether we like it or not”.

Sometimes, he thinks personal data is “a really really boring autobiography, it’s just information about yourself written down somewhere”.

Falk’s view of data subjects is that they are “an author with an information rights management problem”.

I can imagine that if a person had, say, an AI program and they had to persuade gullible investors to buy into a wheeze that hadn’t really been worked out properly, this kind of eye-catching guff might get them going. However, it’s nonsense. Most personal data isn’t published or created for public consumption like a newspaper (indeed, many people have laboured for years under the misapprehension that personal data in the public domain isn’t personal data at all). Equally, a lot of personal data doesn’t fit Falk’s favourite analogy of a ‘robo-biography’ because it is generated by people and not machines. You can’t simplify a million different controllers doing things for themselves in a million different ways. It’s complicated.

As Yo-Da’s website says, users will be able to “discover, fetch, control and erase” personal data from “any company operating in Europe”. However, the first thing you see on Yo-Da’s homepage is the following: “who earns from your personal data? everyone but you”. Falk also wants people to monetise their data. There’s not much detail however, making me wonder whether Falk has got this far by saying ‘AI’ a lot without a clear idea of how that will translate to the power he claims to put in subjects’ hands. After all, in order to work, Yo-Da needs to be able to successfully obtain and amalgamate data held on millions of different systems, in thousands of formats, processed for a host of different reasons by a multitude of businesses as varied as Apple, Tesco, 2040 Training and the Friendly Furry Shop. I’d like to see this in action.

The idea of individuals monetising their data is common to survey platforms like YouGov and CitizenMe, while Paul-Olivier Dehaye has been touting the automation of SARs and other data rights for years. A mock-up of the Yo-Da app shows data obtained from Starbucks (including how many coffees the user has drunk) with a suggestion at the bottom that this data be combined with information from Transport for London or NHS England. Rather than selling data at scale like most data brokers, Yo-Da seems to encourage subjects to obtain vast quantities of data about themselves (the app shows a user having obtained data from 1200 companies) to create a “rich personal database” which presumably the user will then sell with Yo-Da’s assistance.

Falk’s ambitions are not limited to data monetisation. Yo-Da, he claims, will stop subjects’ rights from being infringed. The ‘dumpster fire’ of poor data protection practice in the UK is the fault of greedy consultants like me who ensure that our clients don’t actually comply with the law so we can keep charging them. Like Hercules diverting rivers to sluice the Augean stables, Falk’s tweets demonstrated a belief that Yo-Da will wipe Data Protection clean. Solving DP’s many problems is “easy to do”, he says, it’s just that nobody has actually tried (take that, Liz Denham). I don’t see how, but even if you believe that Yo-Da’s data jumble sale could change the face of DP forever, it can surely only do so if millions of people participate. Even if thousands of people sign up, Yo-Da will barely scratch the surface of how much personal data is processed across the UK and the EU. People will still be obliged to provide their data to pay their taxes, claim their benefits, use the NHS, set up a bank account or a mortgage, or be employed. The ability to get a slice of your data (it won’t be all of it) and possibly hawk it to dodgy data brokers (about the only people who I can imagine might buy it) won’t change that, and would do nothing to stop DeepMind, lost discs, the Met Police’s use of facial recognition or hospitals letting TV companies film vulnerable women without consent.

Moreover, just imagine how Yo-Da could go wrong if it actually works. At the moment, the fact that the different aspects of your life are often held in silos is wholly to your advantage from a data protection perspective. Capitalism is trying to connect the various loose ends of your life, but there are limits. As a middle-aged man with middling health, the NHS doesn’t know how often I drink coffee at Starbucks, or how regularly I get the Tube in London rather than using a TfL bike (I would like to confirm to my GP that I never drink coffee in Starbucks and I have only used the Tube once this year on my many visits to the capital). But what else could be added? Could Yo-Da include how many orders from Beers of Europe I make? How often I go to SoLita for a burger? Yo-Da is selling a seductive idea – one might almost paraphrase it as ‘take back control’ – but it probably contains the same risk of unintended consequences as that rancid propaganda. Falk positions his company as the saviour of privacy rights, but he’s encouraging people to conspire in their own exploitation by creating an intrusive and potentially prejudicial data cocktail and then flogging it to the highest bidder.

I’m ignoring the practical problem that the key to driving his plan is subject access requests, and SARs rarely provide a seamless, rich repository of information, ready to be amalgamated and exploited. SAR disclosures are often messy and incomplete, a patchwork left behind by the removal of third parties and exempt data, and often delivered in PDFs. Only data supplied directly to the controller by the subject or obtained under observation has to be supplied in a portable form. There are legitimate reasons to refuse requests altogether. Falk has asserted repeatedly that “ownership and rights mean the same thing”, and so subjects own their data, but this won’t be any help to his business model. Subjects own the copy of the data that they receive from their SAR, but that doesn’t give them automatic access to any and all data held. They don’t own the data held by the controller. The promises of control and erasure made on the Yo-Da website are embarrassingly simplistic – you can’t object to a controller processing your data under contract or legal obligation, or ask them to erase that data. A controller can resist an erasure request where they need to establish, exercise or defend a legal claim. Only someone who doesn’t understand how limited the GDPR rights of objection are would make the grandiose claim that “Yo-Da… lets you control who processes your information”. No, it doesn’t. It never will, because the GDPR doesn’t do that.

I think Falk’s claims are hype and his understanding of data protection is fundamentally flawed. Moreover, I don’t trust him. During the period that I spent arguing with the Yo-Da Twitter account, it became clear that I wasn’t just dealing with one person. There were two distinct personalities, inverted versions of the dual identities in Hitchcock’s Psycho. The Norman Bates character – relentlessly polite no matter what the provocation, endlessly ingratiating – is fake, a bot unleashed by Falk to fool people into thinking they’re dealing with a real person. Mrs Bates – the bitter, angry and resentful persona that occasionally lashes out – is real, presumably Falk himself, unable to let the upbeat-to-the-point-of-being-deranged program do all of the talking. Falk called me a jerk for accusing him of being a bot when actually, he was just being “unswervingly polite”. In the end, he had to admit that I was right and that he was using a bot. Ethics is Data Protection’s flavour of the month, and I’m not sure that such duplicitous behaviour will fit in.

Despite the fact that Yo-Da hasn’t launched yet, the website mysteriously features testimonials from happy users, while one of the three case studies showing how the service works through the experiences of happy Yo-Da customers actually features Falk himself. Falk wants to charge people to use their DP rights. Somewhere in our bickering, either Falk or the bot told me that Yo-Da would be a monthly subscription based on what users can afford, but there’s no hint of that on the website. It’s the same model that Dehaye originally proposed for PersonalData.Io – just as GDPR makes personal data rights free in most cases, in come some chancers hoping to charge you for using them. And I have one last piece of evidence that when it comes to upholding data protection, in giving people “transparency into this secretive ecosystem”, Falk isn’t the champion of data rights he purports to be.

After five days of arguing and provoking whoever / whatever was running the Yo-Da account, on June 4th, I made a subject access request to the company via the Data Protection Officer’s email address on the Yo-Da website (i.e. the specific address they direct you to make SARs to). I explicitly ruled out any personal data processed on the public Twitter account – that is available to me already and besides, I’ve already seen it. I wanted to see any direct messages, emails or other correspondence generated by my spat with Falk and his bot. Of course, there may not be any data at all. It’s quite possible that Falk didn’t talk to anyone about me or what I was saying, but he could have done. Several times, I questioned the fundamentals of Falk’s interpretation and I also asked whether Trilateral Research, the consultancy he has engaged to be Yo-Da’s DPO, agreed with his views. I wouldn’t be surprised if Falk contacted them about what I was saying, or just complained to his colleagues about what a jerk I was.

However you slice it, the deadline for compliance has passed, and Yo-Da has not responded to my request. I have received no data, no confirmation that data is not held, no request for ID, not even an acknowledgement. Nothing, nada, zip. Benjamin Falk proclaims that he seeks to land a knock-out blow for data subjects through the use of the GDPR rights, but the vehicle for this glorious revolution can’t even be arsed to answer a simple SAR. I wondered before why Trilateral wanted to be associated with Falk’s hyperbolic nonsense, but now he has coupled it with contempt for the law he claims to defend, I wonder if they’ll think again? In any case, everyone who receives one of Yo-Da’s SARs when the service launches knows what they can do.

Ignore it, you can.

 

The Curse of the Padlock

One of the dangers of working in Data Protection is the risk of becoming a pedant. Precision matters; court cases have turned on the meaning of individual words like ‘likely’ and ‘distress’. The legislation is a maze of definitions and concepts that the competent practitioner needs to get to grips with. Lazy thinking can be revealed by an inability to get the details right, so it’s possible to become obsessed with the detail. Even the BCS Data Protection exam has a question which requires you to list the elements of the definition of consent in the right order. It’s easy to lapse into pedantry, to point out every wrongly quoted article, every jumbled phrase.

Nevertheless, getting a simple thing right is often important. GDPR does not cover ‘personally identifiable information’; it covers ‘personal data’, and the two definitions are not the same. A person who talks about PII in the context of European Data Protection is starting in the wrong place (the US), and can make mistakes as a result. Another error that seems to be creeping in all over the place is more profound, and risks entrenching one of the biggest misconceptions about how data protection works, a misconception many of us have spent years trying to break down.

The problem is the phrase ‘data privacy’.

I see it everywhere – on LinkedIn naturally, in news coverage of the sector, and predictably, the ICO has fallen for it. They describe themselves as “The UK’s independent authority set up to uphold information rights in the public interest, promoting openness by public bodies and data privacy for individuals.” Look at the Data Privacy Advisory Service, who summarise their services as “At DPAS we help organisations safeguard the fundamental human right to have data kept private by putting in place the best possible protection to keep it secure. This is delivered in line with the General Data Protection Regulation (GDPR) and The Data Protection Act 2018.”

The idea is nonsense. It doesn’t exist. There is no right to data privacy – there is certainly no fundamental right ‘to have data kept private’. This isn’t a snide dig at someone quoting the wrong article. The concept of ‘data privacy’ is a complete misunderstanding of what Data Protection is for, and everyone who promotes it is actively thwarting the efforts of the rest of us to implement data protection in a practical way.

Article 8 of the European Convention on Human Rights says: “Everyone has the right to respect for his private and family life, his home and his correspondence”. This right is not absolute; it can be interfered with (only when necessary) in the interests of “national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others”. The right is not just about data – it certainly can be, as is evidenced by cases where celebrities and others use the privacy right to prevent the publication of intrusive images. But the right to privacy doesn’t have to be about data at all – you can breach a person’s right to privacy by simply observing them, by being in a place where they expect privacy, or by denying them the opportunity to do something privately. Data doesn’t have to come into it.

Clearly, if you did a Venn diagram, there would be circumstances where privacy and data protection overlap. By following the Data Protection principles when you handle a person’s private correspondence for example, you probably also do what’s necessary to protect their privacy. The same is true for confidentiality – not all confidential data is personal data, but a decent stab at the principles will probably respect both. There is, however, a significant portion of the Venn diagram where Data Protection and Privacy do not meet, and the DP part of that is important.

The notion of ‘Data Privacy’ obscures two vital elements of Data Protection. First, data protection is not only about private data. It covers all personal data: private, secret, and public. For years, I have been banging my head against the brick wall of ‘it’s not personal data, it’s in the public domain’. It has long been a struggle to explain that photographs, email addresses and other publicly available data are still personal data – merely more available and easier to use than some other data. There was a chink of light in Article 14 of the GDPR, which clearly states that a person should be informed even when their data is accessed from ‘publicly accessible sources’. This explicit recognition that public data is still personal data is very helpful, but the notion that ‘data protection’ and ‘data privacy’ are interchangeable muddies the waters again.

Second, in related news, GDPR is not about keeping data private; it is about ensuring that personal data processing is properly regulated. For years, Data Protection has been plagued by the padlock. The Information Commissioner used it as a logo (‘but the padlock is unlocked’ is a defence that umpteen different ICO folk have used when I complained about it), and when I did a Google image search for ‘Data Protection’ today, this is the top set of results:

[Screenshot: Google image search results for ‘Data Protection’, dominated by padlock imagery]

The problem with the Data Protection Padlock is that it presents the legislation as something that locks data up, keeps it away from people. This understanding of data protection leads directly to the belief that disclosure of personal data is inherently problematic and exceptional, and that belief is toxic. I’m not persuaded that Victoria Climbié or Peter Connelly died solely because data about them wasn’t shared, but the pervasive fear of data sharing didn’t help. The GDPR says that ‘the protection of natural persons in relation to the processing of personal data is a fundamental right’. The word ‘privacy’ isn’t mentioned anywhere beyond a reference in a footnote to the ePrivacy Directive, and the processing of personal data is firmly put in the context of operating the EU’s internal market: “This regulation is intended to contribute to the accomplishment of an area of freedom, security and justice, and of an economic union”.

You can’t achieve the economic union by locking all the data away, by keeping it private. To characterise data protection law as being about ‘data privacy’ is to misrepresent its purpose completely. European Data Protection is a compromise – trade is underpinned by the use, even the exploitation of personal data, but people have rights, they have control over their data in some (but not all) circumstances, and the legislation is built on foundations of transparency and fairness, not privacy. Arguably, the GDPR tries to even up the power imbalance in some circumstances, but it is not designed to lock up data and keep it private.

Of course, some people might be using ‘privacy’ as a synonym for ‘secure’ – the DPAS statement above seems to elide the two. Only a fool would want to play down the importance of security in the context of using any personal data, but the reduction of Data Protection solely to security is as destructive to a proper understanding of it as the privacy / protection mess. We’ve managed to drag Data Protection out of the IT department, and we need to stamp on this idea that security is the exemplar of good DP practice. Your data can be private and secure, but kept for no good reason, for too long, in an inaccurate state, and there could be too much of it.

Some personal data is private and should remain so. In many situations, the processing of personal data without an eye on people’s legitimate expectations of privacy, especially when monitoring, watching or listening to them, is likely to be unfair and so unlawful. There is a strong link between Data Protection and Privacy, and any attempt to divorce them would be stupid. But the use of ‘data privacy’ as a synonym for data protection is misleading and dangerous – it perpetuates a fundamental misreading of what the legislation is for, and makes the lives of everyone trying to make GDPR work effectively a thousand times harder. It’s time to take this nonsense, lock it up and throw away the key.

A cure for blindness

The first time I read the GDPR properly, something leapt out at me. For years, the received wisdom about the subject access and other rights provided by the legislation was that they were ‘applicant blind’. You could ask the person for assistance in locating their data, but you could not ask them why they were asking. Even if you knew that the person wanted to wind you up, you had to ignore that. When I got to the GDPR articles about subject rights, it struck me that this was no longer the case.

The relevant text in the final version (Article 12.5) is as follows:

Where requests from a data subject are manifestly unfounded or excessive, in particular because of their repetitive character, the controller may either:

(a)  charge a reasonable fee taking into account the administrative costs of providing the information or communication or taking the action requested; or

(b)  refuse to act on the request.

Looking at the foundation, the basis on which the request has been made, opens the door to the applicant’s motive. An unfounded request is one for which there is no legitimate basis, a request which is unwarranted. In many cases, you cannot come to a conclusion that a request is either ‘unfounded’ or ‘excessive’ without looking at the person, why they have asked and what they intend to do with the data. The word ‘manifestly’ places a high threshold – it must very obviously be the case that the request is unfounded – but nevertheless, the words are there, and they must be there to allow the controller to refuse in some circumstances. If I’m wrong, tell me what those words are there for.

Believing that GDPR allows controllers to refuse requests because of the motives of the applicant often gets me into disagreements with other DP professionals. Perhaps because the ‘applicant blind’ idea is so basic to some people’s understanding of how Data Protection works, or because they disapprove of the idea, a lot of people disagree. Last year, a controversy started when anti-abortion campaigners in Dublin filmed pro-choice demonstrators, and someone on Twitter provided a template SAR for pro-choice people to use. The idea was to (in one Tweeter’s words) ‘swamp’ the anti-abortion campaign with SARs, even to show up and get yourself filmed solely so that you could make a SAR. More recently, pro-Remain campaigners, angry that they are receiving entirely legal election literature from the Brexit Party, suggested making SARs to the party to find out where their data had been sourced from. Virtually every time I pointed out that the data would have come from the electoral register, rendering the SAR pointless, they said they would do it anyway to annoy the Brexit Party and waste their time.

I support the idea of abortion without any hesitation, and I commend those who campaign in favour of the right to abortion. I am also what you might call a Hard Remainer – I wish we weren’t leaving the EU, and when we do, I would support a campaign to go back in on a Full Schengen, Join the Euro platform, partly because I think these things are good on balance, and partly because it would annoy people who voted Leave. Nevertheless, I think the anti-abortion campaign were perfectly within their rights to refuse SARs where they could identify a person’s Twitter comments saying that they intended to do a SAR to waste their time, and if the Brexit Party do the same now, I believe that this would be justified. I think GDPR allows for refusals of requests that are made for reasons other than concerns about personal data.

And if you don’t agree with me, you don’t agree with the Information Commissioner either.

For years, the failed FOI campaigner Alan Dransfield has been sending angry emails and complaints to various people at the Information Commissioner’s Office, usually late at night. I know this because as well as copying in various journalists, news organisations, and politicians, he also includes me. It’s hard to know what Dransfield hopes to achieve with these screeds, which blend an aggressive misreading of how the law works, defamatory accusations against ICO staff and RANDOM words in CAPITALS. Usually these emails come out of nowhere, but his most recent missive was in response to an email from the Information Commissioner, refusing to answer a subject access request he had made to them.

If you ever wanted an extreme case to test the limits of what is acceptable, it’s Dransfield. The ICO’s refusal says that since April 2016, Dransfield has sent them over 120 requests for information under the Data Protection Act 2018 (DPA 2018), the Freedom of Information Act 2000 (FOIA) and Environmental Information Regulations 2004 (EIR). In addition, the email contains this remarkable statement:

since May 2018 we have received in excess of 290 items of correspondence from you. Many of these communications have included unsubstantiated accusations of the ICO’s complicity in various crimes and have targeted members of ICO staff with the intention of causing distress

The ICO refusal points out that having previously refused his FOI and EIR requests as vexatious, they are now no longer even acknowledging them because they are about matters which have been dealt with (something which FOI plainly allows). They then go on to say this:

Your requests for information under Article 15 of the GDPR appear to be similarly motivated. We consider that these requests are not made to legitimately establish what information we hold and how we are handling your personal data, but part of a campaign to challenge the decisions that have already been concluded within due process

As well as copying me into his legally illiterate complaints, Dransfield sometimes emails me direct to call me a dickhead or spew out misogynistic and homophobic abuse, but it’s clear that ICO staff have it much worse than me. He’s a toxic character who thrives on causing discomfort and outrage. You might say that if ‘unfounded’ works on him, it’s only because he’s such an extreme case. But Dransfield is not alone. There are other vexatious, unpleasant people whose SARs will be made in the same vein of perpetuating a complaint or a campaign. Most importantly, look at the basis of the ICO’s refusal: we’re saying no because we don’t think you’re making this request for the right reasons. The ICO believes that an unfounded request is one made for the ‘wrong’ reasons.

Assuming this is correct (and obviously this is a rare case where I think the ICO has got it right), the next question is how far this goes. For years, the UK courts argued that using SARs to pursue litigation was an abuse of process – is that use of a SAR unfounded? I think that weaponised political SARs are unfounded, and even if you disagree, I don’t think you can tell me that it’s impossible. The net result of Dransfield’s adventures in FOI was establishing a principle that has been used to refuse many requests as vexatious – exactly the opposite of what he wanted. His campaign against the Commissioner may, ironically, have the same effect in GDPR.

The ICO rejects SARs they believe have been made for the wrong reasons. If they do this for themselves, there have to be circumstances where they will agree when other controllers do this. Pandora’s Box has been opened. Controllers who are dealing with vexatious applicants or orchestrated campaigns should think very seriously about whether denying a person their subject access right is an acceptable thing to do, but they should do so in the knowledge that the UK’s Data Protection regulator has already done it.

 

Home, James

A few months ago, I wrote a blog about data protection and nonsense, highlighting inaccurate claims made by training companies, marketers and pressure groups. A bad-tempered spat ensued in the comments on LinkedIn between me and Russell James, the marketer behind the lobbying attempt to change the ICO’s funding model to include cost recovery. James insisted that it didn’t matter that a letter sent by four MPs to the DCMS asking for the change, apparently at his instigation, contained inaccurate claims (the description of DP breaches as ‘crimes’) and embarrassingly got the name of the Information Commissioner wrong (it’s the Independent Commissioner of Information, according to the distinguished Parliamentarians, or whoever actually wrote it).

I asked James what the Information Commissioner’s Office themselves thought of his plan to allow the ICO to recoup the costs of investigations from those “found guilty of data crimes” (which I think means those who are on the receiving end of enforcement from Wilmslow, although it’s hard to be 100% certain). The idea that someone would persuade MPs to lobby the ICO’s sponsor department to change their funding mechanism without at least the tacit approval of the Commissioner or her staff seemed ridiculous, but the normally prolix Mr James was silent on the matter. So I decided to ask the Information Commissioner.

I made an FOI request including all of the following information:
1) Any recorded information about approaches made by Russell James or others to the ICO about the idea of the ICO adopting a cost-recovery model, including any correspondence with Mr James or his associates.
2) Any responses provided to James or others about the ICO adopting a cost-recovery model.
3) Any correspondence with Tom Tugendhat, Yvette Cooper, Dominic Grieve or Damian Collins, or their staff about the idea of a cost-recovery model, or the letter sent to the DCMS
4) Any internal discussion of the cost-recovery model.
5) Any correspondence, notes of meetings or other records of meetings between Mr James and any ICO member of staff, including the names of the staff. (this was subsequently clarified to cover only the cost recovery model, and not any other correspondence Mr James might have had with the ICO.)

Whatever the ICO made of Mr James’ ambitious plan, I was certain that this request would capture their thoughts. At worst, the ICO might refuse to disclose their internal discussions of the idea, but at least I might get some sense of the extent of them.

The ICO provided me with three paragraphs from a letter sent to them by Mr James around the time the MPs wrote to the DCMS. James told me that the ICI letter was written by the office of Tom Tugendhat, but this one was remarkably similar in tone, and showed the same lack of understanding of how the Data Protection enforcement regime works. James told the ICO that they were about to “leverage significant revenue”. Greatly increased income for the DCMS, via the huge sums paid to them in GDPR fines, would, James asserted, result in much more cash for Wilmslow. This sounds great, if it wasn’t for the fact that the ICO hasn’t issued a single penalty under the GDPR yet. More importantly, he is confused about what happens to the penalties, and how the ICO is funded. DP penalties have always been paid into the Treasury’s consolidated fund, bypassing the DCMS altogether. Moreover, the ICO doesn’t receive any funding from the DCMS for its Data Protection work. As this document (freely available on the ICO’s website) states, all the ICO’s DP work is paid for by DP fees collected from Data Controllers, as has been the case for many years. The ICO could do a CNIL-style €50 million penalty every week, and neither they nor the DCMS would see a cent of it.

James also claims in his letter that his campaign has “ministerial support from government officials”; I don’t know whether he’s claiming the support of ministers, or the support of government officials, but the phrase itself sounds like it was written by someone who doesn’t know the difference between the two. I’d ask him which it was, but I sent him a single direct message asking for comments before publishing the last blog I wrote on this issue. He ignored me, but later pretended that I had deluged him with many such messages. If Tugendhat hadn’t tweeted the ICI letter, I’d think it was fake.

Whatever the shortcomings of Mr James’ insights into Data Protection (when I told him I was making an FOI about his plan, he thought it was the same as a SAR), his confidence in the success of the James Tax is hard to fault. According to him, it is now “a short time before your department (ICO) will have a more resilient financial footing”. Given this thrilling news, one can only speculate about how excited the fine folk of the ICO would be at the impending cash bonanza.

Alas, apart from a copy of the ICI letter, which the ICO sensibly chose not to provide to me as it was plainly in the public domain, they held no data about the James Tax. None. Nothing. Nada. Indeed, they made a point of telling me: “For clarity, I can confirm that we do not hold any information which falls within the scope of the other parts of your request“.  This means that they did not have any recorded discussions about it, share the letter internally, or even reply to that part of Mr James’ letter. If anyone had anything to say about the James Tax, they didn’t want to write it down.

Mr James has set himself up as the doughty defender of “Liz and the crew” as he once described his surprisingly reticent friends in Wilmslow to me. He has launched a campaign to change the law and roped highly respectable MPs in to support it. I think it is reasonable to ask whether someone with such a misbegotten understanding of how Data Protection works is the right person to change it. Given that the ICO has seemingly offered no support, not even a comment on his plan, I assume that they do not welcome the idea. It’s not hard to imagine why – calculating the costs of an investigation is extra work and bureaucracy. Moreover, if the ICO is entitled to claim the costs of victory, surely it should be forced to foot the bill for defeat – every time the ICO’s enforcement team’s investigation results in no action, the ICO should contribute to the time the controller spent in answering the many letters and information notices for which the office is celebrated.

If a case goes to appeal, while the James Tax would presumably allow the costs of going to the Tribunal to be recouped if successful, for fairness’ sake, the same logic must apply the other way around. If the Tribunal vindicates the ICO’s target (and losses at the Tribunal are not unknown, especially in recent times), presumably the ICO would have to pay the legal bills too. There are already financial incentives and advantages for the Commissioner. If the ICO issues a financial penalty, the controller gets a 20% discount if they choose not to appeal. If a controller’s actions are truly misbegotten and they choose to appeal, the Tribunal and the courts above can award costs against the recalcitrant data controller. Any further change to the relationship should not simply tilt it in the ICO’s interests.

If the James Tax includes recouping the costs of dealing with appeals (and my arguments with him on LinkedIn suggest that it does), this will also have a negative effect on one of the most important parts of the DP enforcement system. Any controller who has been fined will, according to the James Tax, already face the added cost of the ICO’s investigation. Appealing – already a roll of the dice in many cases – will be that much more of a risk. As well as their own costs, controllers will have to factor in the additional ICO tally.

We already have Denham grumbling about appeals, even using a speech by Mark Zuckerberg about possible regulation in the US as an excuse to demand he drop his appeal against the Facebook fine in the UK. James’ ideas might further suppress the possibility of appealing against ICO decisions. For everyone involved in the sector, this would be a disaster. To borrow James’ inaccurate criminal characterisation of DP enforcement, the ICO is already the investigator, prosecutor and judge – I don’t want to strengthen that hand any more. Moreover, in the interview above, Denham signalled disdain for the concerns of ordinary people, stating that they don’t complain about the right things. As part of its analytics investigation, the ICO has enforced on cases where there have been no complaints. Denham’s ICO needs to be challenged, and challenged regularly. The tribunals and the courts frequently give detailed and helpful explanations of how the law works – the ICO never produced guidance on consent as useful as the Tribunal’s decision in Optical Express, and whether the ICO wins or loses, all sorts of insights are available in Tribunal decisions.

Nobody appeals lightly. Combine Denham’s hostility to challenge with the James Tax, and we might lose vital opportunities for debate and caselaw. You can dismiss this blog as just an opportunity for me to take the piss out of another GDPR-certified professional, but James has set himself up as a public campaigner. He wants to change how the ICO is funded and how all controllers are potentially treated. This cannot just pass without scrutiny, especially as he appears to lack both an understanding of the system he wants to change, and the support of the regulator whose powers he wants to alter. If the people arguing for changes don’t even think it’s important what the ICO is called or whether it’s a ‘department’ or not, we should wonder what other important details they have missed.

Head in the Sandbox

The Information Commissioner’s Office recently held a workshop about their proposed Regulatory Sandbox. The idea of the sandbox is that organisations can come to the ICO with new proposals in order to test out their lawfulness in a safe environment. The hoped-for outcome is that products and services that are at the same time innovative and compliant will emerge.

There is no mention of a sandbox process in the GDPR or the DPA 2018. There is a formal mechanism for controllers to consult the ICO about new ideas that carry high risk (prior consultation) but the circumstances where that happens are prescribed. It’s more about managing risk than getting headlines. Unlike Data Protection Impact Assessments, prior consultation or certification, the design and operation of the sandbox is entirely within the ICO’s control. It is important to know who is having an influence on its development, especially as the sandbox approach is not without risk.

Although Mrs Denham is not above eye-catching enforcement when it suits her, the ICO is often risk averse, and has shown little appetite for challenging business models. For example, the UK’s vibrant data broking market – which is fundamentally opaque and therefore unlawful – has rarely been challenged by Wilmslow, especially not the bigger players. They often get treated as stakeholders. The sandbox could make this worse – big organisations will come with their money-making wheezes, and it’s hard to imagine that ICO staff will want to tell them that they can’t do what they want. The sandbox could leave the ICO implicated, having approved or not prevented dodgy practices to avoid the awkwardness of saying no.

Even if you disagree with me about these risks, it’s surely a good thing that the ICO is transparent about who is having an influence on the process. So I made an FOI request to the ICO, requesting the names and companies or organisations of those who attended the meeting. As is tradition, they replied on the 20th working day to refuse to tell me. According to Wilmslow, disclosure of the attendees’ identities is exempt for four different reasons. Transparency will prejudice the ICO’s ability to carry out its regulatory functions, disclosure of the names of the attendees is a breach of data protection, revealing the names of the organisations will cause them commercial damage, and finally, the information was supplied with an expectation of confidentiality, and so disclosure will breach that duty.

These claims are outrageous. DPIAs and prior consultation exist, underpinned both by the law and by European Data Protection Board guidance. Despite the obvious benefits of developing a formal GDPR certification process (both allowing controllers to have their processing assessed, and the creation of a new industry at a time when the UK needs all the economic activity it can get), the ICO’s position on certification is supremely arrogant: “The ICO has no plans to accredit certification bodies or carry out certification at this time“. A process set out in detail in the GDPR is shunned, with the ICO choosing instead to spend huge amounts of time and money on a pet project which has no legal basis. Certification could spread expertise across the UK; the sandbox will inevitably be limited to preferred stakeholders. If they’re hiding the identities of those who show up to the workshop, it’s hard to imagine that the actual process will be any more transparent.

The ICO’s arguments about commercial prejudice under S43 of FOI are amateurish: “To disclose that a company has sent delegates to the event may in itself indicate to the wider sector and therefore potential competitors that they are in development of, or in the planning stages of a new innovative product which involves personal data“. A vital principle of FOI is that when using a prejudice-based exemption, you need to show cause and effect: disclosure would, or would be likely to, lead to the harm described. How on earth could a company lose money, or become less competitive, purely because it was revealed that they attended an ICO event (which is what using S43 means)?

The ICO’s personal data and confidentiality arguments are equally weak – everyone who attended the meeting would know the identities of everyone else, and all were acting in an official or commercial capacity. This was not a secret or private meeting about a specific project; anyone with an interest was able to apply to attend. Revealing their attendance is not unfair, and there is plainly a legitimate interest in knowing who the ICO is talking to about a project into which the office is putting significant resources, and which will have an impact on products or services that may affect millions of people. The determination to hide this basic information and avoid scrutiny of the sandbox process undermines the credibility of the project itself, and makes the ICO’s claim to be an effective defender of public sector transparency ever more hypocritical.

Worst of all, if disclosure of the attendees’ identities was the calamity for commercial sensitivity and personal data that the ICO claims it to be, there should be an immediate and thorough investigation of how the information I requested came to be revealed on the ICO’s website and Twitter account. The entire event was recorded and a promotional video was released. Several attendees (whose names and companies I cannot be given because of confidentiality, data protection and commercial prejudice) are identified and interviewed on camera, while there are numerous shots of other attendees who are clearly identifiable. Either the ICO has betrayed the confidentiality and personal data rights of these people, putting their companies at direct commercial risk, or their FOI response is a cack-handed attempt to avoid legitimate scrutiny. Either way, I strongly recommend that the left hand and the right hand in Wilmslow make some rudimentary attempts to get to know one another.

Long ago, I was one of a number of online commentators described by the ICO’s comms people as a ‘driver of negative sentiment’. More recently, one of Denham’s more dedicated apologists accused me of being one of the regulator’s “adversaries”. I’m not a fan of the ICO, and I never have been. But this stinks. The determination to throw every conceivable exemption at a simple request to know who the ICO is talking to suggests that the office is afraid of scrutiny, afraid of having to justify what they’re doing and how they’re doing it. The incompetence of refusing to give me information that is on display on their website and Twitter account shows contempt for their obligations as an FOI regulator. The ICO has its head in the sand; as we drift out of the European mainstream into a lonely future on the fringes, their secrecy and incompetence should be matters of concern for anyone who cares about Data Protection.