Going Unnoticed

Last week, I came across an interview with Elizabeth Denham on a Canadian website called The Walrus that was published in April. There are some interesting nuggets – Denham seems to out herself as a Remainer in the third paragraph (a tad awkward given that she has only enforced on the other side) and also it turns out that the Commissioner has framed pictures of herself taking on Facebook in her office. More important is the comparison she draws between her Canadian jobs and her current role: “That’s why I like being where I am now,” she says, settling herself at a boardroom table. “To actually see people prosecuted.”

Denham probably wasn’t thinking of the run of legitimate but low-key prosecutions of nosy admin staff and practice managers which her office has carried out in recent months, which means she was up to her old tricks of inaccurately using the language of crime and prosecution to describe powers that are civil (or more properly, administrative). Since GDPR came in, she’s even less likely to prosecute than before, given that she no longer has the power to do so for an ignored enforcement or information notice. I don’t know whether she genuinely doesn’t understand how her powers work or is just using the wrong words because she thinks it makes for a better quote.

Publicity certainly plays a far greater part in the ICO’s enforcement approach than it should. A few months back, I made an FOI request to the ICO asking about a variety of enforcement issues and the information I received was fascinating. The response was late (because of course it was), but it was very thorough and detailed, and what it reveals is significant.

ICO enforcement breaks down into two main types. Enforcement notices are used where the ICO wants to stop unlawful practices or otherwise put things right. Monetary penalties are a punishment for serious breaches. Occasionally, they are used together, but often the bruised organisation is willing to go along with whatever the ICO wants, or has already put things right, so an enforcement notice is superfluous. The ICO is obliged to serve a notice of intent (NOI) in advance of a final penalty notice, giving the controller the opportunity to make representations. There is no equivalent requirement for preliminary enforcement notices, but in virtually every case, the ICO serves a preliminary notice anyway, also allowing for representations.

According to my FOI response, in 2017, the ICO issued 8 preliminary enforcement notices (PENs), but only 4 were followed up by a final enforcement notice; in 2018, 5 PENs were issued, and only 3 resulted in a final notice. The ratio of NOIs to final penalties is much closer; in 2017, there were 19 NOIs, and only one was not followed up with a penalty. In 2018, 21 NOIs were issued, 20 of which resulted in a penalty. Nevertheless, the PEN / NOI stage is clearly meaningful. In multiple cases, whatever the controller said stopped the intended enforcement in its tracks. Given the confusion among many GDPR 'experts' about whether fines are real or merely proposed, the fact that not every NOI results in a fine is worth noting.
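If you want the follow-through rates spelled out, here is a minimal sketch (illustrative Python, using only the figures from the FOI response quoted above) that does the arithmetic:

```python
# Figures from the ICO's FOI response described above:
# (notices issued at the preliminary stage, final notices/penalties that followed)
stages = {
    "Preliminary enforcement notices 2017": (8, 4),
    "Preliminary enforcement notices 2018": (5, 3),
    "Notices of intent 2017": (19, 18),
    "Notices of intent 2018": (21, 20),
}

for stage, (issued, followed_up) in stages.items():
    rate = followed_up / issued * 100
    print(f"{stage}: {followed_up} of {issued} followed up ({rate:.0f}%)")
```

Roughly half of PENs became final enforcement notices, while around 95% of NOIs became penalties – close, but not automatic.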

The response shows the risks of neglecting to issue a PEN. In July 2018, the ICO issued Aggregate IQ (AKA AIQ) with the first GDPR enforcement notice (indeed, it was the first GDPR enforcement action altogether). My FOI reveals that it was one of only a few cases where a preliminary notice was not issued. The AIQ EN was unenforceable, ordering them to cease processing any personal data about any UK or EU "citizens" obtained from UK political organisations "or otherwise for the purposes of data analytics, political campaigning or any other advertising purposes". AIQ was forbidden from ever holding personal data about any EU citizen for any advertising purpose, even if that purpose was entirely lawful, and despite the fact that the GDPR applies to residents, not citizens. AIQ appealed, but before that appeal could be heard, the ICO capitulated and replaced the notice with one that required AIQ to delete a specific dataset, and only after the conclusion of an investigation in Canada. It cannot be a coincidence that this badly written notice was published as part of the launch of the ICO's first report into Data Analytics. It seems that the ICO rushed it, ignoring the normal procedure, so that the Commissioner had things to announce.

The ICO confirmed to me that it hasn't served a penalty without an NOI, which is as it should be, but the importance of the NOI stage is underlined by another case announced with the first AIQ EN. The ICO issued a £500,000 penalty against Facebook, except that what was announced in July 2018 was the NOI, rather than the final penalty. Between July and October, the ICO would have received representations from Facebook, and as a result, the story in the final penalty was changed. The NOI claims that a million UK Facebook users' data was passed to Cambridge Analytica and SCL among others for political purposes, but the final notice acknowledges that the ICO has no evidence that any UK users' data was used for campaigning. As an aside, this means that the ICO has no evidence Cambridge Analytica used Facebook data in the Brexit referendum. The final notice is based on a hypothetical yarn about the risk of a US visitor's data being processed while passing through the UK, and an assertion that even though UK Facebook users' data wasn't abused for political purposes (the risk did not "eventuate"), it could have been, so there. I've spent years emphasising that an incident isn't the same as a breach, but going for the maximum penalty on something that didn't happen, having said previously that it did, is perhaps the wrong time to listen to me.

If you haven't read the final Facebook notice, you really should. The ICO's argument is that UK users' data could have been abused for political purposes even though it wasn't, and the mere possibility would cause people substantial distress. I find this hard to swallow. I suspect the ICO felt they had effectively announced the £500,000 penalty; most journalists reported the NOI as such. Despite Facebook's representations pulling the rug out from under the NOI, I guess that the ICO couldn't back down. There had to be a £500,000 penalty, so they worked backwards from there. The Commissioner now faces an appeal on a thin premise, as well as accusations from Facebook that Denham was biased when making her decision.

Had the NOI not been published (like virtually every other NOI for the past ten years), the pressure of headlines would have been absent. Facebook have already made the not unreasonable point in the Tribunal that as the final penalty has a different premise from the NOI, the process is unfair. Without a public NOI, Facebook could have put this to the ICO behind closed doors, and an amended NOI could have been issued with no loss of face. If Facebook's representations were sufficiently robust, the case could have been dropped altogether, as happened in other cases in both 2017 and 2018. But for the sake of a few days' headlines, Denham would not now be facing the possibility of a career-defining humiliation at the hands of Facebook of all people, maybe even having to pay their costs. It's not like there aren't a dozen legitimate cases to be made against Facebook's handling of personal data, but this is the hill the ICO has chosen to die on. Maybe I'm wrong and Facebook will lose their appeal, but imagine if they win and this farrago helps them to get there.

The other revelation in my FOI response is an area of enforcement that the ICO does not want to publicise at all. In 2016, the ICO issued a penalty on an unnamed historical society, and in 2017, another was served on an unnamed barrister. I know this because the ICO published the details, publicly confirming the nature of the breach, the amount of the penalty and the type of organisation. One might argue that they set a precedent in doing so. What I didn't know until this FOI request is that there have been a further 3 secret monetary penalties, 1 in 2017 and 2 in 2018. The details have not been published, and the ICO refused to give me any information about them now.

The exemptions cited set out the ICO's concerns. They claim that it might be possible for me to identify individual data subjects, even though both the barrister and historical society breaches involved very small numbers of people and were still published. They also claim that disclosure will prejudice their ability to enforce Data Protection law, using this justification:

“We are relying on this exemption to withhold information from you where the disclosure of that information is held for an ongoing regulatory process (so, we are yet to complete our regulatory process and our intentions could still be affected by the actions of a data controller) or the information is held in relation to sensitive matters and its disclosure would adversely affect relationships which we need to maintain with the organisations involved. It is essential that organisations continue to engage with us in a constructive and collaborative way without fear that the information they provide to us will be made public prematurely, or at a later date, if it is inappropriate to do so. Disclosure of the withheld information at this time would therefore be likely to prejudice our ability to effectively carry out our regulatory function”

The ICO routinely releases the names of data controllers she has served monetary penalties and enforcement notices on without any fears about the damage to their relationship. Just last week, she was expressing how “deeply concerned” she is about the use of facial recognition by the private sector, despite being at the very beginning of her enquiries into one such company. And if maintaining working relationships at the expense of transparency is such a vital principle, how can they justify the publication of the Facebook NOI for no more lofty reason than to sex up the release of the analytics report? They say “It is essential that organisations continue to engage with us in a constructive and collaborative way without fear that the information they provide to us will be made public prematurely”, and yet the Facebook NOI was published prematurely despite the fact that it was a dud. What will that have done to the ICO’s relationship with a controller as influential and significant as Facebook? What incentive do FB have to work with Wilmslow in a constructive and collaborative way now? And if identifying the subjects is an issue, what is to stop the ICO from saying ‘we fined X organisation £100,000’ but refusing to say why, or alternatively, describing the incident but anonymising the controller?

It doesn’t make sense to publicise enforcement when it’s not finished, and it doesn’t make sense to keep it secret when it’s done. Every controller that has been named and shamed by the ICO should be demanding to know why these penalties have been kept secret, while Facebook have every right to demand that the Commissioner account for the perverse and ill-judged way in which she took action against them. Meanwhile, we should all ask why the information rights regulator is in such a mess.

And one final question: did she bring the framed pictures with her or did we pay to get them done?

The Curse of the Padlock

One of the dangers of working in Data Protection is the risk of becoming a pedant. Precision matters; court cases have turned on the meaning of individual words like ‘likely’ and ‘distress’. The legislation is a maze of definitions and concepts that the competent practitioner needs to get to grips with. Lazy thinking can be revealed by an inability to get the details right, so it’s possible to become obsessed with the detail. Even the BCS Data Protection exam has a question which requires you to list the elements of the definition of consent in the right order. It’s easy to lapse into pedantry, to point out every wrongly quoted article, every jumbled phrase.

Nevertheless, getting a simple thing right is often important. GDPR does not cover 'personally identifiable information'; it covers 'personal data', and the two are not defined in the same way. A person who talks about PII in the context of European Data Protection is starting in the wrong place (the US), and can make mistakes as a result. Another error that seems to be creeping in all over the place is more profound, and risks entrenching one of the biggest misconceptions about how data protection works, a misconception many of us have spent years trying to break down.

The problem is the phrase ‘data privacy’.

I see it everywhere – on LinkedIn naturally, in news coverage of the sector, and predictably, the ICO has fallen for it. They describe themselves as “The UK’s independent authority set up to uphold information rights in the public interest, promoting openness by public bodies and data privacy for individuals.” Look at the Data Privacy Advisory Service, who summarise their services as “At DPAS we help organisations safeguard the fundamental human right to have data kept private by putting in place the best possible protection to keep it secure. This is delivered in line with the General Data Protection Regulation (GDPR) and The Data Protection Act 2018.”

The idea is nonsense. It doesn’t exist. There is no right to data privacy – there is certainly no fundamental right ‘to have data kept private’. This isn’t a snide dig at someone quoting the wrong article. The concept of ‘data privacy’ is a complete misunderstanding of what Data Protection is for, and everyone who promotes it is actively thwarting the efforts of the rest of us to implement data protection in a practical way.

Article 8 of the European Convention on Human Rights says: "Everyone has the right to respect for his private and family life, his home and his correspondence". This right is not absolute; it can be interfered with (only when necessary) in the interests of "national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others". The right is not just about data – it certainly can be, as is evidenced by cases where celebrities and others have used it to prevent the publication of intrusive images. But the right to privacy doesn't have to be about data at all – you can breach a person's right to privacy by simply observing them, by being in a place where they expect privacy, or by denying them the opportunity to do something privately. Data doesn't have to come into it.

Clearly, if you did a Venn diagram, there would be circumstances where privacy and data protection overlap. By following the Data Protection principles when you handle a person’s private correspondence for example, you probably also do what’s necessary to protect their privacy. The same is true for confidentiality – not all confidential data is personal data, but a decent stab at the principles will probably respect both. There is, however, a significant portion of the Venn diagram where Data Protection and Privacy do not meet, and the DP part of that is important.

The notion of 'Data Privacy' obscures two vital elements of Data Protection. First, data protection is not only about private data. It covers all personal data: private, secret and public. For years, I have been banging my head against the brick wall of 'it's not personal data, it's in the public domain'. Trying to explain to people that photographs, email addresses and other publicly available data are still personal data – just easier to find and use than some other data – has long been a struggle. There was a chink of light in Article 14 of the GDPR, which clearly states that a person should be informed even when their data is accessed from 'publicly accessible sources'. This explicit recognition that public data is still personal data is very helpful, but the notion that 'data protection' and 'data privacy' are interchangeable muddies the waters again.

Second, in related news, GDPR is not about keeping data private; it is about ensuring that personal data processing is properly regulated. For years, Data Protection has been plagued by the padlock. The Information Commissioner used it as a logo (‘but the padlock is unlocked’ is a defence that umpteen different ICO folk have used when I complained about it), and when I did a Google image search for ‘Data Protection’ today, this is the top set of results:

[Screenshot: the top Google image search results for 'Data Protection', dominated by padlocks]

The problem with the Data Protection Padlock is that it presents the legislation as something that locks data up, keeps it away from people. This understanding of data protection leads directly to the belief that disclosure of personal data is inherently problematic and exceptional, and that belief is toxic. I’m not persuaded that Victoria Climbie or Peter Connelly died solely because data about them wasn’t shared, but the pervasive fear of data sharing didn’t help. The GDPR says that ‘the protection of natural persons in relation to the processing of personal data is a fundamental right‘. The word ‘privacy‘ isn’t mentioned anywhere beyond a reference in a footnote to the ePrivacy Directive, and the processing of personal data is firmly put in the context of operating the EU’s internal market: “This regulation is intended to contribute to the accomplishment of an area of freedom, security and justice, and of an economic union“.

You can't achieve the economic union by locking all the data away, by keeping it private. To characterise data protection law as being about 'data privacy' is to misrepresent its purpose completely. European Data Protection is a compromise – trade is underpinned by the use, even the exploitation, of personal data, but people have rights, they have control over their data in some (but not all) circumstances, and the legislation is built on foundations of transparency and fairness, not privacy. Arguably, the GDPR tries to even up the power imbalance in some circumstances, but it is not designed to lock up data and keep it private.

Of course, some people might be using ‘privacy’ as a synonym for ‘secure’ – the DPAS statement above seems to elide the two. Only a fool would want to play down the importance of security in the context of using any personal data, but the reduction of Data Protection solely to security is as destructive to a proper understanding of it as the privacy / protection mess. We’ve managed to drag Data Protection out of the IT department, and we need to stamp on this idea that security is the exemplar of good DP practice. Your data can be private and secure, but kept for no good reason, for too long, in an inaccurate state, and there could be too much of it.

Some personal data is private and should remain so. In many situations, the processing of personal data without an eye on people's legitimate expectations of privacy, especially when monitoring, watching or listening to them, is likely to be unfair and so unlawful. There is a strong link between Data Protection and Privacy, and any attempt to divorce them would be stupid. But the use of 'data privacy' as a synonym for data protection is misleading and dangerous – it perpetuates a fundamental misreading of what the legislation is for, and makes the lives of everyone trying to make GDPR work effectively a thousand times harder. It's time to take this nonsense, lock it up and throw away the key.

Home, James

A few months ago, I wrote a blog about data protection and nonsense, highlighting inaccurate claims made by training companies, marketers and pressure groups. A bad tempered spat ensued in comments on LinkedIn between myself and Russell James, the marketer behind the lobbying attempt to change the ICO’s funding model to include cost recovery. James insisted that it didn’t matter that a letter sent by four MPs to the DCMS asking for the change, apparently at his instigation, contained inaccurate claims (the description of DP breaches as ‘crimes’) and embarrassingly got the name of the Information Commissioner wrong (it’s the Independent Commissioner of Information, according to the distinguished Parliamentarians, or whoever actually wrote it).

I asked James what the Information Commissioner's Office themselves thought of his plan to allow the ICO to recoup the costs of investigations from those "found guilty of data crimes" (which I think means those who are on the receiving end of enforcement from Wilmslow, although it's hard to be 100% certain). The idea that someone would persuade MPs to lobby the ICO's sponsor department to change their funding mechanism without at least the tacit approval of the Commissioner or her staff seemed ridiculous, but the normally prolix Mr James was silent on the matter. So I decided to ask the Information Commissioner.

I made an FOI request asking for all of the following information:
1) Any recorded information about approaches made by Russell James or others to the ICO about the idea of the ICO adopting a cost-recovery model, including any correspondence with Mr James or his associates.
2) Any responses provided to James or others about the ICO adopting a cost-recovery model.
3) Any correspondence with Tom Tugendhat, Yvette Cooper, Dominic Grieve or Damian Collins, or their staff about the idea of a cost-recovery model, or the letter sent to the DCMS
4) Any internal discussion of the cost-recovery model.
5) Any correspondence, notes of meetings or other records of meetings between Mr James and any ICO member of staff, including the names of the staff. (this was subsequently clarified to cover only the cost recovery model, and not any other correspondence Mr James might have had with the ICO.)

Whatever the ICO made of Mr James’ ambitious plan, I was certain that this request would capture their thoughts. At worst, the ICO might refuse to disclose their internal discussions of the idea, but at least I might get some sense of the extent of them.

The ICO provided me with three paragraphs from a letter sent to them by Mr James around the time the MPs wrote to the DCMS. James told me that the ICI letter was written by the office of Tom Tugendhat, but this one was remarkably similar in tone, and had the same lack of understanding of how the Data Protection enforcement regime works. James told the ICO that they were about to "leverage significant revenue". Greatly increased income for the DCMS via the huge sums paid in GDPR fines would, James asserted, result in much more cash for Wilmslow. This would sound great, were it not for the fact that the ICO hasn't issued a single penalty under the GDPR yet. More importantly, he is confused about what happens to the penalties, and how the ICO is funded. DP penalties have always been paid into the Treasury's consolidated fund, bypassing the DCMS altogether. Moreover, the ICO doesn't receive any funding from the DCMS for its Data Protection work. As this document (freely available on the ICO's website) states, all the ICO's DP work is paid for by DP fees collected from Data Controllers, as has been the case for many years. The ICO could do a CNIL-style €50 million penalty every week, and neither they nor the DCMS would see a cent of it.

James also claims in his letter that his campaign has "ministerial support from government officials"; I don't know whether he's claiming the support of ministers, or the support of government officials, but the phrase itself sounds like it was written by someone who doesn't know the difference between the two. I'd ask him which it was, but I sent him a single direct message asking for comments before publishing the last blog I wrote on this issue. He ignored me, but later pretended that I had deluged him with many such messages. If Tugendhat hadn't tweeted the ICI letter, I'd think it was fake.

Whatever the shortcomings of Mr James' insights into Data Protection (when I told him I was making an FOI about his plan, he thought it was the same as a SAR), his confidence in the success of the James Tax is hard to fault. According to him, it is now "a short time before your department (ICO) will have a more resilient financial footing". Given this thrilling news, one can only speculate about how excited the fine folk of the ICO would be at the impending cash bonanza.

Alas, apart from a copy of the ICI letter, which the ICO sensibly chose not to provide to me as it was plainly in the public domain, they held no data about the James Tax. None. Nothing. Nada. Indeed, they made a point of telling me: “For clarity, I can confirm that we do not hold any information which falls within the scope of the other parts of your request“.  This means that they did not have any recorded discussions about it, share the letter internally, or even reply to that part of Mr James’ letter. If anyone had anything to say about the James Tax, they didn’t want to write it down.

Mr James has set himself up as the doughty defender of "Liz and the crew", as he once described his surprisingly reticent friends in Wilmslow to me. He has launched a campaign to change the law and roped four highly respectable MPs in to support it. I think it is reasonable to ask whether someone with such a misbegotten understanding of how Data Protection works is the right person to change it. Given that the ICO has seemingly offered no support, not even a comment on his plan, I assume that they do not welcome the idea. It's not hard to imagine why – calculating the costs of an investigation is extra work and bureaucracy. Moreover, if the ICO is entitled to claim the costs of victory, surely it should be forced to foot the bill for defeat – every time the ICO's enforcement team's investigation results in no action, the ICO should contribute towards the cost of the time the controller spent answering the many letters and information notices for which the office is celebrated.

If a case goes to appeal, the James Tax would presumably allow the ICO to recoup the costs of going to the Tribunal if it wins; for fairness' sake, the same logic must apply the other way around. If the Tribunal vindicates the ICO's target (and losses at the Tribunal are not unknown, especially in recent times), presumably the ICO would have to pay the legal bills too. There are already financial incentives and advantages for the Commissioner. If the ICO issues a financial penalty, the controller gets a 20% discount if they choose not to appeal. If a controller's actions are truly misbegotten and they choose to appeal, the Tribunal and the courts above can award costs against the recalcitrant data controller. Any further change to the relationship in the ICO's favour should not be a one-way street.

If the James Tax includes recouping the costs of dealing with appeals (and my arguments with him on LinkedIn suggest that it does), this will also have a negative effect on one of the most important parts of the DP enforcement system. Any controller who has been fined will, according to the James Tax, already face the added cost of the ICO's investigation. Appealing – already a roll of the dice in many cases – will be that much more of a risk. As well as their own costs, controllers will have to factor in the additional ICO tally.

We already have Denham grumbling about appeals, even using a speech by Mark Zuckerberg about possible regulation in the US as an excuse to demand he drops his appeal against the Facebook fine in the UK. James’ ideas might further suppress the possibility of appealing against ICO decisions. For everyone involved in the sector, this would be a disaster. To borrow James’ inaccurate criminal characterisation of DP enforcement, the ICO is already the investigator, prosecutor and judge – I don’t want to strengthen that hand any more. Moreover, in the interview above, Denham signalled disdain for the concerns of ordinary people, stating that they don’t complain about the right things. As part of its analytics investigation, the ICO has enforced on cases where there have been no complaints. Denham’s ICO need to be challenged, and challenged regularly. The tribunals and the courts frequently give detailed and helpful explanations of how the law works – ICO never produced guidance on consent as useful as the Tribunal’s decision in Optical Express, and whether the ICO wins or loses, all sorts of insights are available in Tribunal decisions.

Nobody appeals lightly. Combine Denham’s hostility to challenge with the James Tax, and we might lose vital opportunities for debate and caselaw. You can dismiss this blog as just an opportunity for me to take the piss out of another GDPR certified professional, but James has set himself up as a public campaigner. He wants to change how the ICO is funded and how all controllers are potentially treated. This cannot just pass without scrutiny, especially as he appears to lack both an understanding of the system he wants to change, and the support of the regulator whose powers he wants to alter. If the people arguing for changes don’t even think it’s important what the ICO is called or whether it’s a ‘department’ or not, we should wonder what other important details they have missed.

Head in the Sandbox

The Information Commissioner’s Office recently held a workshop about their proposed Regulatory Sandbox. The idea of the sandbox is that organisations can come to the ICO with new proposals in order to test out their lawfulness in a safe environment. The hoped-for outcome is that products and services that are at the same time innovative and compliant will emerge.

There is no mention of a sandbox process in the GDPR or the DPA 2018. There is a formal mechanism for controllers to consult the ICO about new ideas that carry high risk (prior consultation), but the circumstances where that happens are prescribed. It's more about managing risk than getting headlines. Unlike Data Protection Impact Assessments, prior consultation or certification, the design and operation of the sandbox is entirely within the ICO's control. It is important to know who is having an influence on its development, especially as the sandbox approach is not without risk.

Although Mrs Denham is not above eye-catching enforcement when it suits her, the ICO is often risk averse, and has shown little appetite for challenging business models. For example, the UK’s vibrant data broking market – which is fundamentally opaque and therefore unlawful – has rarely been challenged by Wilmslow, especially not the bigger players. They often get treated as stakeholders. The sandbox could make this worse – big organisations will come with their money-making wheezes, and it’s hard to imagine that ICO staff will want to tell them that they can’t do what they want. The sandbox could leave the ICO implicated, having approved or not prevented dodgy practices to avoid the awkwardness of saying no.

Even if you disagree with me about these risks, it’s surely a good thing that the ICO is transparent about who is having an influence on the process. So I made an FOI request to the ICO, requesting the names and companies or organisations of those who attended the meeting. As is tradition, they replied on the 20th working day to refuse to tell me. According to Wilmslow, disclosure of the attendees’ identities is exempt for four different reasons. Transparency will prejudice the ICO’s ability to carry out its regulatory functions, disclosure of the names of the attendees is a breach of data protection, revealing the names of the organisations will cause them commercial damage, and finally, the information was supplied with an expectation of confidentiality, and so disclosure will breach that duty.

These claims are outrageous. DPIAs and prior consultation exist, underpinned both by the law and by European Data Protection Board guidance. Despite the obvious benefits of developing a formal GDPR certification process (both allowing controllers to have their processing assessed, and the creation of a new industry at a time when the UK needs all the economic activity it can get), the ICO's position on certification is supremely arrogant: "The ICO has no plans to accredit certification bodies or carry out certification at this time". A process set out in detail in the GDPR is shunned, with the ICO choosing instead to spend huge amounts of time and money on a pet project which has no legal basis. Certification could spread expertise across the UK; the sandbox will inevitably be limited to preferred stakeholders. If they're hiding the identities of those who show up to the workshop, it's hard to imagine that the actual process will be any more transparent.

The ICO’s arguments about commercial prejudice under S43 of FOI are amateurish: “To disclose that a company has sent delegates to the event may in itself indicate to the wider sector and therefore potential competitors that they are in development of, or in the planning stages of a new innovative product which involves personal data“. A vital principle of FOI is that when using a prejudice-based exemption, you need to show cause and effect. Disclosure will or will be likely to lead to the harm described. How on earth could a company lose money, or become less competitive, purely because it was revealed that they attended an ICO event (which is what using S43 means)?

The ICO’s personal data and confidentiality arguments are equally weak – everyone who attended the meeting would know the identities of everyone else, and all were acting in an official or commercial capacity. This was not a secret or private meeting about a specific project; anyone with an interest was able to apply to attend. Revealing their attendance is not unfair, and there is plainly a legitimate interest in knowing who the ICO is talking to about a project into which the office is putting significant resources, and which will have an impact on products or services that may affect millions of people. The determination to hide this basic information and avoid scrutiny of the sandbox process undermines the credibility of the project itself, and makes the ICO’s claim to be an effective defender of public sector transparency ever more hypocritical.

Worst of all, if disclosure of the attendees’ identity was the calamity for commercial sensitivity and personal data that the ICO claims it to be, there should be an immediate and thorough investigation of how the information I requested came to be revealed on the ICO’s website and twitter account. The entire event was recorded and a promotional video was released. Several attendees (whose names and companies I cannot be given because of confidentiality, data protection and commercial prejudice) are identified and interviewed on camera, while there are numerous shots of other attendees who are clearly identifiable. Either the ICO has betrayed the confidentiality and personal data rights of these people, putting their companies at direct commercial risk, or their FOI response is a cack-handed attempt to avoid legitimate scrutiny. Either way, I strongly recommend that the left hand and the right hand in Wilmslow make some rudimentary attempts to get to know one another.

Long ago, I was one of a number of online commentators described by the ICO’s comms people as a ‘driver of negative sentiment’. More recently, one of Denham’s more dedicated apologists accused me of being one of the regulator’s “adversaries”. I’m not a fan of the ICO, and I never have been. But this stinks. The determination to throw every conceivable exemption at a simple request to know who the ICO is talking to suggests that the office is afraid of scrutiny, afraid of having to justify what they’re doing and how they’re doing it. The incompetence of refusing to give me information that is on display on their website and Twitter account shows contempt for their obligations as an FOI regulator. The ICO has its head in the sand; as we drift out of the European mainstream into a lonely future on the fringes, their secrecy and incompetence should be matters of concern for anyone who cares about Data Protection.

Yas Queen!

One of the features of the GDPR which is superficially similar to the old Data Protection Act but turns out to be quite different is the requirement to provide information about how personal data is being used. The word 'transparency' is an inherent part of the GDPR's first principle, whereas it was absent from the previous version. The DPA 1998 allowed data controllers to decide what information data subjects needed to know, beyond who the controller was and what purposes their data was being processed for. The GDPR has two similar but distinct lists of information that must be provided, one for where data is obtained from the subject, the other where data is obtained from somewhere else, and they dictate what must be provided in scary detail.

When I first started looking at the GDPR, it was this element that I was most sceptical about. I simply couldn’t believe that organisations would admit where they obtained data from, or how long they were going to keep it. I have an almost completed blog on the boil (stay tuned) which is about the very subject of list brokers covering up where they get personal data from and who they sell it to. So when a friend passed me the ‘Data Protection Privacy Notice for Alumni and Supporters‘ from Queen Mary (University of London), I was amazed to see a clear, transparent explanation of what data was used, for what purposes, and under what legal basis. The only problem is that some of it is bollocks, and some of it deploys an attitude to data that requires a seatbelt and a helmet.

Ironically, because it is a relatively short and easy to read document (four pages of A4 in normal font, written in human English), the nonsense leaps out at you like a chucked spear in a 1950s 3D movie. The notice asserts that for a list of purposes, the University is relying on the legal basis of 'legitimate interests'. The purposes include:

furthering Queen Mary’s educational and charitable mission (which includes fundraising and securing the support of volunteers

This is, of course, direct marketing. The notice then says:

We may pursue these legitimate interests by contacting you by telephone, email, post, text or social media.

Which would be a PECR breach. The University cannot send emails or texts to alumni without consent, but according to the policy, they can. Of course, some clever person (I have a list of names here) will come along and tell me that since students pay for their education, surely the University can rely on the soft opt-in? Well, for one thing, these are alumni, some of whom may have attended the University decades ago (and Queen Mary freely admits to tracking down ex-students using the Royal Mail’s Change of Address Service). For anyone who didn’t substantially pay for their degree, it doesn’t fly. Moreover, I’ve trained a lot of universities who were understandably squeamish about the idea that a qualification like a degree can be reduced to a mere commodity, like a dishwasher or a new set of tyres.

And there’s more.

If you are registered with the Telephone Preference Service (TPS) but have provided us with a telephone number, we will assume we have your consent to call you on this number until notified otherwise

No. For Pity’s Sake, No. Have the last three years of the world and his dog banging on incessantly about consent (often insisting wrongly that you always need it but OK) been for nothing? There is no such thing as assumed consent. There is no such thing as assumed consent. MATE, ARE YOU HAVING A LAUGH?

It seems odd that because Queen Mary have done something really well, I'm criticising them. To be clear, it's one of the clearest privacy notices I have ever seen. But it's not just the unlawful bits that stick out like Madonna's bra (happy 60th, Your Majesty). The rest of it is, to use my favourite euphemism for this kind of thing, bold. Students' personal data will be retained "in perpetuity". The data held about alumni includes "occupation, professional activities and other life achievements", "family and spouse / partner details and your relationships with other alumni, supporters and friends" and also "financial information relating to you and your family, including data and estimations around your income, assets and potential capacity to make a gift". If anyone from Queen Mary is reading this, my friend says not to get your hopes up.

The gleeful description of what data they hold is an amuse bouche to the relish with which Queen Mary describe their use of research. The fundraiser Stephen Pidgeon once told me with great vehemence that fundraisers  couldn’t possibly be frank about the techniques that they deploy. Queen Mary, on the other hand, have more or less had shirts made: “we may gather information about you from trusted publicly available sources to help us understand more about you as an individual and your ability to support the university in ways financial or otherwise“. They explicitly say that they do wealth screening in some cases, and have a long list of possible data sources including Companies House, company websites, “rich lists“, Factiva, Lexis Nexis, “general internet and press searches“, Who’s Who, Debretts People of Today and LinkedIn.

Because I banged on about it so loudly a year or so ago, I should be the first to point out that despite all the bollocks talked about the ICO banning wealth screening, the ICO's enforcement against charities did no such thing: it fined a number of high-profile charities for doing wealth screening without fair processing. Ostensibly, Queen Mary are simply doing what the ICO demanded by describing the process, but I have a sneaking suspicion that some of Our Friends in Wilmslow might be surprised to see wealth screening being carried out so enthusiastically.

To be frank, I do not believe that Queen Mary can justify processing the personal data of the spouses or family members of alumni in any circumstances, unless with consent. I think it is unfair, they do not have a legitimate interest in processing the data, and it is excessive. I think they and any institution who did the same deserve to be enforced against, or at the very least they should receive a shedload of Right to Be Forgotten Requests from mischievous family members. I am also sceptical about the depth of research that may be carried out into some alumni – it’s clear that it will only be a subset of the whole, but unless we’re talking about a handful of millionaires who might well expect this kind of thing to go on, I think this document is an inadequate way to meet the requirements of transparency. If a university is digging into a person’s background to this extent, it’s a form of processing that a person should directly know about and have a right to prevent. My friend only read this document because she’s in the business – Queen Mary should tell people if they’re subject to this level of profiling.

I know some fundraising consultants who will take issue with this and to be clear, I am not dogmatically saying that QM can’t do this. But seriously, can they do this? Is this what the brave new world of GDPR is all about? My instinct is HELL NO WITH AN AIRHORN FOR EMPHASIS but it would be hilarious if I was wrong, and the GDPR really doesn’t dent this kind of activity. I write this solely to see what other people think. Do you think this kind of thing is OK?

I don't have a dynamite conclusion to this blog. I could kiss the person who wrote this privacy notice because it's so plain and well-written, and yet the approach to consent and PECR is so misbegotten, I think whoever came up with it should be cast out into the Cursed Earth without a backwards glance. I don't believe that Queen Mary can possibly justify the amount of data that they propose to process and the purposes for which they think legitimate interests is an adequate umbrella. But at the same time, the ICO looked at precisely this kind of activity and only really complained about the lack of transparency, which isn't a problem here. All I can say for certain is that if other people are going to get the fundamentals so enthusiastically arse-about-face, and do such interesting things, I demand that they do so with the same clarity.

 

A SMALL ADVERT – if you’d like to know more about this kind of thing, I’m running courses in September and November on GDPR, marketing, how to be a DPO and other big DP issues. Some of the September courses are already full, so book now: https://2040training.co.uk/gdprcourses/