SARmaggedon Days Are Here Again (Again)

Reading my emails, a headline leapt out at me: “The hidden cost of GDPR data access requests”. It led me to BetaNews, a website that looks like it is trapped in 1998, and a story describing research into SARs commissioned by Guardum, a purveyor of subject access request handling software. A sample of 100 Data Protection Officers was consulted, and you’ll never guess what the research uncovered.

SARs, it turns out, are time-consuming and expensive. I award 10 GDPR points to the Guardum CTO for knowing that SARs weren’t introduced in 2018, but I have to take them away immediately because he goes on to claim that “There has also been a marked change in the way that lawyers are using DSARs as part of the data discovery process.” Apparently, lawyers are using SARs now. Imagine that. The article goes on to say that “Fulfilling DSARs can involve finding, compiling and redacting data in digital and paper format across multiple departments both on company networks and in the cloud”. There’s also a bit of a spoiler about whether the Pope is a Catholic.

According to Guardum, the average cost of a SAR is £4,884.53, the average DPO receives 27 SARs a month, and each one takes an average of 66 working hours to deal with. The article didn’t explain how these figures were arrived at, so I eagerly clicked the link to visit Guardum’s website for the full results. What I found was a fountain of guff. Strip out the endless bar and pie charts, and what Guardum wants to say is that 45% of the DPOs surveyed would like to automate some of the process because of a predicted landslide of SARs, provoked by angry furloughed and sacked staff.

I’m not sure about the logic of this – I can understand that everyone who loses their job will be upset and probably angry, and I’ve certainly dealt with lots of SARs related to a suspension or dismissal. But in those cases, the action taken was personal and direct – an individual was singled out by the employer for the treatment in question. I don’t see why people losing jobs in a pandemic will be so determined to send a SAR. It’s not like the reason for their predicament is a mystery.

The survey questions are opportunistic at best, and at worst seem designed to allow Guardum to paint this picture of anxious DPOs uncertain about how they’re going to handle the post-Covid-19 SARmageddon that the company is evidently desperate for. 75% of respondents are described as having difficulties dealing with SARs during the lockdown, though this actually translates as good news: 72% are coping but expect a SAR backlog when they get back to the office, while just 3% fear a ‘mountain’ of requests. The headline on one slide is that 30% anticipate a ‘massive’ increase in SARs, but the reality is that 55% expect the same as before and 15% think they’ll get fewer. 73% supposedly think that furloughed or laid-off staff will be a ‘big factor’ in the predicted increase, even though the breakdown shows that only 20% think it will be the single biggest factor. To emphasise: these are requests that haven’t happened yet. The people who say they’re coming are the ones flogging the software to deal with the problem.

So far, so what? Guardum have software to sell and a cynical pitch about Covid-19 to help sell it. Does it matter? In the grand scheme of things, no, it doesn’t. I’m probably not the only person currently experiencing a crash course in What’s Really Important. But in the micro scheme of things, bullshit deserves to be called out, especially when it’s designed to exploit a crisis that’s causing misery and death across the world. Many of the revelations in this survey are staggeringly banal – nearly 50% of respondents find tracking the data down across multiple departments to be a slog, while 63% have to search both paper and electronic records. Who with any experience in Data Protection would think it was worth pointing this out? Meanwhile, the assertions about how long a SAR takes or how much it costs are wholly unexplained. It’s meaningless to claim that the mean cost of a SAR is £4,884.53 if you don’t explain how that was calculated (inevitably, the CTO is touting this figure on LinkedIn).
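To show why the unexplained figures matter, here’s a back-of-the-envelope check on them. The survey numbers are Guardum’s; the 160-hour working month is my assumption, not theirs:

```python
# Sanity-checking Guardum's headline figures. The three survey numbers are
# theirs; the 160-hour working month is an assumed round figure for one
# full-time employee, purely for illustration.
AVG_COST_GBP = 4884.53      # claimed mean cost of a SAR
HOURS_PER_SAR = 66          # claimed working hours per SAR
SARS_PER_MONTH = 27         # claimed SARs received by the average DPO each month
FTE_HOURS_PER_MONTH = 160   # assumption: one full-time employee's monthly hours

implied_hourly_rate = AVG_COST_GBP / HOURS_PER_SAR
monthly_sar_hours = SARS_PER_MONTH * HOURS_PER_SAR
implied_ftes = monthly_sar_hours / FTE_HOURS_PER_MONTH

print(f"Implied hourly rate: £{implied_hourly_rate:.2f}")            # £74.01
print(f"Monthly SAR workload: {monthly_sar_hours} hours")            # 1782 hours
print(f"Full-time staff needed just for SARs: {implied_ftes:.1f}")   # 11.1
```

Taken at face value, the average respondent’s organisation is devoting more than eleven full-time staff to nothing but SARs. Either the figures don’t bear scrutiny, or the survey reached some very unusual organisations.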

Guardum aren’t necessarily the experts at Data Protection that they might have us believe. For one thing, despite being a UK company, both the survey results and their website exclusively refer to ‘PII’ rather than personal data. For another, one of the criteria for participating in the survey was that the DPO needed to work for a company with more than 250 employees. This was, for a time, the proposed threshold for a mandatory DPO, but despite it being dropped from the final text, some dodgy training companies and consultants didn’t notice and ran courses which highlighted the 250 figure even when it was gone. Most importantly, nearly half of the people who responded to the survey don’t know what they’re doing. The survey was purportedly targeted at DPOs, but 44% of respondents are identified as being in ‘C-level’ jobs – perhaps this is to give a veneer of seniority, but C-level jobs are precisely the senior roles that are likely to attract a conflict of interest. Guardum talked to people in the wrong jobs, and apparently didn’t realise this.

The ‘About’ page of Guardum’s website proclaims “Guardum supports privacy by design – where data privacy is engineered into your business processes during design rather than as an afterthought”, but the execution is less confident. There is a questionnaire that shows how much an organisation can save by using the Guardum product, but when you complete it, you have to fill in your name, company and email to get the results, and there’s no privacy policy or transparency information about how this information will be used. Moreover, if you try to use the contact form, clicking on the link to the terms and conditions results in ‘page not found’.

I have to declare my bias here – I don’t believe that any ‘solution’ can fully deal with the SAR response process, and I think people who tout AI gizmos that automatically redact “PII” are probably selling snake oil. Some of the SAR grind comes in finding the data, but a lot of it is about judgement – what should you redact? How much should you redact? Anyone who claims that they can replace humans when dealing with an HR, mental health or social care case is writing cheques that no product I have ever seen can cash. So when I land on a website like Guardum’s, my hackles rise and my scepticism is turned all the way up. It would be nice if, just once, I saw a product that wasn’t sold with bullshit. But not only is Guardum’s pitch heavy with management buzzwords, they’re using fear as a marketing tool. Just last week, they ran a webinar about weathering the ‘Post Pandemic DSAR Storm’.

Guardum claim that they provide “the only solution that can fully meet the DSAR challenge of responding in the tight 30-day deadline, giving you back control, time and money that are lost using other solutions”. Nowhere do they mention that you can extend the deadline by up to two months if a request is complex (and many are). But even if their claims are true, why do they need to sell their product via catastrophising? If their expertise goes back to the 1984 Act, why are they calling it PII and talking up the opinions of DPOs who are in the wrong job? Why oversell the results of their survey? Why hide the basis of the hours and cost calculations on which all of this is being flogged? And what on earth is a ‘Certified Blockchain Expert’?

The future post-Covid is an uncertain place. I find the utopianism of some commentators hard to swallow, partly because people are still dying and partly because the much-predicted end of the office will have career-changing consequences for people like me. But at least the LinkedIn prophets are trying to explore positives for themselves and others in an undeniably grim situation. The people running Guardum seem only to want to scare people into getting a demo of their software. If one is looking for positives, the fact that the ICO has waved the white flag means that no organisation needs to be unduly concerned about DP fines at the moment, and despite some of the concerns expressed in Guardum’s survey, nobody in the UK has ever been fined for not answering a SAR on time. The old advice about deleting data you don’t need and telling your managers not to slag people off in emails and texts will save you as much SAR misery as any software package, and I can give you that for free.

Blast from the past

As we all endure the lockdown and the uncertainty about when and how it might end, I have been trying to avoid thinking about the past. It’s tempting to dwell on the last time I went to the cinema (Home, Manchester – ironically, to watch ‘The Lighthouse’), the last time I went to a pub (Tweedies in Grasmere, just hours before Johnson closed them all), the last face-to-face training course I ran (lovely people, awful drive home). But thinking back to what I had, and the uncertainty about how, when and if I will get it back, doesn’t make the interminable Groundhog Days move any faster. I’d be better off just ploughing on and working out what to do next.

So it was a strange experience to be thrown backwards in time to the heady days of 2017, when the GDPR frenzy was at its height, and the world and his dog were setting up GDPR consultancies. People still make fun of the outdated nature of my company name, but I registered 2040 Training in 2008, and I’m proud of its pre-GDPR nomenclature. The list of GDPR-themed companies that are now dissolved is a melancholy roll call – goodbye GDPR Ltd, GDPR Assist (not that one), GDPR Assistance, GDPR Certification Group (got to admire their optimism), GDPR Claims, GDPR Compliance, GDPR Compliance Consulting, GDPR Compliance Consultancy, GDPR Compliance for SMEs and GDPR Consultants International (offices in New York, Paris and Peckham). You are all with the Angels now.

I was cast into this reverie by a friend who drew my attention to GDPR Legal, a relatively new GDPR company, and a few moments on their website was like climbing into a DeLorean. It was all there. The professional design, the ability to provide all possible services related to Data Protection (you can get a DPO for as little as £100 a month), and of course “qualified DPO’s (sic)”. I was disappointed that there was no mention of them being certified and nary a hint of the IBITGQ, but you can’t have everything. They still pulled out some crowdpleasers, including flatulent business speak and the obvious fact that they are trying to sell software, sometimes in the same couple of sentences: “Our service includes a comprehensive consult to help identify gaps and opportunities, a comprehensive report that includes a project plan with timelines and milestones, a cost analysis, and a schedule. We also offer a software suite that will help you get there quickly and smoothly.” Timelines and milestones, people. This is what we want.

The lack of any detail is possibly a matter for concern. The website claims that the company’s specialists have “over 50 years of experience delivering a pragmatic consulting service with qualified DPO’s and GDPR Practitioner skills”, but it is difficult to find out who any of them are. Is it 50 people with a year’s experience each? There is no ‘meet the team’ or ‘our people’ section. I might be wrong, but I don’t think there’s a single human being’s name anywhere on there. If you had all these brilliant experienced professionals, wouldn’t you want to advertise who they are? I might make fun of them, but even the folk who have blocked me on LinkedIn aren’t ashamed of saying who their consultants are. Indeed, the only name I can associate with the company (via Companies House) is the Director, a man who has no experience in Data Protection but is also director of a shedload of software and marketing companies. Any time the site needs to get into any detail, it hyperlinks to the ICO.

So far, so what? You probably think this blog is cruel. If someone wants to set up a company selling GDPR services, why do I care? Isn’t this just sour grapes at another disruptive entrant in the vibrant GDPR market?

There are two reasons why I call these people out. The first is their privacy policy. It’s not a good sign when a privacy policy page on a GDPR company’s website begins with ‘Privacy Policy coming soon’, but as it happens, immediately below is the company’s privacy policy. Well, I say it’s theirs. It’s oddly formatted, and when you click on the links that are supposed to take you to the policy’s constituent parts, you’re in fact redirected to the log-in page for GoDaddy, with whom the site was registered. All the way through, there are brackets in places where they don’t belong. It didn’t take me long to work out what was going on – I think the brackets mark the elements of the template policy that GDPR Legal used which needed to be personalised, and they’ve forgotten to remove them. 50 collective years of experience, and nobody is competent enough to write the company’s own privacy policy; they just use someone else’s template. Indeed, if you search for the first part of the policy, “Important information and who we are”, it leads you to dozens of websites using the same template, from Visit Manchester to NHS Improvement. I can’t find where it originated, but it’s an indictment of the quality of work here that they took it off the shelf and didn’t even format it properly. My Privacy Policy is smart-arsery of the first order, but at least I wrote it myself.

The other reason is worse. GDPR Legal has a blog with three posts on it. Two are bland and short, but the most recent, published just this week, is much longer and more detailed. It reads very differently from other parts of the site, and there was something about the tone and structure that was familiar to me. It didn’t take long to remember where I had seen something like this before. The blog is about GDPR and children, and this is the second paragraph:

“Because kids are less aware of the risks involved in handing over their personal data, they need greater protection when you are collecting and processing their data. Here is a guide and checklist for what you need to know about GDPR and children’s data.”

This is the first sentence of the ICO’s webpage about GDPR and children:

Children need particular protection when you are collecting and processing their personal data because they may be less aware of the risks involved.

Coincidence, you think? This is the third line:

If a business processes children’s personal data then great care and thought should be given about the need to protect them from the outset, and any systems and processes should be designed with this in mind

This is the second line of the ICO’s page:

If you process children’s personal data then you should think about the need to protect them from the outset, and design your systems and processes with this in mind

Blog, fourth para:

“Compliance with the data protection principles and in particular fairness should be central to all processing of children’s personal data.”

ICO page, third line:

“Compliance with the data protection principles and in particular fairness should be central to all your processing of children’s personal data”

They rejigged the first few elements a little, but after that, whoever was doing it evidently got bored and it’s pretty much word for word:

GDPR Legal Blog:

A business needs to have a lawful basis for processing a child’s personal data. Consent is one possible lawful basis for processing, but it is not the only option. Sometimes using an alternative basis is more appropriate and provides better protection for the child.

ICO page

You need to have a lawful basis for processing a child’s personal data. Consent is one possible lawful basis for processing, but it is not the only option. Sometimes using an alternative basis is more appropriate and provides better protection for the child.

GDPR Legal Blog

General Checklists

  • We comply with all the requirements of the GDPR, not just those specifically relating to children and included in this checklist. 
  • We design our processing with children in mind from the outset and use a data protection by design and by default approach. 
  • We make sure that our processing is fair and complies with the data protection principles. 
  • As a matter of good practice, we use DPIAs (data protection impact assessments) to help us assess and mitigate the risks to children. 
  • If our processing is likely to result in a high risk to the rights and freedom of children then we always do a DPIA. 
  • As a matter of good practice, we take children’s views into account when designing our processing.

ICO page: 

Checklists

General

  • We comply with all the requirements of the GDPR, not just those specifically relating to children and included in this checklist.
  • We design our processing with children in mind from the outset, and use a data protection by design and by default approach.
  • We make sure that our processing is fair and complies with the data protection principles.
  • As a matter of good practice, we use DPIAs to help us assess and mitigate the risks to children.
  • If our processing is likely to result in a high risk to the rights and freedom of children then we always do a DPIA.
  • As a matter of good practice, we take children’s views into account when designing our processing.

NB: I’ve screenshotted all of it.

Someone at GDPR Legal lifted the whole thing uncredited and passed it off as their own work. A company that claims to be able to provide “practical and bespoke advice”, guiding “major projects in some of the UK’s largest businesses” nicked content from the ICO’s website. This kind of cutting and pasting gives plagiarism a bad name. At least GDPR’s previous Grand Master Plagiarist did it in style with some top-drawer endorsements.

The GDPR frenzy is over. Some of the new entrants have gone from strength to strength, and some of them are now selling kitchens. The current crisis will test everyone, and I doubt that the DP landscape will look the same in a year’s time. Nevertheless, while I hope the data protection sector remains robust enough to accommodate both the slick corporate operations and a few maniac artisans like me, it surely doesn’t need chancers any more? I hope we can all agree that a company that can’t even design its own privacy policy, that won’t admit who its experts are, and that steals from the regulator deserves to be shamed? I hope this blog might persuade a few unwary punters to do some due diligence before handing over their cash and perhaps pick a company that writes its own material. Whatever the LinkedIn blockers think of me, and I of them, surely we’re all better than this?

Going Unnoticed

Last week, I came across an interview with Elizabeth Denham, published in April on a Canadian website called The Walrus. There are some interesting nuggets – Denham seems to out herself as a Remainer in the third paragraph (a tad awkward given that she has only enforced on the other side), and it turns out that the Commissioner keeps framed pictures in her office of herself taking on Facebook. More important is the comparison she draws between her Canadian jobs and her current role: “That’s why I like being where I am now,” she says, settling herself at a boardroom table. “To actually see people prosecuted.”

Denham probably wasn’t thinking of the run of legitimate but low-key prosecutions of nosy admin staff and practice managers which her office has carried out in recent months, which means she was up to her old tricks of inaccurately using the language of crime and prosecution to describe powers that are civil (or more properly, administrative). Since GDPR came in, she’s even less likely to prosecute than before, given that she no longer has the power to do so for an ignored enforcement or information notice. I don’t know whether she genuinely doesn’t understand how her powers work or is just using the wrong words because she thinks it makes for a better quote.

Publicity certainly plays a far greater part in the ICO’s enforcement approach than it should. A few months back, I made an FOI request to the ICO asking about a variety of enforcement issues and the information I received was fascinating. The response was late (because of course it was), but it was very thorough and detailed, and what it reveals is significant.

ICO enforcement breaks down into two main types. Enforcement notices are used where the ICO wants to stop unlawful practices or otherwise put things right. Monetary penalties are a punishment for serious breaches. Occasionally, they are used together, but often the bruised organisation is willing to go along with whatever the ICO wants, or has already put things right, so an enforcement notice is superfluous. The ICO is obliged to serve a notice of intent (NOI) in advance of a final penalty notice, giving the controller the opportunity to make representations. There is no equivalent requirement for preliminary enforcement notices, but in virtually every case, the ICO serves a preliminary notice anyway, also allowing for representations.

According to my FOI response, in 2017 the ICO issued 8 preliminary enforcement notices (PENs), but only 4 were followed up by a final enforcement notice; in 2018, 5 PENs were issued, and only 3 resulted in a final notice. The ratio of NOIs to final penalties is much closer: in 2017, there were 19 NOIs, and only one was not followed up with a penalty; in 2018, 21 NOIs were issued, 20 of which resulted in a penalty. Nevertheless, the PEN / NOI stage is clearly meaningful. In multiple cases, whatever the controller said stopped the intended enforcement in its tracks. Given the confusion among many GDPR ‘experts’ about whether fines are real or merely proposed, the fact that not every NOI results in a fine is worth noting.
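Spelled out as arithmetic, the conversion rates look like this. The issued/final pairs simply restate the FOI response figures above; the percentages are my own calculation:

```python
# Follow-through rates implied by the ICO's FOI response figures quoted above.
# Each pair is (notices issued, final actions that followed); percentages are mine.
pens = {2017: (8, 4), 2018: (5, 3)}      # preliminary ENs -> final enforcement notices
nois = {2017: (19, 18), 2018: (21, 20)}  # notices of intent -> final penalties

for year, (issued, final) in sorted(pens.items()):
    print(f"{year}: {final}/{issued} PENs became final notices ({100 * final / issued:.0f}%)")
for year, (issued, final) in sorted(nois.items()):
    print(f"{year}: {final}/{issued} NOIs became penalties ({100 * final / issued:.0f}%)")
```

Roughly half of PENs go no further, while around 95% of NOIs end in a penalty – which is why the handful of NOIs that don’t are worth noticing.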

The response shows the risks of neglecting to issue a PEN. In July 2018, the ICO issued Aggregate IQ (AKA AIQ) with the first GDPR enforcement notice (indeed, it was the first GDPR enforcement action altogether). My FOI reveals that it was one of only a few cases where a preliminary notice was not issued. The AIQ EN was unenforceable, ordering them to cease processing any personal data about any UK or EU “citizens” obtained from UK political organisations “or otherwise for the purposes of data analytics, political campaigning or any other advertising purposes”. AIQ was forbidden from ever holding personal data about any EU citizen for any advertising purpose, even if that purpose was entirely lawful, and despite the fact that the GDPR applies to residents, not citizens. AIQ appealed, but before that appeal could be heard, the ICO capitulated and replaced the notice with one that required AIQ to delete a specific dataset, and only after the conclusion of an investigation in Canada. It cannot be a coincidence that this badly written notice was published as part of the launch of the ICO’s first report into Data Analytics. It seems that ICO rushed it, ignoring the normal procedure, so that the Commissioner had things to announce.

The ICO confirmed to me that it hasn’t served a penalty without an NOI, which is as it should be, but the importance of the NOI stage is underlined by another case announced with the first AIQ EN. The ICO issued a £500,000 penalty against Facebook, except that what was announced in July 2018 was the NOI, rather than the final penalty. Between July and October, the ICO would have received representations from Facebook, and as a result, the story in the final penalty was changed. The NOI claims that a million UK Facebook users’ data was passed to Cambridge Analytica and SCL among others for political purposes, but the final notice acknowledges that the ICO has no evidence that any UK users’ data was used for campaigning. As an aside, this means that the ICO has no evidence Cambridge Analytica used Facebook data in the Brexit referendum. The final notice is based on a hypothetical yarn about the risk of a US visitor’s data being processed while passing through the UK, and an assertion that even though UK Facebook users’ data wasn’t abused for political purposes (the risk did not “eventuate”), it could have been, so there. I’ve spent years emphasising that an incident isn’t the same as a breach, but going for the maximum penalty on something that didn’t happen, having said previously that it did, is perhaps the wrong time to listen to me.

If you haven’t read the final Facebook notice, you really should. The ICO’s argument is that UK users’ data could have been abused for political purposes even though it wasn’t, and the mere possibility would cause people substantial distress. I find this hard to swallow. I suspect the ICO felt they had effectively announced the £500,000 penalty; most journalists reported the NOI as such. Despite Facebook’s representations pulling the rug out from under the NOI, I guess that the ICO couldn’t back down. There had to be a £500,000 penalty, so they worked backwards from there. The Commissioner now faces an appeal on a thin premise, as well as accusations from Facebook that Denham was biased when making her decision.

Had the NOI not been published (like virtually every other NOI for the past ten years), the pressure of headlines would have been absent. Facebook have already made the not unreasonable point in the Tribunal that as the final penalty has a different premise than the NOI, the process is unfair. Without a public NOI, Facebook could have put this to the ICO behind closed doors, and an amended NOI could have been issued with no loss of face. If Facebook’s representations were sufficiently robust, the case could have been dropped altogether, as happened in other cases in both 2017 and 2018. For the sake of a few days’ headlines, Denham would not be facing the possibility of a career-defining humiliation at the hands of Facebook of all people, maybe even having to pay their costs. It’s not like there aren’t a dozen legitimate cases to be made against Facebook’s handling of personal data, but this is the hill the ICO has chosen to die on. Maybe I’m wrong and Facebook will lose their appeal, but imagine if they win and this farrago helps them to get there.

The other revelation in my FOI response is an area of enforcement that the ICO does not want to publicise at all. In 2016, the ICO issued a penalty on an unnamed historical society, and in 2017, another was served on an unnamed barrister. I know this because the ICO published the details, publicly confirming the nature of the breach, the amount of the penalty and the type of organisation. One might argue that they set a precedent in doing so. What I didn’t know until this FOI request is that there have been a further 3 secret monetary penalties: 1 in 2017 and 2 in 2018. The details have not been published, and the ICO refused to give me any information about them now.

The exemptions cited set out the ICO’s concerns. They claim that it might be possible for me to identify individual data subjects, even though both the barrister and historical society breaches involved very limited numbers of people and were still published. They also claim that disclosure will prejudice their ability to enforce Data Protection law, using this justification:

“We are relying on this exemption to withhold information from you where the disclosure of that information is held for an ongoing regulatory process (so, we are yet to complete our regulatory process and our intentions could still be affected by the actions of a data controller) or the information is held in relation to sensitive matters and its disclosure would adversely affect relationships which we need to maintain with the organisations involved. It is essential that organisations continue to engage with us in a constructive and collaborative way without fear that the information they provide to us will be made public prematurely, or at a later date, if it is inappropriate to do so. Disclosure of the withheld information at this time would therefore be likely to prejudice our ability to effectively carry out our regulatory function”

The ICO routinely releases the names of data controllers it has served monetary penalties and enforcement notices on without any fears about the damage to their relationship. Just last week, the Commissioner was expressing how “deeply concerned” she is about the use of facial recognition by the private sector, despite being at the very beginning of her enquiries into one such company. And if maintaining working relationships at the expense of transparency is such a vital principle, how can they justify the publication of the Facebook NOI for no more lofty reason than to sex up the release of the analytics report? They say “It is essential that organisations continue to engage with us in a constructive and collaborative way without fear that the information they provide to us will be made public prematurely”, and yet the Facebook NOI was published prematurely despite the fact that it was a dud. What will that have done to the ICO’s relationship with a controller as influential and significant as Facebook? What incentive do FB have to work with Wilmslow in a constructive and collaborative way now? And if identifying the subjects is an issue, what is to stop the ICO from saying ‘we fined X organisation £100,000’ but refusing to say why, or alternatively, describing the incident but anonymising the controller?

It doesn’t make sense to publicise enforcement when it’s not finished, and it doesn’t make sense to keep it secret when it’s done. Every controller that has been named and shamed by the ICO should be demanding to know why these penalties have been kept secret, while Facebook have every right to demand that the Commissioner account for the perverse and ill-judged way in which she took action against them. Meanwhile, we should all ask why the information rights regulator is in such a mess.

And one final question: did she bring the framed pictures with her or did we pay to get them done?

The Curse of the Padlock

One of the dangers of working in Data Protection is the risk of becoming a pedant. Precision matters; court cases have turned on the meaning of individual words like ‘likely’ and ‘distress’. The legislation is a maze of definitions and concepts that the competent practitioner needs to get to grips with, and lazy thinking can be revealed by an inability to get the details right. Even the BCS Data Protection exam has a question which requires you to list the elements of the definition of consent in the right order. It’s easy to lapse into pedantry, to point out every wrongly quoted article, every jumbled phrase.

Nevertheless, getting a simple thing right is often important. GDPR does not cover ‘personally identifiable information’; it covers ‘personal data’, and the definitions of the two are not the same. A person who talks about PII in the context of European Data Protection is starting in the wrong place (the US), and can make mistakes as a result. Another error that seems to be creeping in all over the place is more profound, and risks entrenching one of the biggest misconceptions about how data protection works, a misconception many of us have spent years trying to break down.

The problem is the phrase ‘data privacy’.

I see it everywhere – on LinkedIn naturally, in news coverage of the sector, and predictably, the ICO has fallen for it. They describe themselves as “The UK’s independent authority set up to uphold information rights in the public interest, promoting openness by public bodies and data privacy for individuals.” Look at the Data Privacy Advisory Service, who summarise their services as “At DPAS we help organisations safeguard the fundamental human right to have data kept private by putting in place the best possible protection to keep it secure. This is delivered in line with the General Data Protection Regulation (GDPR) and The Data Protection Act 2018.”

The idea is nonsense. It doesn’t exist. There is no right to data privacy – there is certainly no fundamental right ‘to have data kept private’. This isn’t a snide dig at someone quoting the wrong article. The concept of ‘data privacy’ is a complete misunderstanding of what Data Protection is for, and everyone who promotes it is actively thwarting the efforts of the rest of us to implement data protection in a practical way.

Article 8 of the European Convention on Human Rights says: “Everyone has the right to respect for his private and family life, his home and his correspondence“. This right is not absolute; it can be interfered with (only when necessary) in the interests of “national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others“. The right is not just about data – it certainly can be, as the cases where celebrities and others rely on it to prevent the publication of intrusive images demonstrate. But the right to privacy doesn’t have to be about data at all – you can breach a person’s right to privacy simply by observing them, by being in a place where they expect privacy, or by denying them the opportunity to do something privately. Data doesn’t have to come into it.

Clearly, if you did a Venn diagram, there would be circumstances where privacy and data protection overlap. By following the Data Protection principles when you handle a person’s private correspondence for example, you probably also do what’s necessary to protect their privacy. The same is true for confidentiality – not all confidential data is personal data, but a decent stab at the principles will probably respect both. There is, however, a significant portion of the Venn diagram where Data Protection and Privacy do not meet, and the DP part of that is important.

The notion of ‘Data Privacy’ obscures two vital elements of Data Protection. First, data protection is not only about private data. It covers all personal data – private, secret and public. For years, I have been banging my head against the brick wall of ‘it’s not personal data, it’s in the public domain’. It has long been a struggle to explain that photographs, email addresses and other publicly available data are still personal data – merely more available and easier to use than some other data. There was a chink of light in Article 14 of the GDPR, which clearly states that a person should be informed even when their data is obtained from ‘publicly accessible sources’. This explicit recognition that public data is still personal data is very helpful, but the notion that ‘data protection’ and ‘data privacy’ are interchangeable muddies the waters again.

Second, in related news, GDPR is not about keeping data private; it is about ensuring that personal data processing is properly regulated. For years, Data Protection has been plagued by the padlock. The Information Commissioner used it as a logo (‘but the padlock is unlocked’ is a defence that umpteen different ICO folk have used when I complained about it), and when I did a Google image search for ‘Data Protection’ today, this is the top set of results:

[Screenshot: Google image search results for ‘Data Protection’, 26 May 2019]

The problem with the Data Protection Padlock is that it presents the legislation as something that locks data up, keeps it away from people. This understanding of data protection leads directly to the belief that disclosure of personal data is inherently problematic and exceptional, and that belief is toxic. I’m not persuaded that Victoria Climbie or Peter Connelly died solely because data about them wasn’t shared, but the pervasive fear of data sharing didn’t help. The GDPR says that ‘the protection of natural persons in relation to the processing of personal data is a fundamental right’. The word ‘privacy’ isn’t mentioned anywhere beyond a reference in a footnote to the ePrivacy Directive, and the processing of personal data is firmly put in the context of operating the EU’s internal market: “This Regulation is intended to contribute to the accomplishment of an area of freedom, security and justice, and of an economic union“.

You can’t achieve the economic union by locking all the data away, by keeping it private. To characterise data protection law as being about ‘data privacy’ is to misrepresent its purpose completely. European Data Protection is a compromise – trade is underpinned by the use, even the exploitation, of personal data, but people have rights, they have control over their data in some (but not all) circumstances, and the legislation is built on foundations of transparency and fairness, not privacy. Arguably, the GDPR tries to even up the power imbalance in some circumstances, but it is not designed to lock up data and keep it private.

Of course, some people might be using ‘privacy’ as a synonym for ‘secure’ – the DPAS statement above seems to elide the two. Only a fool would want to play down the importance of security in the context of using any personal data, but the reduction of Data Protection solely to security is as destructive to a proper understanding of it as the privacy / protection mess. We’ve managed to drag Data Protection out of the IT department, and we need to stamp on this idea that security is the exemplar of good DP practice. Your data can be private and secure, but kept for no good reason, for too long, in an inaccurate state, and there could be too much of it.

Some personal data is private and should remain so. In many situations, the processing of personal data without an eye on people’s legitimate expectations of privacy, especially when monitoring, watching or listening to them, is likely to be unfair and so unlawful. There is a strong link between Data Protection and Privacy, and any attempt to divorce them would be stupid. But the use of ‘data privacy’ as a synonym for data protection is misleading and dangerous – it perpetuates a fundamental misreading of what the legislation is for, and makes the lives of everyone trying to make GDPR work effectively a thousand times harder. It’s time to take this nonsense, lock it up and throw away the key.

Home, James

A few months ago, I wrote a blog about data protection and nonsense, highlighting inaccurate claims made by training companies, marketers and pressure groups. A bad tempered spat ensued in comments on LinkedIn between myself and Russell James, the marketer behind the lobbying attempt to change the ICO’s funding model to include cost recovery. James insisted that it didn’t matter that a letter sent by four MPs to the DCMS asking for the change, apparently at his instigation, contained inaccurate claims (the description of DP breaches as ‘crimes’) and embarrassingly got the name of the Information Commissioner wrong (it’s the Independent Commissioner of Information, according to the distinguished Parliamentarians, or whoever actually wrote it).

I asked James what the Information Commissioner’s Office themselves thought of his plan to allow the ICO to recoup the costs of investigations from those “found guilty of data crimes” (which I think means those who are on the receiving end of enforcement from Wilmslow, although it’s hard to be 100% certain). The idea that someone would persuade MPs to lobby the ICO’s sponsor department to change their funding mechanism without at least the tacit approval of the Commissioner or her staff seemed ridiculous, but the normally prolix Mr James was silent on the matter. So I decided to ask the Information Commissioner.

I made an FOI request for all of the following information:
1) Any recorded information about approaches made by Russell James or others to the ICO about the idea of the ICO adopting a cost-recovery model, including any correspondence with Mr James or his associates.
2) Any responses provided to James or others about the ICO adopting a cost-recovery model.
3) Any correspondence with Tom Tugendhat, Yvette Cooper, Dominic Grieve or Damian Collins, or their staff about the idea of a cost-recovery model, or the letter sent to the DCMS
4) Any internal discussion of the cost-recovery model.
5) Any correspondence, notes of meetings or other records of meetings between Mr James and any ICO member of staff, including the names of the staff. (This was subsequently clarified to cover only the cost recovery model, and not any other correspondence Mr James might have had with the ICO.)

Whatever the ICO made of Mr James’ ambitious plan, I was certain that this request would capture their thoughts. At worst, the ICO might refuse to disclose their internal discussions of the idea, but at least I might get some sense of the extent of them.

The ICO provided me with three paragraphs from a letter sent to them by Mr James around the time the MPs wrote to the DCMS. James told me that the ICI letter was written by the office of Tom Tugendhat, but this one was remarkably similar in tone, and showed the same lack of understanding of how the Data Protection enforcement regime works. James told the ICO that they were about to “leverage significant revenue“. Greatly increased income for the DCMS, via the huge sums paid to them in GDPR fines, would, James asserted, result in much more cash for Wilmslow. This would sound great, were it not for the fact that the ICO hasn’t issued a single penalty under the GDPR yet. More importantly, he is confused about what happens to the penalties, and how the ICO is funded. DP penalties have always been paid into the Treasury’s consolidated fund, bypassing the DCMS altogether. Moreover, the ICO doesn’t receive any funding from the DCMS for its Data Protection work. As this document (freely available on the ICO’s website) states, all the ICO’s DP work is paid for by DP fees collected from Data Controllers, as has been the case for many years. The ICO could issue a CNIL-style €50 million penalty every week, and neither they nor the DCMS would see a cent of it.

James also claims in his letter that his campaign has “ministerial support from government officials“; I don’t know if he’s claiming the support of ministers, or the support of government officials, but the phrase itself sounds like it was written by someone who doesn’t know the difference between the two. I’d ask him which it was, but I sent him a single direct message asking for comments before publishing the last blog I wrote on this issue. He ignored me, but later pretended that I had deluged him with many such messages. If Tugendhat hadn’t tweeted the ICI letter, I’d think it was fake.

Whatever the shortcomings of Mr James’ insights into Data Protection (when I told him I was making an FOI request about his plan, he thought it was the same as a SAR), his confidence in the success of the James Tax is hard to fault. According to him, it is now “a short time before your department (ICO) will have a more resilient financial footing“. Given this thrilling news, one can only speculate about how excited the fine folk of the ICO would be at the impending cash bonanza.

Alas, apart from a copy of the ICI letter, which the ICO sensibly chose not to provide to me as it was plainly in the public domain, they held no data about the James Tax. None. Nothing. Nada. Indeed, they made a point of telling me: “For clarity, I can confirm that we do not hold any information which falls within the scope of the other parts of your request“. This means that they did not have any recorded discussions about it, share the letter internally, or even reply to that part of Mr James’ letter. If anyone had anything to say about the James Tax, they didn’t want to write it down.

Mr James has set himself up as the doughty defender of “Liz and the crew”, as he once described his surprisingly reticent friends in Wilmslow to me. He has launched a campaign to change the law and roped four highly respectable MPs in to support it. I think it is reasonable to ask whether someone with such a misbegotten understanding of how Data Protection works is the right person to change it. Given that the ICO has seemingly offered no support, not even a comment on his plan, I assume that they do not welcome the idea. It’s not hard to imagine why – calculating the costs of an investigation is extra work and bureaucracy. Moreover, if the ICO is entitled to claim the costs of victory, surely it should be forced to foot the bill for defeat – every time the ICO’s enforcement team’s investigation results in no action, the ICO should contribute to the cost of the time the controller spent answering the many letters and information notices for which the office is celebrated.

If a case goes to appeal, while the James Tax would presumably allow the costs of going to the Tribunal to be recouped if successful, for fairness’ sake the same logic must apply the other way around. If the Tribunal vindicates the ICO’s target (and losses at the Tribunal are not unknown, especially in recent times), presumably the ICO would have to pay the legal bills too. There are already financial incentives and advantages for the Commissioner. If the ICO issues a financial penalty, the controller gets a 20% discount if they choose not to appeal. If a controller’s actions are truly indefensible and they choose to appeal, the Tribunal and the courts above can award costs against the recalcitrant data controller. Any further change to the relationship should not operate solely in the ICO’s interests.

If the James Tax includes recouping the costs of dealing with appeals (and my arguments with him on LinkedIn suggest that it does), this will also have a negative effect on one of the most important parts of the DP enforcement system. Any controller who has been fined will, under the James Tax, already face the added cost of the ICO’s investigation. Appealing – already a roll of the dice in many cases – will be that much more of a risk. As well as their own costs, controllers will have to factor in the additional ICO tally.

We already have Denham grumbling about appeals, even using a speech by Mark Zuckerberg about possible regulation in the US as an excuse to demand he drop his appeal against the Facebook fine in the UK. James’ ideas might further suppress the possibility of appealing against ICO decisions. For everyone involved in the sector, this would be a disaster. To borrow James’ inaccurate criminal characterisation of DP enforcement, the ICO is already the investigator, prosecutor and judge – I don’t want to strengthen that hand any more. Moreover, in the interview above, Denham signalled disdain for the concerns of ordinary people, stating that they don’t complain about the right things. As part of its analytics investigation, the ICO has enforced on cases where there have been no complaints. Denham’s ICO needs to be challenged, and challenged regularly. The tribunals and the courts frequently give detailed and helpful explanations of how the law works – the ICO has never produced guidance on consent as useful as the Tribunal’s decision in Optical Express, and whether the ICO wins or loses, all sorts of insights are available in Tribunal decisions.

Nobody appeals lightly. Combine Denham’s hostility to challenge with the James Tax, and we might lose vital opportunities for debate and caselaw. You can dismiss this blog as just an opportunity for me to take the piss out of another GDPR-certified professional, but James has set himself up as a public campaigner. He wants to change how the ICO is funded and how all controllers are potentially treated. This cannot just pass without scrutiny, especially as he appears to lack both an understanding of the system he wants to change, and the support of the regulator whose powers he wants to alter. If the people arguing for changes don’t even think it’s important what the ICO is called, or whether it’s a ‘department’ or not, we should wonder what other important details they have missed.