The Naked Truth

The story of Damian Green’s porn-clogged computer has several facets, a surprising number of them related to data protection. Whether it was a breach for former Assistant Commissioner Bob Quick to reveal that there was porn on the computer is hard to say for certain – I think Quick has a journalistic defence in revealing hypocrisy, given that the Government is currently waging a moralistic war on adult websites, but you are welcome to disagree. The fact that Quick has form for revealing information that he shouldn’t have only adds spice to the mix.

Why Green’s other accuser Neil Lewis still has his police notebooks raises more serious questions. Did he keep them without authorisation from the Met? If so, that could be a criminal offence under Data Protection’s Section 55, for which Lewis would be liable. Did the Met Police fail to recover them properly when he left? That would be a serious breach of the seventh data protection principle, for which the Met should expect to answer. In any case, I have to agree with those who say that public servants should respect confidences even after they leave the service; sensitive material should never be retained by former officers of any organisation. I know my reaction to the story is clouded by the entertaining spectacle of a politician caught with his pants down, or at least unzipped, but the question of how the story came to light still needs to be interrogated.

Green’s use of the Shaggy Defence to claim that he knows nothing about the porn raises more questions. If he didn’t download it, someone else did (none of the Tories defending him seem to claim that it doesn’t exist). Part of Green’s outrage when his office was raided in 2008 was the threat to the sanctity of Parliamentary Privilege and the confidentiality due to his constituents. In that light, Green needs to explain how it was possible for someone else to download porn onto his computer. The best-case scenario for him is that it was the result of malware, rather than someone else being able to log into his computer without his knowledge. Of course, malware infecting an MP’s computer is a story in itself. Regardless of whether this story should be in the public domain, we can’t be expected to ignore it now. As someone who processes highly sensitive data about his constituents (as well as possibly other sensitive information), at some point Green has to explain who had access to his computer and what they were doing downloading porn. Or he has to admit that it was him.

I don’t know what, if anything, Green is guilty of, but his fellow Tory Nadine Dorries’ spectacular contribution on Saturday doesn’t allow for any ambiguity. The MP for Mid Bedfordshire has a habit of deleting tweets when she (or someone else running her account) realises how stupid they make her look, so I have screengrabbed this one and I reproduce it in full here:

My staff log onto my computer on my desk with my login everyday. Including interns on exchange programmes. For the officer on @BBCNews just now to claim that the computer on Greens desk was accessed and therefore it was Green is utterly preposterous !!

UPDATE: There’s more:

All my staff have my login details. A frequent shout when I manage to sit at my desk myself is, ‘what is the password?

ANOTHER UPDATE: Robert Syms MP is at it as well

As a constituency MP, Dorries will be handling sensitive correspondence on a wide variety of matters, and she has publicly confirmed that access to it is open to a wide variety of people, including interns on exchange programmes. To this, there is no defence. The seventh data protection principle requires a data controller to have appropriate technical and organisational security measures in place against "unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data". This means a mix of technical measures like passwords and encryption, and organisational measures like ensuring that passwords are not shared or written down. Dorries has confirmed that she has authorised password sharing in her office – which is bad enough in itself, because it means passwords are spoken aloud or written down, greatly increasing the chance of the password becoming known to someone nefarious. But worse than that, she says specifically that a wide group of people share her login. There is no way of knowing who has accessed what: even if an intern did it, it looks as though Dorries was the person responsible.
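The accountability problem is easy to illustrate. The sketch below is purely hypothetical – the account names and log entries are invented, and I have no idea what systems the Parliamentary estate actually runs – but it shows why a shared login fails as an organisational measure: the audit trail records the same account for every action, so it cannot tell the MP apart from the intern.

# Hypothetical sketch: invented account names and a made-up audit log.
shared_login_log = [
    {"account": "mp_office", "action": "opened constituent case 1234"},
    {"account": "mp_office", "action": "downloaded attachment"},
    {"account": "mp_office", "action": "forwarded correspondence externally"},
]

individual_login_log = [
    {"account": "mp",         "action": "opened constituent case 1234"},
    {"account": "intern_03",  "action": "downloaded attachment"},
    {"account": "caseworker", "action": "forwarded correspondence externally"},
]

def who_did_it(log, action):
    """Return the accounts recorded against a given action."""
    return {entry["account"] for entry in log if entry["action"] == action}

# With a shared login, the answer is always the same name,
# whoever was actually sitting at the keyboard.
print(who_did_it(shared_login_log, "downloaded attachment"))      # {'mp_office'}
print(who_did_it(individual_login_log, "downloaded attachment"))  # {'intern_03'}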

The only way that Dorries has not admitted a clear breach of Data Protection’s security principle is if she (or whoever wrote the tweet) is lying in order to defend Green, which would be quite the stupidest thing I can imagine.

There are several possible breaches here – Quick’s original revelations about Green, Lewis’ retention of his notebooks / the Met’s failure to recover them when he left, Green’s insecure computer equipment and Dorries’ admission of her completely lax security. While Quick and Green’s problems are somewhat murky, Lewis / Met Police and Dorries present much more straightforward issues for the Information Commissioner. Both should be investigated as a matter of urgency.

Given Dorries’ casual admission of the insecure way in which her office operates, a much wider investigation might be required. Elizabeth Denham has put huge resources into investigating the possibility of political use of analytics and big data in an unlawful way, even though it’s hard to imagine anything coming of it. On the other hand, here we have a sitting MP openly admitting that constituents’ data is unsafe – how many more of Dorries’ colleagues operate in a similarly unlawful fashion? I cannot complain to the ICO about these matters, as I am not affected by them. However, the issues are serious, and Wilmslow should step in immediately. A bland press release reminding MPs to process data safely is not good enough; the ICO needs to demonstrate that Data Protection law applies to MPs just as it does to the rest of us.

Another fine mess

For those working in Data Protection, there are many interesting things to note about the forthcoming General Data Protection Regulation. There is the clarification of consent, which may send tawdry marketers into a spin. There is the tightening of the rules over criminal records. There is the helpful emphasis on risk. My current favourite thing is a sly anti-establishment streak – here and there, the GDPR returns to the theme of the power imbalance between the data subject and the big public institution, and seeks to even up the score.

For some, however, there is only one thing to talk about. All that matters is the fines. Fines fines fines, all day long. A conference held in London last week was Fine City as far as the tweets were concerned. COMPANIES MIGHT GO BUST, apparently. Meanwhile, the Register breathlessly reheated a press release from cyber security outfit NCC Group, featuring a magical GDPR calculator that claims ICO’s 2016 penalties would have been either £59 million or £69 million under GDPR (the figure is different in the Register’s headline and story, and I can’t be bothered to find the original because it’s all bullshit).

This is my prediction. There will never be a maximum GDPR penalty in the UK. Nobody will ever be fined €20 million (however we calculate it in diminishing Brexit Pounds), or 4% of annual turnover. There will be a mild swelling in the amount of fines, but the dizzy heights so beloved of the phalanx of new GDPR experts (TRANSLATION: people in shiny suits who were in sales and IT in 2015) will never be scaled. It’s a nonsense myth from people with kit to sell. I have something to sell, friends, and I’m not going to sell it like this.

I have no quibble with DP officers and IG managers hurling a blood-curdling depiction of the penalties at senior management when they’re trying to get more / some resources to deal with the GDPR onslaught – I would have done it. There is probably a proper term for the mistake NCC made with their calculation, but I’m calling it the Forgetting The ICO Has To Do It Syndrome. NCC say Pharmacy2U’s penalty would inflate from £130,000 to £4.4 million, ignoring the fact that the decision would not be made by a robot. Pharmacy2U flogged the data of elderly and vulnerable people to dodgy health supplement merchants, and ICO *only* fined them £130,000, despite having a maximum of £500,000. Of course, some penalties have caused genuine pain for cash-strapped public authorities, but when NCC say that their adjusted-for-GDPR Pharmacy2U fine represented "a significant proportion of its revenues and potentially enough to put it out of business", they’re not adjusting their hot air for reality.
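For what it’s worth, the arithmetic behind these magic calculators presumably looks something like the naive scaling below. The method is a guess on my part, not NCC’s published workings; the figures are the reported Pharmacy2U penalty and a rough sterling value of the €20 million cap.

# A guess at the naive scaling behind "GDPR fine calculators": take the
# proportion of the old maximum that the ICO actually used, and apply the
# same proportion to the new maximum. The method is an assumption on my
# part; the figures are the reported Pharmacy2U penalty and an approximate
# sterling value of the €20m cap.

OLD_MAX = 500_000        # DPA 1998 maximum monetary penalty (£)
NEW_MAX = 17_000_000     # approximate sterling value of €20m (£)

def naive_gdpr_estimate(actual_fine):
    """Scale an old penalty as if the regulator's appetite grew with the cap."""
    return actual_fine / OLD_MAX * NEW_MAX

print(round(naive_gdpr_estimate(130_000)))  # 4420000, the headline £4.4m

# What this ignores: the ICO chose to use 26% of the old maximum against a
# controller that sold vulnerable people's data; nothing in the GDPR obliges
# it to use 26% of the new one.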

Take the example of a monetary penalty issued by the ICO in March against a barrister. The barrister was involved in proceedings at the Family Court and the Court of Protection, so her files contained sensitive information about children and vulnerable adults. Despite guidance issued by the Law Society in 2013, they were stored unencrypted on her home computer. While upgrading the software on the machine, her husband backed up the files to online storage. Some of the files were indexed by search engines, and were subsequently found by a local authority lawyer.

The ICO fined the barrister £1000, reduced to £800 if she paid on time. I don’t think all barristers are loaded, but most could pay a penalty of £800 without going bankrupt. £800 isn’t remotely enough for a breach as basic and avoidable as this. The aggravating factors are everywhere – the Law Society guidance, the lack of encryption, the fact that the husband had access to the data. If the ICO was capable of issuing a £4.4 million penalty, they’d fine a barrister more than £800 for this mess. And what’s worse, they redacted the barrister’s name from the notice. The ICO offered no explanation for this, so I made an FOI request for the barrister’s name and for information about why the name was redacted.

They refused to give me the name, but disclosed internal correspondence about their decision to redact. There is a lot in the response to be concerned about. For one thing, in refusing to give me the name, the ICO contradicts its own penalty notice. The notice describes an ongoing contravention from 2013 (when the Law Society guidance was issued) to 2016 (when the data was discovered). Nevertheless, the FOI response states that "this data breach was considered a one off error", and a reference to this characterisation is also made in the notes they disclosed to me.

If it was a one-off error, the ICO couldn’t have issued the penalty, because they don’t have the power to fine people for incidents, only for breaches (in this case, the absence of the appropriate technical and organisational security measures required by the seventh data protection principle). Given that the notice states explicitly that the breach lasted for years, the ICO’s response isn’t true. It’s bad enough that the ICO is still mixing up incidents and breaches four years after this confusion lost them the Scottish Borders Tribunal appeal; it’s even worse that they seem not to understand the point of fining data controllers.

In the notes disclosed to me about the decision to redact the notice, ICO officials discuss the "negative impact" of the fine on the barrister, especially as she is a "professional person who is completely reliant on referrals from external clients". Despite the Head of Enforcement putting a succinct and pragmatic case for disclosure: "it is easier to explain why we did (proportionate, deterrent effect) rather than why we didn’t", he is unfortunately persuaded that the most important thing is to "avoid any damage to reputation". Bizarrely, one person claimed that they could "get the deterrent message across" despite not naming the barrister.

The GDPR requires that fines be "effective, proportionate and dissuasive" – an anonymous £800 fine fails on each point. Anyone who takes their professional obligations seriously needs no horror stories to persuade them. For those who do not, an effective, proportionate and dissuasive penalty is either a stinging fine or naming and shaming. The ICO had no appetite for either option, and effectively let the barrister get away with it. They valued her professional reputation above the privacy of the people whose data she put at risk, and above the interests of future clients who will innocently give their confidential and private information to someone with this shoddy track record.

If the NCC Group, and all the various vendors and GDPR carpetbaggers are to be believed, within a year, the UK will operate under a regime of colossal, multi-million pound fines that will bring errant businesses to their knees. In reality, the ICO cut the fines on charities by 90% to avoid upsetting donors, and rendered their enforcement against an irresponsible data controller pointless for fear of putting her out of business.

These two pictures cannot be reconciled. It is entirely possible for the ICO to put someone out of business – indeed, many recipients of their PECR penalties are forced into liquidation (this may be a ploy to avoid the fines, but nevertheless, the businesses close). But the majority of PECR penalties are issued against businesses operating on the very fringe of legality – they are not mainstream data controllers. They are not nice, professional barristers. They are not the audience for the Great GDPR Fine Hysteria. If the ICO cannot stomach the risk of putting a single barrister out of business pour encourager les autres, it is disingenuous to pretend that they will rain down fire on mainstream data controllers after May 2018. We’ll get more of the same – cautious, reactive, distracted by the incident, and unwilling to take aim at hard targets. Plus ça change.

Lincolnshire poachers

Dark times in the Fens, as Lincolnshire County Council finds itself in the grip of diabolical cyber-blackmailers demanding £1,000,000 to release the local authority from a terrifying new strain of virus that has locked up all its files. As ever, it’s unwise to judge the outcome before all of the details are in, but Lincolnshire’s story has some interesting aspects. One element seems to go in the Council’s favour: this is "zero-day malware", the first time that the particular infection has been detected. This obviously makes it harder to defend against, and in any case the Council is "confident it had appropriate security measures in place".

The Council’s chief information officer Judith Hetherington-Smith reassured residents with the claim that they were "absolutely looking after their data and it hasn’t been compromised". This implies that no personal data has been compromised, but it can’t be entirely squared with some of Hetherington-Smith’s other comments. For example: "Once we identified it we shut the network down, but some damage is always done before you get to that point – and some files have been locked by the software." Right, so there’s some damage then? "A lot of the files will be available for us to restore from the back-up." A lot of them, but not all of them? What about the ones that aren’t available?

That back-up is interesting in the light of the fact that "People can only use pens and paper, we’ve gone back a few years." An inherent part of information security is business continuity: ensuring that even if something falls over, the place can keep running. I’m running a course this week for people responsible for risk-managing big information systems, and the client has specifically asked me to emphasise the need for business continuity to be built in. The whole point is not to be knocked back to the pen and paper age. I heard a report on Radio 4 that Lincolnshire’s social workers had not had access to systems for several days, which means those charged with protecting the most vulnerable in Lincolnshire don’t have access to the information they need to do their job. If that isn’t "compromised", I don’t know what is. It’s a catastrophe. Rather than attempting to reassure (I’m amazed that no-one has said that they take Data Protection very seriously), the council needs to explain why they have been offline for days without a back-up that allows essential systems to keep running.
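The sort of question an asset owner ought to be able to answer here is not complicated. The sketch below is entirely hypothetical – the figures, the quarterly test interval and the example system are my own illustration, not anything Lincolnshire has published – but it shows the two checks that matter: has a restore actually been tested recently, and does it come back within the downtime the service can tolerate?

# Hypothetical business continuity check; all names, dates and figures
# are invented for illustration.

from datetime import date, timedelta

def continuity_check(last_restore_test, restore_hours_measured,
                     max_tolerable_downtime_hours, test_interval_days=90):
    """Return a list of problems; an empty list means the plan looks credible."""
    problems = []
    if date.today() - last_restore_test > timedelta(days=test_interval_days):
        problems.append("restore has not been tested recently")
    if restore_hours_measured > max_tolerable_downtime_hours:
        problems.append("restore takes longer than the service can tolerate")
    return problems

# A social care case management system that cannot be down for days:
print(continuity_check(last_restore_test=date(2015, 6, 1),
                       restore_hours_measured=96,
                       max_tolerable_downtime_hours=24))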

But the most interesting part of the story, and the element most crucial for deciding whether Lincolnshire has breached the Data Protection Act, is how the infection got into their systems in the first place. Forget the eye-catching ransom demand and the terrifying challenge of the previously unseen virus; forget even the question of why the Council’s only option when attacked was blindness and pens & paper. How did it happen, you cry? How did these cunning cyber-ninjas drip their deadly poison despite all of Lincolnshire’s "appropriate security measures"?

Somebody opened an email. 

I don’t know how good Lincolnshire’s technical security is: however sceptical I might be, there may be good reasons why they could not mirror their systems or back them up in such a way that they could be restored more quickly. Nevertheless, everything that the Council has said or done since the incident, even if their claim that no data has been compromised is true (I don’t believe them, but OK), is irrelevant. The fundamental question is why their staff are capable of falling victim to the dumbest, most basic security attack known to humankind. I just hope they don’t get any emails about the frozen bank accounts of the late Dr Hastings Kamuzu Banda. The Lincolnshire incident was entirely, wholly preventable, and they have to explain both to the Information Commissioner and to the fine folk of Lincolnshire why they allowed this to happen.

I have said it a thousand times, and here I am saying it again. An incident is not a breach. In order to have complied, Lincolnshire’s "appropriate security measures" have to include regular training and reminders, specifically warning about threats like malware in emails. Managers have to check regularly how their staff are working and whether they are following the clear, widely disseminated procedures and policies that compliance requires. Audits would have to be in place, and the individual systems that Lincolnshire has had to switch off should have been assigned to named asset owners, responsible for actively assessing risks exactly like this one and for putting measures in place to keep them running even in the face of attacks.

If the person who opened the email has not been trained, reminded and appropriately supervised, this whole incident is Lincolnshire County Council’s fault and they should be taken to task for it. It doesn’t matter how sophisticated the software was, how unexpected Lincolnshire might be as a target: THEY LET THE BURGLARS IN. All the warm words about what happened after that, even if they’re all true, make no difference to this basic fact. You may say that an organisation can’t prevent human error, but that’s nonsense. Training, reminders, appropriate supervision and picking the right people in the first place massively reduce human error. Everything that happens afterwards is damage limitation: either Lincolnshire did what was required beforehand, or it’s a breach.

The Gamekeeper’s Fear of the Penalty

Amongst the hype over the end of negotiations on the new EU Data Protection Regulation, one theme kept emerging again and again: Big Penalties. It’s understandable that people might want to focus on it. The UK goes from a maximum possible penalty of £500,000 to one of just under £15,000,000 (at today’s Euro conversion rate), or even 4% of a private enterprise’s annual worldwide turnover. Only a fool would say that it wasn’t worth talking about. It’s much more interesting than the bit about Codes of Practice, and it’s easier to explain than the section about certification bodies.

It would be equally foolish to assume, however, that penalties on this scale will rain down from Wilmslow like thunderbolts from Zeus. At the same time as many were talking up the future, the Information Commissioner issued two monetary penalties under the current regime, one under Data Protection (£250 for the Bloomsbury Patient Network) and one under the Privacy and Electronic Communications Regulations (£30,000 for the Daily Telegraph). The £250 penalty is the lowest the ICO has ever issued for anything, while the PECR one is the lowest for a breach of the marketing rules, notwithstanding that the Daily Telegraph is probably the richest PECR target at which the ICO has taken aim.

You could argue that the embarrassment caused to the Telegraph carries an added sting (the ICO has never before taken enforcement action against a newspaper). It’s equally likely that the oligarchs who own the paper will consider £30,000 (£24,000 if they pay up in 35 days) to be a price worth paying if it had the desired effect on the outcome of a very close election. They’ll probably do it again.

In any case, the Bloomsbury Patient Network CMP is much worse. The Regulation calls for monetary penalties to be effective, proportionate and dissuasive, and yet everybody at the ICO thought that a £250 penalty, split between three people, was action worth taking and promoting. The Commissioner himself, Christopher Graham, told the DMA in March 2015 that the ICO was not a ‘traffic warden’, but if the Bloomsbury Three pay up on time (the usual 20% reduction for prompt payment takes the £250 down to £200), the £66.67 penalty they each face is no worse than a parking ticket you didn’t pay in the first fortnight.

The ICO’s press release claims that the penalty would have been much higher if the data controller had not been an ‘unincorporated association’, but this is irrelevant. The ICO issued a £440,000 PECR penalty against two individuals (Chris Niebel and Gary McNeish) in 2012, while the Claims Management Regulator recently issued a whopping £850,000 penalty against Zahier Hussain for cold calling and similar dodgy practices. The approach on PECR and marketing is positively steely. The problem clearly lies in Data Protection enforcement, and that is what the Regulation is concerned with.

The size and resources of the offending data controller are a secondary consideration; the test is whether the penalty would cause undue financial hardship. The ICO could bankrupt someone or kill their business if they deserved it. The Bloomsbury Patient Network’s handling of the most sensitive personal data was sloppy and incompetent, and had already led to breaches of confidentiality before the incident that gave rise to the penalty. Enforcement action at a serious level was clearly justified. Even if a penalty at a serious level deterred well-meaning amateurs from processing incredibly sensitive data at all, that would be a good thing. If you’re not capable of handling data about a person’s HIV status with an appropriate level of security, you have absolutely no business doing so at all, no matter how good your intentions are. Donate to the Terrence Higgins Trust by all means, but do not touch anyone’s data. If the ICO lacks the guts to issue a serious penalty, it would be better to do nothing at all and keep quiet, rather than display their gutlessness to the world.

Whoever made this decision cannot have considered what message it would send to organisations large and small who already think of Data Protection as pettifogging red tape, low on the agenda. Is there an organisation anywhere in the country that would consider the slim chance of being fined £66.67 to be a deterrent against anything? A fine is a punishment (it has to cause pain to those who pay it) and a lesson to others (it has to look painful to the wider world). The Bloomsbury Patient Network CMP is neither.

Despite the increased expectations raised by the GDPR, the ICO is actually losing its appetite for DP enforcement: 13 Data Protection CMPs in 2013, but only 6 in 2014 and 7 in 2015. Meanwhile, there have been 24 unenforceable DP undertakings in 2015 alone, including one against Google (you’re welcome to explain the point of that one) and another (Flybe) which revealed endemic procedural and training problems in the airline more significant than the moronic cock-ups that went on at the Bloomsbury Patient Network. Wilmslow is so inert that two different organisations have told me this year that ICO staff asked them to go through the motions of self-reporting incidents that the ICO already knew about, because the only way the enforcement wheels could possibly begin to turn was if an incident was self-reported. ICO staff actually knowing that something had happened wasn’t enough. It’s these same timid people who will be wielding the new powers in 2018.

Admittedly, there will be a new Commissioner, and it’s possible that the Government will pick a fearsome enforcement fiend to go after Data Protection like a dog in a sausage factory. You’ll forgive me if I don’t hold my breath. Nevertheless, something in Wilmslow has to change, because the General Data Protection Regulation represents a clear rebuke to the ICO’s DP enforcement approach.

Most obviously, in the long list of tasks in Article 52 that each Data Protection Authority must carry out, the first is very powerful: they must “monitor and enforce” (my emphasis) the application of the Regulation. Someone recently said that in certain circumstances, some organisations require a ‘regulatory nudge’, but the Regulation is much more emphatic than that. The ICO’s preference for hand-holding, nuzzling and persuading stakeholders (especially those where former ICO colleagues have gone to work) is a world away from an enforcement-led approach.

The huge increase in penalties throws down the gauntlet, especially when the ICO has rarely approached the current, comparatively low UK maximum. But the ICO should also pay close attention to the detail of Article 79 of the Regulation, where the new penalties are laid out. Of the 59 ICO monetary penalties, 57 have been for breaches of the 7th principle (security). The Regulation has two levels of penalty, the lower with a maximum of €10,000,000 (or 2% of annual turnover), and the higher with a maximum of €20,000,000 (or 4% of annual turnover). Breaches of Article 30, a very close analogue to Principle 7, are in the lower tier.
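In case the tiers seem abstract: as I read the Regulation, for an undertaking the cap is the fixed amount or the percentage of worldwide annual turnover, whichever is the higher. A minimal sketch, with an invented turnover figure purely for illustration:

# Minimal sketch of the two penalty tiers described above, as I read the
# Regulation; the example turnover is invented for illustration.

def penalty_cap(turnover_eur, higher_tier):
    """Maximum possible fine: fixed cap or percentage of turnover, whichever is higher."""
    if higher_tier:
        return max(20_000_000, 0.04 * turnover_eur)
    return max(10_000_000, 0.02 * turnover_eur)

# A hypothetical undertaking turning over €1 billion:
print(penalty_cap(1_000_000_000, higher_tier=False))  # €20m (2% of €1bn beats the €10m floor)
print(penalty_cap(1_000_000_000, higher_tier=True))   # €40m (4% of €1bn beats the €20m floor)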

Admittedly, the higher penalty applies to all of the principles in Article 5 (which in a somewhat circular fashion includes security), but it explicitly covers "conditions for consent", "data subject rights" and infringements involving transfers to third countries, areas untouched by the ICO’s DP penalty regime. The Regulation envisages monetary penalties at the higher level for processing without a condition, inaccuracy, poor retention and failures on subject access, as well as for new rights like the right to be forgotten and the right to object. The ICO has issued a solitary penalty on fairness, and just one on accuracy – it has never fined on subject access, despite that being the largest single cause of data subject complaints.

The Regulation bites hard on the use of consent and legitimate interest, and misuse of data when relying on them would again carry the higher penalty. Most organisations that rely on consent or legitimate interest are outside the public sector, which relies more on legal obligations and powers. Indeed, the Regulation even allows for the public sector to be excluded from monetary penalties altogether if a member state wishes it. Nevertheless, since the ICO got the power to issue them, only 24% of its civil monetary penalties have been served on organisations outside the public sector (2 on charities and 12 on the private sector).

I doubt the ICO is ready for what the Regulation demands, and what data subjects will naturally expect from such a deliberate attempt to shape the enforcement of privacy rights. The penalties are too low. The dwindling amount of DP enforcement is based almost exclusively on self-reported security breaches. While the Regulation might feed a few private sector cases onto the conveyor belt by way of mandatory reporting of security breaches, it will do nothing for the ICO’s ability to identify suitable cases for anything else. Few ICO CMPs spring from data subject complaints, and anyone who has ever tried to alert Wilmslow to an ongoing breach when they are not directly affected knows how painful a process that can be. The ICO has not enforced on most of the principles.

It’s been my habit whenever talking about the Regulation to people I’m working for to emphasise the period we’re about to enter. There are two years before the Regulation comes into force; two years to get ready, to look at practice and procedure, two years to tighten up. The need to adapt to the future goes double for the Information Commissioner’s Office. Instead of canoodling with stakeholders and issuing wishy-washy guidance, wringing its hands and promising to be an ‘enabler’, the ICO should take a long hard look in the mirror. Its job is to enforce the law; everything else is an optional extra. It’s wise to assume that the wish for total DP harmonisation will probably be a pipe dream; it’s equally obvious that the Regulation will allow for much easier comparisons between EU member states, and the ICO’s lightest of light touches will be found wanting.

Whoops!

Yesterday, after at least a year of pondering it, the Information Commissioner asked the Universities and Colleges Admissions Service (UCAS) to sign an undertaking, agreeing to change the way in which they obtain consent to use students’ data. The data is obtained as part of the application process and subsequently used for marketing a variety of products and services, and UCAS has agreed to change its approach. It’s important to note that this is an undertaking, so UCAS has not been ordered to do anything, nor are there any direct consequences if they fail to do what is stated in the undertaking. An undertaking is a voluntary exercise – it is not served, it does not order or require, it simply documents an agreement by a Data Controller to do something.

Aspects of the story concern me. The ICO’s head of enforcement is quoted as saying: "By failing to give these applicants a clear option to avoid marketing, they were being unfairly faced with the default option of having their details used for commercial purposes", but given that the marketing was sent by text and email, an opportunity to "avoid" marketing is not what should have been in place. If UCAS wanted to sell access to university and college applicants, they needed consent – which means opt-in, not opt-out. As the undertaking itself points out, consent is defined in the EU Data Protection Directive as freely given, and in my opinion an opt-out cannot constitute that. If you think that an opt-out does constitute consent, try transposing that thinking into any other situation where consent is required, and see how creepy your thinking has suddenly become. Consent should be a free choice, made actively. We should not have to stop commercial companies from texting and emailing us – the onus should be on them to make an attractive offer we want to take up, not on consumers to bat away their unwanted attentions.
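If the distinction still seems slippery, it comes down to a default. The sketch below is hypothetical – the field names are mine, and I have no idea how UCAS’s form is actually built – but it shows the difference: under opt-in, consent starts as "no" and only becomes "yes" because the applicant did something.

# Hypothetical illustration of opt-in consent capture; field names are
# invented, not a description of how UCAS's systems actually work.

from dataclasses import dataclass

@dataclass
class Applicant:
    email: str
    marketing_consent: bool = False   # opt-in: the default is always "no"

def record_consent(applicant, ticked_the_box):
    """Consent only exists where the applicant took a positive action."""
    applicant.marketing_consent = bool(ticked_the_box)

def may_send_marketing(applicant):
    return applicant.marketing_consent

a = Applicant("applicant@example.com")
record_consent(a, ticked_the_box=False)   # did nothing, so no marketing
print(may_send_marketing(a))              # False

# An opt-out model flips the default to True and relies on the applicant to
# find the box and untick it, which is precisely the problem.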

It’s entirely possible that the ICO’s position on consent is better expressed in the undertaking itself, but here we have a little problem. At least when it was published yesterday, half of the undertaking was missing. Only the odd-numbered pages were published, so presumably the person who scanned the document had a double-sided original and didn’t notice that they had scanned it single-sided. The published document also included one page of UCAS’ covering letter and the final signed page of the undertaking, which the ICO never normally publishes. This mistake reveals some interesting nuggets that we wouldn’t normally know, from the trivial (the Chief Executive of UCAS signed the undertaking with a fountain pen, something of which I wholeheartedly approve) to the potentially significant (the covering letter sets out when UCAS might divert resources away from complying with the undertaking).

But that’s not the point. The point is that the ICO uploaded the wrong document to the internet, and this is not the first time it has happened. I know this because on a previous occasion, I contacted the ICO to tell them that they had done it, and many people on my training courses have also seen un-redacted enforcement and FOI notices on the ICO website. The data revealed in the UCAS case is not sensitive (although I don’t know how the UCAS Chief would feel about her signature being published on the internet), but that’s not the point either. The ICO has spent the last ten years taking noisy, self-righteous action against a variety of mainly public bodies for security slip-ups, and the past five issuing monetary penalties for the same, including several following the accidental publication of personal data on the internet.

The issue here is simple: does the ICO’s accidental publication of this undertaking constitute a breach of the 7th Data Protection Principle? They know about the risk because they’ve done it before. Have they taken appropriate technical and organisational measures to prevent this from happening? Is there a clear process to ensure that the right documents are published? Are documents checked before they are uploaded? Does someone senior check whether the process is being followed? Is everyone involved in the process properly trained in the handling of personal data, and in the technology required to publish documents onto the web? And even if all of these measures are in place, is action taken when such incidents are identified? If the ICO can give positive answers to all these questions, then it is not a breach. Stuff happens. But if they have not, it is a breach.
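None of this is rocket science. A hypothetical sketch of the kind of gate those questions imply might look like the snippet below – I have no knowledge of the ICO’s actual publication process, and the checks and names are invented for illustration.

# Hypothetical pre-publication gate; the checks and parameter names are
# invented, not a description of the ICO's actual process.

def publication_problems(scanned_pages, expected_pages,
                         redactions_checked, second_reviewer):
    """List reasons a document is not ready to upload; empty means go ahead."""
    problems = []
    if scanned_pages != expected_pages:
        problems.append("page count wrong: single-sided scan of a double-sided original?")
    if not redactions_checked:
        problems.append("redactions not confirmed (signatures, covering letters)")
    if not second_reviewer:
        problems.append("no sign-off from someone other than the person who scanned it")
    return problems

print(publication_problems(scanned_pages=4, expected_pages=8,
                           redactions_checked=True, second_reviewer=None))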

There is no possibility, no matter how hilarious it would be, that the ICO will issue a CMP on itself following this incident, although it is technically possible. What should happen is simple: the ICO should quickly and effectively take steps to prevent this from happening again. However, if the Information Commissioner’s Office does not ask the Information Commissioner Christopher Graham to sign an undertaking, publicly stating what these measures will be, they cannot possibly speak and act with authority the next time they ask someone else to do the same. Whether they redact Mr Graham’s signature is entirely a matter for them.

UPDATE: without acknowledging their mistake, the Information Commissioner’s Office has now changed the undertaking to be the version they clearly intended to publish. One wonders if anything has been done internally, or if they are simply hoping that only smartarses like me noticed in the first place.