Lincolnshire poachers

Dark times in the Fens, as Lincolnshire County Council finds itself in the grip of diabolical cyber-blackmailers who demand £1,000,000 to release the local authority from a terrifying new strain of virus that has locked up all their files. As ever, it’s unwise to judge the outcome before all of the details are in, but Lincolnshire’s story has some interesting aspects. One element seems to go in Lincoln’s favour: this is “zero-day malware”, the first time that this particular infection has been detected. That obviously makes it harder to defend against, and in any case the Council is “confident it had appropriate security measures in place”.

The Council’s chief information officer Judith Hetherington-Smith reassured residents with the claim that they were “absolutely looking after their data and it hasn’t been compromised”. This implies that no personal data has been compromised, but it can’t be entirely squared with some of Hetherington-Smith’s other comments. For example: “Once we identified it we shut the network down, but some damage is always done before you get to that point – and some files have been locked by the software”. Right, so there’s some damage then? “A lot of the files will be available for us to restore from the back-up.” A lot of them, but not all of them? What about the ones that aren’t available?

That back-up is interesting, in the light of the fact that “People can only use pens and paper, we’ve gone back a few years.” An inherent part of information security is business continuity, ensuring that even if something falls over, the place can keep running. I’m running a course this week for people responsible for risk-managing big information systems, and the client has specifically asked me to emphasise the need for business continuity to be built in. The whole point of this is not to be knocked back to the pen and paper age – I heard a report on Radio 4 that Lincolnshire’s social workers had not had access to systems for several days, which means those charged with protecting the most vulnerable in Lincolnshire don’t have access to the information they need to do their job. If that doesn’t count as information being “compromised”, I don’t know what does. It’s a catastrophe. Rather than attempting to reassure (I’m amazed that no-one has said that they take Data Protection very seriously), the council needs to explain why they are offline for days without a back-up that allows essential systems to keep running.

But the most interesting part of the story, and the element that is most crucial for deciding whether Lincolnshire has breached the Data Protection Act, is how the infection got into their systems in the first place. Forget the eye-catching ransom demand, the terrifying challenge of the previously unseen virus, forget even the question of why the Council has no option when attacked other than blindness and pens & paper. How did it happen, you cry? How did these cunning cyber-ninjas drip their deadly poison despite all of Lincolnshire’s “appropriate security measures”?

Somebody opened an email. 

I don’t know how good Lincolnshire’s technical security is: however sceptical I might be, there may be good reasons why they could not mirror their systems or back them up in a way that allowed a quicker restore. Nevertheless, everything that the Council has said or done since the incident, even if their claim that no data has been compromised is true (I don’t believe them, but OK), is irrelevant. The fundamental question is why their staff are capable of falling victim to the dumbest, most basic security attack known to humankind. I just hope they don’t get any emails about the frozen bank accounts of the late Dr Hastings Kamuzu Banda. The Lincolnshire incident was entirely, wholly preventable, and they have to explain both to the Information Commissioner and to the fine folk of Lincolnshire why they allowed this to happen.

I have said it a thousand times, and here I am saying it again. An incident is not a breach. In order to have complied, Lincolnshire’s “appropriate security measures” have to include regular training and reminders, specifically warning about threats like malware in emails. Managers have to check regularly how their staff are working and whether they are following the clear, widely disseminated procedures and policies that would be necessary in order to comply. Audits would have to be in place, and the individual systems that Lincolnshire has had to switch off should have been assigned to named asset owners, who are responsible for actively assessing risks exactly like this one, and for putting measures in place to keep those systems running even in the face of attacks.

If the person who opened the email has not been trained, reminded and appropriately supervised, this whole incident is Lincolnshire County Council’s fault and they should be taken to task for it. It doesn’t matter how sophisticated the software was, how unexpected Lincolnshire might be as a target: THEY LET THE BURGLARS IN. All the warm words about what happened after that, even if they’re all true, make no difference to this basic fact. You may say that an organisation can’t prevent human error, but that’s nonsense. Training, reminders, appropriate supervision and picking the right people in the first place massively reduce human error. Everything that happens afterwards is damage limitation: either Lincolnshire did what was required beforehand, or it’s a breach.

The Gamekeeper’s Fear of the Penalty

Amongst the hype surrounding the end of negotiations on the new EU Data Protection Regulation, one theme kept emerging again and again: Big Penalties. It’s understandable that people might want to focus on it. The UK goes from a maximum possible penalty of £500,000 to one of just under £15,000,000 (at today’s Euro conversion rate), or even 4% of a private enterprise’s annual worldwide turnover. Only a fool would say that it wasn’t worth talking about. It’s much more interesting than the bit about Codes of Practice, and it’s easier to explain than the section about certification bodies.

It would be equally foolish to assume, however, that penalties on this scale will rain down from Wilmslow like thunderbolts from Zeus. At the same time as many were talking up the future, the Information Commissioner issued two monetary penalties under the current regime, one under Data Protection (£250 for the Bloomsbury Patient Network) and one under the Privacy and Electronic Communications Regulations (£30,000 for the Daily Telegraph). The £250 penalty is the lowest the ICO has ever issued for anything, while the PECR one is the lowest for a breach of the marketing rules, notwithstanding that the Daily Telegraph is probably the richest PECR target at which the ICO has taken aim.

You could argue that the embarrassment caused to the Telegraph carries an added sting (the ICO has never before taken enforcement action against a newspaper). It’s equally likely that the oligarchs who own the paper will consider £30,000 (£24,000 if they pay up in 35 days) to be a price worth paying if it had the desired effect on the outcome of a very close election. They’ll probably do it again.

In any case, the Bloomsbury Patient Network CMP is much worse. The Regulation calls for monetary penalties to be effective, proportionate and dissuasive, and yet everybody at the ICO thought that a £250 penalty, split between three people, was action worth taking and promoting. The Commissioner himself, Christopher Graham, told the DMA in March 2015 that the ICO was not a ‘traffic warden’, but if the Bloomsbury Three pay up on time, the £66.67 penalty they each face is no worse than a parking ticket you didn’t pay in the first fortnight.

The ICO’s press release claims that the penalty would have been much higher if the data controller had not been an ‘unincorporated association’, but this is irrelevant. The ICO issued a £440,000 PECR penalty against two individuals (Chris Niebel and Gary McNeish) in 2012, while the Claims Management Regulator recently issued a whopping £850,000 penalty against Zahier Hussain for cold calling and similar dodgy practices. The approach on PECR and marketing is positively steely. The problem clearly lies in Data Protection enforcement, and that is what the Regulation is concerned with.

The size and resources of the offending data controller are a secondary consideration; the test is whether the penalty will cause undue financial hardship. The ICO could bankrupt someone or kill their business if they deserved it. The Bloomsbury Patient Network’s handling of the most sensitive personal data was sloppy and incompetent, and had already led to breaches of confidentiality before the incident that gave rise to the penalty. Enforcement action at a serious level was clearly justified. Even if a penalty pitched at a properly dissuasive level deterred well-meaning amateurs from processing incredibly sensitive data, that would be a good thing. If you’re not capable of handling data about a person’s HIV status with an appropriate level of security, you have absolutely no business doing so at all, no matter how good your intentions are. Donate to the Terrence Higgins Trust by all means, but do not touch anyone’s data. If the ICO lacks the guts to issue a serious penalty, it would be better to do nothing at all and keep quiet, rather than display their gutlessness to the world.

Whoever made this decision cannot have considered what message it would send to organisations large and small who already think of Data Protection as pettifogging red tape, low on the agenda. Is there an organisation anywhere in the country that would consider the slim chance of being fined £66.67 to be a deterrent against anything? A fine is a punishment (it has to cause pain to those who pay it) and a lesson to others (it has to look painful to the wider world). The Bloomsbury Patient Network CMP is neither.

Despite the increased expectations raised by the GDPR, the ICO is actually losing its appetite for DP enforcement, with 13 Data Protection CMPs in 2013, but only 6 in 2014 and 7 in 2015. Meanwhile, there have been 24 unenforceable DP undertakings in 2015 alone, including one against Google whose point you are welcome to try to explain, and another (Flybe) which revealed endemic procedural and training problems in the airline that are more significant than the moronic cock-ups that went on at the Bloomsbury Patient Network. Wilmslow is so inert that two different organisations have told me this year that ICO staff asked them to go through the motions of self-reporting incidents that the ICO already knew about, because the only way the enforcement wheels could possibly begin to turn was if an incident was self-reported. ICO staff actually knowing that something had happened wasn’t enough. It’s these same timid people who will be wielding the new powers in 2018.

Admittedly, there will be a new Commissioner, and it’s possible that the Government will pick a fearsome enforcement fiend to go after Data Protection like a dog in a sausage factory. You’ll forgive me if I don’t hold my breath. Nevertheless, something in Wilmslow has to change, because the General Data Protection Regulation represents a clear rebuke to the ICO’s DP enforcement approach.

Most obviously, in the long list of tasks in Article 52 that each Data Protection Authority must carry out, the first is very powerful: they must “monitor and enforce” (my emphasis) the application of the Regulation. Someone recently said that in certain circumstances, some organisations require a ‘regulatory nudge’, but the Regulation is much more emphatic than that. The ICO’s preference for hand-holding, nuzzling and persuading stakeholders (especially those where former ICO colleagues have gone to work) is a world away from an enforcement-led approach.

The huge increase in penalties throws down the gauntlet, especially when the ICO has rarely approached the current, comparatively low UK maximum. But the ICO should also pay close attention to the detail of Article 79 of the Regulation, where the new penalties are laid out. Of the 59 ICO monetary penalties, 57 have been for breaches of the 7th principle (security). The Regulation has two levels of penalty, the lower with a maximum of €10,000,000 (or 2% of annual turnover), and the higher with a maximum of €20,000,000 (or 4% of annual turnover). Breaches of Article 30, a very close analogue to Principle 7, sit in the lower tier.
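To make the scale of the change concrete, here is a minimal illustrative sketch of how the two tiers work, assuming the cap operates as “the fixed amount or the turnover percentage, whichever is higher”; the turnover figure is entirely invented for the example.

```python
# Illustrative sketch only: the two-tier penalty caps described above.
# Assumption: the applicable maximum is the fixed amount or the percentage
# of worldwide annual turnover, whichever is higher.

def max_penalty(turnover_eur: float, higher_tier: bool) -> float:
    """Return the maximum possible fine in euros for a given turnover."""
    if higher_tier:
        fixed_cap, pct = 20_000_000, 0.04  # e.g. Article 5 principles, consent, data subject rights
    else:
        fixed_cap, pct = 10_000_000, 0.02  # e.g. Article 30 (security)
    return max(fixed_cap, turnover_eur * pct)

# A hypothetical firm with €2bn worldwide turnover:
turnover = 2_000_000_000
print(f"Lower tier cap:  €{max_penalty(turnover, higher_tier=False):,.0f}")  # €40,000,000
print(f"Higher tier cap: €{max_penalty(turnover, higher_tier=True):,.0f}")   # €80,000,000
# Compare either figure with the current UK maximum of £500,000.
```

Even the lower tier dwarfs anything the ICO has ever issued, which is rather the point.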

Admittedly, the higher penalty applies to all of the principles in Article 5 (which in a somewhat circular fashion includes security), but it explicitly covers “conditions for consent”, “data subject rights” and infringements involving transfers to third countries, areas untouched by the ICO’s DP penalty regime. The Regulation envisages monetary penalties at the higher level for processing without a condition, for inaccuracy, poor retention and subject access failures, as well as for new rights like the right to be forgotten and the right to object. The ICO has issued a solitary penalty on fairness, and just one on accuracy – it has never fined on subject access, despite that being the largest single cause of data subject complaints.

The Regulation bites hard on the use of consent and legitimate interest, and misuse of data when relying on them would again carry the higher penalty. Most organisations that rely on consent or legitimate interest are outside the public sector, which relies more on legal obligations and powers. Indeed, the Regulation even allows for the public sector to be excluded from monetary penalties altogether if a member state wishes it. Nevertheless, since they got the power to issue them, only 24% of the ICO’s civil monetary penalties have been served on organisations outside the public sector (2 against charities and 12 against the private sector).

I doubt the ICO is ready for what the Regulation demands, and what data subjects will naturally expect from such a deliberate attempt to shape the enforcement of privacy rights. The penalties are too low. The dwindling amount of DP enforcement is based almost exclusively on self-reported security breaches. While the Regulation might feed a few private sector cases onto the conveyor belt by way of mandatory reporting of security breaches, it will do nothing for the ICO’s ability to identify suitable cases for anything else. Few ICO CMPs spring from data subject complaints, and anyone who has ever tried to alert Wilmslow to an ongoing breach when they are not directly affected knows how painful a process that can be. The ICO has not enforced on most of the principles.

It’s been my habit, whenever talking about the Regulation to people I’m working for, to emphasise the period we’re about to enter. There are two years before the Regulation comes into force; two years to get ready, to look at practice and procedure, two years to tighten up. The need to adapt to the future goes double for the Information Commissioner’s Office. Instead of canoodling with stakeholders and issuing wishy-washy guidance, wringing its hands and promising to be an ‘enabler’, the ICO should take a long hard look in the mirror. Its job is to enforce the law; everything else is an optional extra. The wish for total DP harmonisation will probably turn out to be a pipe dream, but the Regulation will certainly allow for much easier comparisons between EU member states, and the ICO’s lightest of light touches will be found wanting.


Yesterday, after at least a year of pondering it, the Information Commissioner asked the Universities and Colleges Admissions Service (UCAS) to sign an undertaking, agreeing to change the way in which they obtain consent to use students’ data. The data is obtained as part of the application process and subsequently used for marketing a variety of products and services, and UCAS has agreed to change its approach. It’s important to note that this is an undertaking, so UCAS has not been ordered to do anything, nor are there any direct consequences if they fail to do what is stated in the undertaking. An undertaking is a voluntary exercise – it is not served, it does not order or require, it simply documents an agreement by a Data Controller to do something.

Aspects of the story concern me. The ICO’s head of enforcement is quoted as saying: “By failing to give these applicants a clear option to avoid marketing, they were being unfairly faced with the default option of having their details used for commercial purposes” but given that the marketing was sent by text and email, the opportunity to “avoid” marketing is not what should have been in place. If UCAS wanted to sell access to university and college applicants, they needed consent – which means opt-in, not opt-out. As the undertaking itself points out, consent is defined in the EU Data Protection Directive as freely given – an opt-out cannot constitute this in my opinion. If you think that an opt-out does constitute consent, try transposing that thinking into any other situation where consent is required, and see how creepy your thinking has suddenly become. Consent should be a free choice, made actively. We should not have to stop commercial companies from texting and emailing us – the onus should be on them to make an attractive offer we want to take up, not on consumers to bat away their unwanted attentions.

It’s entirely possible that the ICO’s position on consent is better expressed in the undertaking itself, but here we have a little problem. At least when it was published yesterday, half of the undertaking was missing. Only the odd-numbered pages were published, so presumably the person who scanned the document had a double-sided original and didn’t notice that they had scanned it single-sided. The published document also included one page of UCAS’ covering letter and the final signed page of the undertaking, which the ICO never normally publishes. This mistake reveals some interesting nuggets that we wouldn’t normally know, from the trivial (the Chief Executive of UCAS signed the undertaking with a fountain pen, something of which I wholeheartedly approve) to the potentially significant (the covering letter sets out when UCAS might divert resources away from complying with the undertaking).

But that’s not the point. The point is that the ICO uploaded the wrong document to the internet, and this is not the first time it has happened. I know this because on a previous occasion, I contacted the ICO to tell them that they had done it, and many people on my training courses have also seen un-redacted enforcement and FOI notices on the ICO website. The data revealed in the UCAS case is not sensitive (although I don’t know how the UCAS Chief would feel about her signature being published on the internet), but that’s not the point either. The ICO has spent the last ten years taking noisy, self-righteous action against a variety of mainly public bodies for security slip-ups, and the past five issuing monetary penalties for the same, including several following the accidental publication of personal data on the internet.

The issue here is simple: does the ICO’s accidental publication of this undertaking constitute a breach of the 7th Data Protection Principle? They know about the risk because they’ve done it before. Have they taken appropriate technical and organisational measures to prevent this from happening? Is there a clear process to ensure that the right documents are published? Are documents checked before they are uploaded? Does someone senior check whether the process is being followed? Is everyone involved in the process properly trained in the handling of personal data, and in the technology required to publish documents onto the web? And even if all of these measures are in place, is action taken when such incidents are identified? If the ICO can give positive answers to all these questions, then it is not a breach. Stuff happens. But if they cannot, it is a breach.

There is no possibility, no matter how hilarious it would be, that the ICO will issue a CMP on itself following this incident, even though it is technically possible. What should happen is that the ICO quickly and effectively takes steps to prevent this from happening again. However, if the Information Commissioner’s Office does not ask the Information Commissioner Christopher Graham to sign an undertaking, publicly stating what these measures will be, they cannot possibly speak and act with authority the next time they ask someone else to do the same. Whether they redact Mr Graham’s signature is entirely a matter for them.

UPDATE: without acknowledging their mistake, the Information Commissioner’s Office has now changed the undertaking to be the version they clearly intended to publish. One wonders if anything has been done internally, or if they are simply hoping that only smartarses like me noticed in the first place.

Tales from the Crypt

If you don’t work in local government, you may never have encountered the Local Government Ombudsman, an organisation devoted to giving nutcases somewhere to grind their axes – sorry, to investigating possible maladministration in councils. The scope of the LGO’s work includes everything that councils do, but inevitably many complaints are about the most sensitive areas: child protection, looked after children, adoption, and adult social care. In dealing with complaints from the public, the LGO gets access to genuinely and (in Data Protection terms) legally sensitive information. Inevitably, given that councils have been the target of more ICO civil monetary penalties than any other sector, largely because councils are dumb enough to keep dobbing themselves in to Wilmslow, many are keen to use the most secure way of sending this confidential data to the Ombudsman.

It may seem odd, therefore, that the LGO sent an email to councils last month, containing the following message:

Encrypt or not to encrypt – that is the question …..

We’ve had a number of issues accessing encrypted emails which have been sent to us by councils. Whilst we appreciate that your information security policy may dictate how you send information to us, if there is any discretion please only send encrypted emails when it’s absolutely necessary.

Someone mentioned the gist of it to me, but I made an FOI request to the LGO to be certain that they really were sending out such a daft message. The LGO’s Information and Records Manager rather sweetly explained in their response to me that “our intention in sending this request was discourage councils encrypting emails that contain no sensitive personal or confidential data. Of course, if councils are sending sensitive personal data we would expect them to encrypt it – as we would do ourselves”. This is a useful piece of context for someone asking for the information under the auspices of FOI. However, this isn’t what they said to the numerous council link officers who received the email, and who were expected to act upon its contents. It’s almost the opposite.

Encrypting devices within an organisation is an easier proposition, as all the devices and connecting software are already part of the same system. The problem with encrypting email is undoubtedly that it involves different systems and protocols butting heads in the attempt to make a connection. The LGO pointed out to me that their case management system contains its own email system, which can make receipt of an encrypted email difficult. But this is the LGO’s problem and nobody else’s. Councils have no choice about whether to supply data – one of the ‘key facts’ about the LGO on their website is that “We have the same powers as the High Court to obtain information and documents”. Given the ICO’s historic fondness for fining the sector for data security lapses, if councils opt for encryption by default, they should be applauded, especially by the organisation set up to investigate their conduct.
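For what it’s worth, the encryption step itself is the easy part – the pain the LGO describes comes from two organisations’ systems having to exchange keys and open each other’s messages. Here is a purely illustrative sketch (not anything the LGO or any council actually uses) of encrypting an attachment with Python’s cryptography library before it goes anywhere near an email; the filename is hypothetical.

```python
# Purely illustrative: symmetric encryption of a file before emailing it.
# Assumes the 'cryptography' package is installed. The key exchange (how
# the recipient obtains the key securely) is exactly the cross-organisation
# problem the LGO is complaining about.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # must reach the recipient by a separate, secure route
fernet = Fernet(key)

with open("case_file.pdf", "rb") as f:        # hypothetical attachment
    ciphertext = fernet.encrypt(f.read())

with open("case_file.pdf.enc", "wb") as f:
    f.write(ciphertext)

# The recipient reverses the process with the same key:
# Fernet(key).decrypt(open("case_file.pdf.enc", "rb").read())
```

The sticking point is never the maths; it is agreeing a method and a key with every sender, which is precisely the work the LGO should be doing rather than asking councils to stop.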

This will inevitably pose problems for the LGO internally, but the solution is not to encourage councils to reverse sensible changes in behaviour that another regulator has been pushing them towards. They are a regulator whose job it is to deal with a diverse and multilayered sector with widely disparate cultures and practices, and they have to be capable of swallowing the inconvenient implications of this. However difficult it might be to cope with, especially without the clarification provided to me in my FOI response (and, as far as I know, to no-one else), the LGO’s current advice is damaging and unsafe. Councils should ignore it, and the LGO should withdraw it.

Crazy Naked Girls

There’s little to like about the voyeuristic coverage of the theft of images of famous women. Whether it is the feverish frottage of the mainstream press (which largely boils down to LOOK AT ‘EM ALL, IMAGINE ‘EM ALL NAKED, NNNNNNNNGGGGGG!!!!!) or the inevitably crass victim blaming (thank you, Ricky Gervais, for The Office and for absolutely nothing else), it’s all depressing.

The data protection strand in all this hasn’t been much better. Mobile devices are not a safe place to store sensitive data (true). The cloud is – in Graham Cluley’s immaculate phrase – just someone else’s computer (true). But too many security commentators have, perhaps unwittingly, aligned themselves with a ‘They asked for it’ line of thinking. A popular analogy is the one about burglary or car theft (this is an example from 2011). Apparently, you can’t complain if you leave your valuables on the front seat of your car and somebody steals them, and the same goes for pictures of your bits and the internet. In other words, the thinking is more or less that if Jennifer Lawrence is silly enough to take pictures of herself naked, she was basically asking for them to be stolen. For me, this is too close to the mentality that blames rape victims for being drunk, rather than rapists for being rapists. Friends, I blame the rapists.

Taking pictures of oneself is normal for most people, not just actresses – I am odd because I don’t do it, but if I was good looking, I probably would, all the time. It must be great to be extraordinary, and to enjoy being extraordinary. It’s too easy to be holier-than-thou and say that the violated only have themselves to blame. The victims made these images for themselves or they made them for someone else specific. They did not make the images for the media, or for the voyeurs who stole, sold or search for them. Anyone who handles or seeks them out violates the subject’s privacy, is a criminal and should be treated as such. The victims did nothing remotely scandalous or reprehensible – indeed, they did nothing that is anyone else’s business but their own. They probably didn’t do a privacy impact assessment before taking the pics, but that’s because they’re human beings and not data controllers.

The car analogy doesn’t work because mobile phones and the internet are not immediately understandable physical objects and spaces. When you leave your laptop on the passenger seat of your car, you can turn around and see the laptop sitting there. The risk is apparent and obvious. There’s a striking moment in Luc Besson’s current film ‘Lucy’ where Scarlett Johansson can see data streams soaring out of mobile phones across Paris, and navigates her way through them. We don’t see data like this. Few understand how the internet actually works (I’ve met a lot of people who think cloud storage means that data is floating in the air like a gas). We don’t see the data flowing or spot the footprint it leaves behind. We don’t know where the data ends up, and the companies we use don’t tell us. We use unhelpful misnomers like ‘the cloud’ when we mean ‘server in a foreign land’. Many people don’t know how their phones work, where their data is stored, how it is copied or protected, or who can get access to it. This should be the problem that the photo hack alerts us to.

It’s possible that some people would change the way they used technology if they fully understood how it works, but that should be their choice, based on clear information provided by the manufacturers. At least one of those affected has confirmed that the images of her are quite old, so we can’t even judge the situation on what we know now. If taking the pics was a mistake (and I don’t think I’m entitled to say it was), it was a mistake made possibly years ago.

I don’t think people understand where their data is or how it is stored. Rather than wagging our fingers at the victims of a sex crime, anyone involved in data protection and security should concentrate on educating the world about the risks. I think the big tech companies like Google, Apple and Facebook would be uncomfortable with this idea, which is why security and sharing are presented as such tedious, impenetrable topics. They don’t want more informed use of their services, they just want the data like everyone else. The defaults for sharing and online storage, for location and tracking, for a whole variety of privacy invasive settings should be set to OFF. Activities involving risk should be a conscious choice, not an accidental side effect of living in the 21st century.