Categories
Christopher Graham

Whoops!


Yesterday, after at least a year of pondering it, the Information Commissioner asked the Universities and Colleges Admissions Service (UCAS) to sign an undertaking, agreeing to change the way in which they obtain consent to use students’ data. The data is obtained as part of the application process and subsequently used for marketing a variety of products and services, and UCAS has agreed to change its approach. It’s important to note that this is an undertaking, so UCAS has not been ordered to do anything, nor are there any direct consequences if they fail to do what is stated in the undertaking. An undertaking is a voluntary exercise – it is not served, it does not order or require, it simply documents an agreement by a Data Controller to do something.

Aspects of the story concern me. The ICO’s head of enforcement is quoted as saying: “By failing to give these applicants a clear option to avoid marketing, they were being unfairly faced with the default option of having their details used for commercial purposes” but given that the marketing was sent by text and email, the opportunity to “avoid” marketing is not what should have been in place. If UCAS wanted to sell access to university and college applicants, they needed consent – which means opt-in, not opt-out. As the undertaking itself points out, consent is defined in the EU Data Protection Directive as freely given – an opt-out cannot constitute this in my opinion. If you think that an opt-out does constitute consent, try transposing that thinking into any other situation where consent is required, and see how creepy your thinking has suddenly become. Consent should be a free choice, made actively. We should not have to stop commercial companies from texting and emailing us – the onus should be on them to make an attractive offer we want to take up, not on consumers to bat away their unwanted attentions.

It’s entirely possible that the ICO’s position on consent is better expressed in the undertaking itself, but here we have a little problem. At least when it was published yesterday, half of the undertaking was missing. Only the odd-numbered pages were published, so presumably the person who scanned the document had a double-sided original and didn’t notice that they had scanned it single-sided. The published document also included one page of UCAS’ covering letter and the final signed page of the undertaking, which the ICO never normally publishes. This mistake reveals some interesting nuggets that we wouldn’t normally know, from the trivial (the Chief Executive of UCAS signed the undertaking with a fountain pen, something of which I wholeheartedly approve) to the potentially significant (the covering letter sets out when UCAS might divert resources away from complying with the undertaking).

But that’s not the point. The point is that the ICO uploaded the wrong document to the internet, and this is not the first time it has happened. I know this because on a previous occasion, I contacted the ICO to tell them that they had done it, and many people on my training courses have also seen un-redacted enforcement and FOI notices on the ICO website. The data revealed in the UCAS case is not sensitive (although I don’t know how the UCAS Chief would feel about her signature being published on the internet), but that’s not the point either. The ICO has spent the last ten years taking noisy, self-righteous action against a variety of mainly public bodies for security slip-ups, and the past five issuing monetary penalties for the same, including several following the accidental publication of personal data on the internet.

The issue here is simple: does the ICO’s accidental publication of this undertaking constitute a breach of the 7th Data Protection Principle? They know about the risk because they’ve done it before. Have they taken appropriate technical and organisational measures to prevent this from happening? Is there a clear process to ensure that the right documents are published? Are documents checked before they are uploaded? Does someone senior check whether the process is being followed? Is everyone involved in the process properly trained in the handling of personal data, and in the technology required to publish documents onto the web? And even if all of these measures are in place, is action taken when such incidents are identified? If the ICO can give positive answers to all these questions, then it is not a breach. Stuff happens. But if they have not, it is a breach.

There is no possibility, no matter how hilarious it would be, that the ICO will issue a CMP on itself following this incident, although it is technically possible. What should happen is that the ICO should quickly and effectively take steps to prevent this from happening again. However, if the Information Commissioner’s Office does not ask the Information Commissioner Christopher Graham to sign an undertaking, publicly stating what these measures will be, they cannot possibly speak and act with authority the next time they ask someone else to do the same. Whether they redact Mr Graham’s signature is entirely a matter for them.

UPDATE: without acknowledging their mistake, the Information Commissioner’s Office has now changed the undertaking to be the version they clearly intended to publish. One wonders if anything has been done internally, or if they are simply hoping that only smartarses like me noticed in the first place.


The Gamekeeper’s Fear of the Penalty


Amongst the hype surrounding the end of negotiations over the new EU Data Protection Regulation, one theme kept emerging again and again: Big Penalties. It’s understandable that people might want to focus on it. The UK goes from a maximum possible penalty of £500,000 to one of just under £15,000,000 (at today’s Euro conversion rate) or even 4% of a private enterprise’s annual worldwide turnover. Only a fool would say that it wasn’t worth talking about. It’s much more interesting than the bit about Codes of Practice, and it’s easier to explain than the section about certification bodies.

It would be equally foolish to assume, however, that penalties on this scale will rain down from Wilmslow like thunderbolts from Zeus. At the same time as many were talking up the future, the Information Commissioner issued two monetary penalties under the current regime, one under Data Protection (£250 for the Bloomsbury Patient Network) and one under the Privacy and Electronic Communications Regulations (£30,000 for the Daily Telegraph). The £250 penalty is the lowest the ICO has ever issued for anything, while the PECR one is the lowest for a breach of the marketing rules, notwithstanding that the Daily Telegraph is probably the richest PECR target at which the ICO has taken aim.

You could argue that the embarrassment caused to the Telegraph carries an added sting (the ICO has never before taken enforcement action against a newspaper). It’s equally likely that the oligarchs who own the paper will consider £30,000 (£24,000 if they pay up in 35 days) to be a price worth paying if it had the desired effect on the outcome of a very close election. They’ll probably do it again.

In any case, the Bloomsbury Patient Network CMP is much worse. The Regulation calls for monetary penalties to be effective, proportionate and dissuasive, and yet everybody at the ICO thought that a £250 penalty, split between three people, was action worth taking and promoting. The Commissioner himself, Christopher Graham, told the DMA in March 2015 that the ICO was not a ‘traffic warden’, but if the Bloomsbury Three pay up on time, the £66.67 penalty they each face is no worse than a parking ticket you didn’t pay in the first fortnight.

The ICO’s press release claims that the penalty would have been much higher if the data controller had not been an ‘unincorporated association’, but this is irrelevant. The ICO issued a £440,000 PECR penalty against two individuals (Chris Niebel and Gary McNeish) in 2012, while the Claims Management Regulator recently issued a whopping £850,000 penalty against Zahier Hussain for cold calling and similar dodgy practices. The approach on PECR and marketing is positively steely. The problem clearly lies in Data Protection enforcement, and that is what the Regulation is concerned with.

The size and resources of the offending data controller are a secondary consideration; the test is whether the penalty will cause undue financial hardship. The ICO could bankrupt someone or kill their business if they deserved it. The Bloomsbury Patient Network’s handling of the most sensitive personal data was sloppy and incompetent, and had already led to breaches of confidentiality before the incident that gave rise to the penalty. Enforcement action at a serious level was clearly justified. Even if the level of the penalty was high enough to deter well-meaning amateurs from processing incredibly sensitive data, this would be a good thing. If you’re not capable of handling data about a person’s HIV status with an appropriate level of security, you have absolutely no business doing so at all, no matter how good your intentions are. Donate to the Terrence Higgins Trust by all means, but do not touch anyone’s data. If the ICO lacks the guts to issue a serious penalty, it would be better to do nothing at all and keep quiet, rather than display their gutlessness to the world.

Whoever made this decision cannot have considered what message it would send to organisations large and small who already think of Data Protection as pettifogging red tape, low on the agenda. Is there an organisation anywhere in the country that would consider the slim chance of being fined £66.67 to be a deterrent against anything? A fine is a punishment (it has to cause pain to those who pay it) and it is a lesson to others (it has to look painful to the wider world). The Bloomsbury Patient Network CMP is neither.

Despite the increased expectations raised by the GDPR, the ICO is actually losing its appetite for DP enforcement, with 13 Data Protection CMPs in 2013, but only 6 in 2014 and 7 in 2015. Meanwhile, there have been 24 unenforceable DP undertakings in 2015 alone, including one against Google which you’re welcome to explain the point of, and another (Flybe) which revealed endemic procedural and training problems in the airline which are more significant than the moronic cock-ups that went on at the Bloomsbury Patient Network. Wilmslow is so inert that two different organisations have told me this year that ICO staff asked them to go through the motions of self-reporting incidents that ICO already knew about, because the only way the enforcement wheels could possibly begin to turn was if an incident was self-reported. ICO staff actually knowing that something had happened wasn’t enough. It’s these same timid people who will be wielding the new powers in 2018.

Admittedly, there will be a new Commissioner, and it’s possible that the Government will pick a fearsome enforcement fiend to go after Data Protection like a dog in a sausage factory. You’ll forgive me if I don’t hold my breath. Nevertheless, something in Wilmslow has to change, because the General Data Protection Regulation represents a clear rebuke to the ICO’s DP enforcement approach.

Most obviously, in the long list of tasks in Article 52 that each Data Protection Authority must carry out, the first is very powerful: they must “monitor and enforce” (my emphasis) the application of the Regulation. Someone recently said that in certain circumstances, some organisations require a ‘regulatory nudge’, but the Regulation is much more emphatic than that. The ICO’s preference for hand-holding, nuzzling and persuading stakeholders (especially those where former ICO colleagues have gone to work) is a world away from an enforcement-led approach.

The huge increase of penalties throws down the gauntlet, especially when the ICO has rarely approached the current, comparatively low UK maximum. But the ICO should also pay close attention to the detail of Article 79 of the Regulation, where the new penalties are laid out. Of the 59 ICO monetary penalties, 57 have been for breaches of the 7th principle (security). The Regulation has two levels of penalty, the lower with a maximum of €10,000,000 (or 2% of annual turnover), and the higher with a maximum of €20,000,000 (or 4% of annual turnover). Breaches of Article 30, a very close analogue to Principle 7, are in the lower tier.
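The two-tier structure reduces to simple arithmetic. A minimal sketch of how the cap at each tier works, assuming the “whichever is higher” reading of the fixed-sum-versus-turnover test (the function name and example turnover figures are my own illustration, not anything in the Regulation’s text):

```python
def penalty_cap(turnover_eur: float, higher_tier: bool) -> float:
    """Maximum fine available at a given tier: the greater of the
    fixed ceiling and the percentage of annual worldwide turnover."""
    fixed, pct = (20_000_000, 0.04) if higher_tier else (10_000_000, 0.02)
    return max(fixed, pct * turnover_eur)

# A small enterprise with EUR 5m turnover: the fixed ceiling dominates,
# so a lower-tier (security-style) breach is capped at EUR 10m.
small_cap = penalty_cap(5_000_000, higher_tier=False)

# A EUR 1bn-turnover enterprise facing a higher-tier breach: 4% of
# turnover (EUR 40m) exceeds the EUR 20m fixed ceiling and becomes the cap.
large_cap = penalty_cap(1_000_000_000, higher_tier=True)
```

The point the arithmetic makes is that the percentage test only bites on very large enterprises; for everyone else, the fixed ceilings of €10m and €20m set the scale.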

Admittedly, the higher penalty applies to all of the principles in Article 5 (which in a somewhat circular fashion includes security), but it explicitly covers “conditions for consent”, “data subject rights” and infringements involving transfers to third countries, areas untouched by the ICO’s DP penalty regime. The Regulation envisages monetary penalties at the higher level for processing without a condition, inaccuracy, poor retention and subject access, as well as new rights like the right to be forgotten or the right to object. The ICO has issued a solitary penalty on fairness, and just one on accuracy – it has never fined on subject access, despite that being the largest single cause of data subject complaints.

The Regulation bites hard on the use of consent and legitimate interest, and misuse of data when relying on them would again carry the higher penalty. Most organisations that rely on consent or legitimate interest are outside the public sector, who rely more on legal obligations and powers. Indeed, the Regulation even allows for the public sector to be excluded from monetary penalties altogether if a member state wishes it. Nevertheless, since they got the power to issue them, only 24% of the ICO’s civil monetary penalties have been served on organisations outside the public sector (2 for charities and 12 for private sector).

I doubt the ICO is ready for what the Regulation demands, and what data subjects will naturally expect from such a deliberate attempt to shape the enforcement of privacy rights. The penalties are too low. The dwindling amount of DP enforcement is based almost exclusively on self-reported security breaches. While the Regulation might feed a few private sector cases onto the conveyor belt by way of mandatory reporting of security breaches, it will do nothing for the ICO’s ability to identify suitable cases for anything else. Few ICO CMPs spring from data subject complaints, and anyone who has ever tried to alert Wilmslow to an ongoing breach when they are not directly affected knows how painful a process that can be. The ICO has not enforced on most of the principles.

It’s been my habit whenever talking about the Regulation to people I’m working for to emphasise the period we’re about to enter. There are two years before the Regulation comes into force; two years to get ready, to look at practice and procedure, two years to tighten up. The need to adapt to the future goes double for the Information Commissioner’s Office. Instead of canoodling with stakeholders and issuing wishy-washy guidance, wringing its hands and promising to be an ‘enabler’, the ICO should take a long hard look in the mirror. Its job is to enforce the law; everything else is an optional extra. It’s wise to assume that the wish for total DP harmonisation will probably be a pipe dream; it’s equally obvious that the Regulation will allow for much easier comparisons between EU member states, and the ICO’s lightest of light touches will be found wanting.


A bridge too far


June is a significant time for Data Protection in the UK. At the end of the month, we have the EU vote (where a vote to leave will throw at least the timetable for implementation of the new General Data Protection Regulation into disarray) and Christopher Graham steps down as Information Commissioner, to be replaced by Elizabeth Denham. There are several reasons to be optimistic about Denham’s appointment – she is the first Information Commissioner to have previous experience of privacy and FOI work, she has already taken on big corporate interests in Canada, and she isn’t Richard Thomas.

However, Denham inherits a series of headaches as she begins her reign as Elizabeth II, and it’s difficult to know which of them will be the hardest to shake off. There is the GDPR implementation, which would be a challenge even without the uncertainty that Brexit will create. She also has to tackle the ICO’s lack of independence from Government, which results in scandalous outcomes like the admission in an FOI response that Wilmslow takes orders from its sponsor department (see answer 3 here). But perhaps biggest of all is the ICO’s approach to enforcement.

On FOI, the ICO doesn’t approach enforcement – it does pointless monitoring and audits without any evidence of success, and the major government departments use the ICO as their internal review, sometimes not bothering to answer requests unless ordered to do so by an ICO case officer. The sole enforcement notice in the past five years wasn’t even promoted by the office because the now departed Deputy Commissioner Graham Smith didn’t want to draw attention to the failure to tackle Whitehall’s FOI abuses.

On Data Protection, the approach is to enforce against self-reported security breaches. There is nothing wrong with lots of enforcement on security – it’s a significant requirement of the legislation and many people are concerned about it. The problem is that Wilmslow doesn’t enforce on anything else, despite breaches of the other principles being widespread and obvious. Unless I missed one, the ICO has issued 61 Data Protection monetary penalties since getting the power to do so. Two have been for non-security breaches: Pharmacy 2U (1st principle data sharing without consent) and Prudential Insurance (accuracy). The overwhelming majority of enforcement notices (and undertakings, if you count them, which you shouldn’t) are on security matters. This is despite the fact that the UK has a massive culture of unlawful data sharing, over-retention, flouted subject access and perhaps most obvious, rampant, damaging inaccuracy. The ICO does nothing about it.

A classic example is a story reported in the Observer about the Dartford Crossing between Kent and Essex. Automatic Number Plate Recognition is used by Highways England to issue penalty charges to drivers who use the crossings without paying by phone or web within a fixed period of time. The only problem is that drivers who have never used the crossing are getting the penalties, but it is more or less inconceivable that the ICO will take action.

Having used the crossing myself, I can confirm that there are some Data Protection issues with the signage around the bridge / tunnel – the Observer article explains well how the signs can easily be confused with those for the London congestion charge, which works entirely differently. This is, in itself, a potential data protection breach, as personal data needs to be obtained fairly, especially when the data being obtained (the licence plate) will be used not only to levy a charge, but also to pursue court action for non-payment.

One person is quoted in the article as having been charged because the system misread a ‘C’ as a ‘G’. The Observer also reports that hire car users sometimes find penalties aimed at the wrong person because Highways England don’t specify the date to which the charge applies. In another case, the person receiving the charge had sold the car in question, and had a letter from DVLA to prove it. As with most of these situations, terrible customer service and inflexible processes mean that even when a charge is applied to the wrong person, nobody in the food chain has the authority or the inclination to sort things out. Both of the individuals cited in detail by the Observer were headed for the bailiffs until the Observer got involved, and all action was terminated. Research by Auto Express notes that only 1 in 25 people appeal their penalty, but 80% of those that do are successful.

Every time Highways England / Dart Charge issues a penalty against the wrong person, it is a breach of the fourth Data Protection principle, which states that “Personal data shall be accurate and, where necessary, kept up to date”. Note the lack of any qualification or context here – data is accurate, or it’s a breach. Clearly, this means that most organisations breach DP every minute of every day simply because of typos, but even adopting a flexible approach, there can be no doubt that demanding money and threatening court action is a situation where the Data Controller must be certain that the data is accurate, and if they get the wrong person, it’s a breach. The security principle talks about “appropriate measures” to prevent incidents, but the fourth principle doesn’t: it’s absolute.

Highways England / Dart Charge have breached the DPA, but would it be possible for the ICO to take action? In order to issue a monetary penalty, the ICO has to meet a series of tests.

1. The breach is serious

Dart Charge are pursuing people for debts they don’t owe. It’s serious.

2. The breach is deliberate

This one is potentially tricky, as we would need evidence that Highways England know that they are operating on the basis of inaccurate information in order for the breach to be deliberate. I can’t prove that Highways England are deliberately pursuing people, knowing that they are the wrong targets, although one of the Observer readers quoted gives clear evidence that they might be: “I spent 20 minutes trying to get through to someone who kept telling me I had to pay, even though he could see the problem”. However, we don’t need deliberate if we have:

3. The Data Controller knew or ought to have known about the risk and failed to take steps to prevent it

This test is clearly met – Highways England know that most of their penalty charges are overturned on appeal, they know that their system misreads licence plate characters, that it fails to properly distinguish dates, and they know that people contact them multiple times with evidence that the charge is wrong, but they ignore this evidence until they are embarrassed into action by a national newspaper. The breaches are still happening.

4. The breach is likely to cause damage or distress

Innocent individuals who have not used the Dartford Crossing are being pursued and threatened with legal action if they do not pay money that they do not owe. The breach is causing damage and distress, and is highly likely to continue doing so.

The ICO does not enforce on accuracy and they won’t touch this case. If I tried to report it to them, they would ignore my complaint because I have not been affected (if an affected person complained, they would do an unenforceable assessment). They do not ask Data Controllers to report incidents of damaging inaccuracy, and they do not even advocate investigating incidents of inaccuracy in the way that they do for security. This despite the fact that inaccuracy leads to the wrong medical treatment being given, innocent people’s houses being raided by the police, and old men nearly drowning in canals. The ICO took no enforcement action in any of these cases, despite them being in the public domain. I have dozens of others. Meanwhile, the Commissioner chunters on about a series of accidents and mishaps without any direct evidence of harm (ironically, even the pace of security enforcement has slowed, with only three DP monetary penalties at all so far this year).

Whatever Ms Denham’s priorities might be, she cannot ignore this. The ICO has shirked its responsibilities on the other principles for too long. A quick glance at the articles relevant to enforcement show that the GDPR is specifically designed to give breaches of the principles the higher maximum penalty. It’s a riposte to the ICO’s enforcement priorities since the HMRC lost discs incident in 2007, and it’s a bridge that the new Commissioner must be willing to cross.