Less than ideal

Last week, Stephen Lee, an academic and former fundraiser, was reported to have attacked the Information Commissioner’s Office at a fundraising conference over their interpretation of direct marketing. It was, he said, “outrageous” that the Commissioner’s direct marketing guidance stated that any advertising or marketing material that promoted the aims and ideals of a not-for-profit organisation was covered by Data Protection. According to Lee, only fundraising activities should be considered to be marketing.

[NB: Third Sector articles are sometimes open to all and sometimes limited to subscribers. If the links don’t work, please accept my apologies!]

He is quoted as saying “Who says that’s right? Just the ICO. Who did it consult? No one.”, and went on to ask “Why and how and in what way should we be compelled to comply with that proposition?”

Who says that’s right? Who did the ICO consult? Well, let me see now.

1) The Council of Europe

In 1985, the Council of Europe issued a Recommendation on the protection of personal data used for the purposes of direct marketing. The definition of direct marketing includes both the offer of goods or services and “any other messages” to a segment of the population. The recommendation predates the guidance Mr Lee disparages by more than 30 years.

2) The 1995 Data Protection Directive

The Directive makes clear that the direct marketing rules apply to charitable organisations and political parties just as they do to commercial organisations, and emphasises the need for people to be able to opt out of direct marketing. Redrawing the definition as Mr Lee proposes would cut across that fundamental right.

3) The Data Protection Act 1998

Given that Mr Lee feels qualified to make bold statements about the interpretation of the Data Protection Act, it’s odd that he doesn’t seem to have taken the time to read it. Section 11 of the Act defines direct marketing as “the communication (by whatever means) of any advertising or marketing material which is directed at particular individuals”. The important word there is “any” – organisations do not get to pick and choose which of their promotional messages are covered and which are not.

4) The Privacy and Electronic Communications Regulations 2003

PECR sets out the rules on consent for electronic direct marketing (consent for automated calls, opt-out and TPS for live calls, consent for emails and texts). It does not define direct marketing, but instead says this: “Expressions used in these Regulations that are not defined in paragraph (1) and are defined in the Data Protection Act 1998 shall have the same meaning as in that Act”. Therefore, the DPA definition applies to PECR.

5) The Information Tribunal (now the First Tier Tribunal)

In 2005, the Information Commissioner served an Enforcement Notice on the Scottish National Party after they repeatedly and unrepentantly used automated calls featuring Sean Connery to promote the party in the General Election. The SNP appealed, and in 2006, the Information Tribunal considered the issue. One of the main elements of the SNP appeal was against the ICO’s definition of direct marketing. Although the case is about a political party, the ICO’s submissions are based on the proposition that charities as well as political parties are covered by the definition of direct marketing, and that the definition cannot be restricted to fundraising alone. The Tribunal accepted the ICO’s view in full, and dismissed the appeal.

6) The charity sector and anyone else who wanted to be consulted

The ICO may have issued guidance in the 1980s or 1990s on the definition of direct marketing, but the idea that promoting aims and ideals is part of it has been their view since 1999. In guidance issued on the precursor to PECR, the ICO stated clearly that direct marketing refers “not just to the offer for sale of goods or services, but also the promotion of an organisation’s aims and ideals”. They specifically mentioned charities, as they have ever since. Virtually every iteration of the ICO’s guidance on PECR and direct marketing has been subject to public consultation – indeed, the very guidance Lee is talking about was itself the subject of a public consultation.

Here’s the problem. Lee is an Honorary Fellow of the Institute of Fundraising, and has a long association with it. The IoF has been the most consistently pernicious influence on the charity sector’s compliance with data protection and privacy law in the past ten years. Their guidance and public utterances on data protection are often misleading, and they recently had to change their own Code of Practice because it was legally incorrect. At best, they haven’t noticed the ICO position on charities and direct marketing for more than 15 years. At worst, they deliberately ignored it in favour of an interpretation that largely suits fundraisers. Lee complained at the conference about the “appalling” communication between the ICO and charity umbrella bodies, but Richard Marbrow of the ICO summed the problem up all too well:

“One of the things the sector asked for was clarity, and I will try and bring you that. The trouble is, if you then say ‘we don’t like that clarity, could we have some different clarity please?’, we’re not going to get on very well.”

The most important thing about Lee’s outburst is the subtext – if a form of communication is not covered by the definition of direct marketing, then your consent is not required in the first place and you have no right to stop receiving it. His interpretation is nonsense, but it is also ethically unsound. At its most basic level, privacy means the right to be left alone, the right to have an area of your life which is yours, which others can’t intrude into. Lee seems to want to erode that right. If his view were correct (it’s not), charities could bombard people with phone calls, texts or emails to tell them how marvellous they are, how important their work is, how vital they are for society. As long as they don’t ask for money, the logic of his argument is that people wouldn’t be able to stop them.

Lee’s other question (“Why and how and in what way should we be compelled to comply with that proposition?”) has an easy answer. Ignore it. Carry on breaching the law, ignoring the rules. I went to the cinema last night and saw adverts for two different charities that plainly breached PECR, so that seems to be the plan. Given that the furore over charities began with an innocent person bombarded with unwanted correspondence, it’s remarkable that senior figures in the charity sector are ready for another go, but if Mr Lee wants to drag charities’ reputations deeper into a swamp that they share with PPI scammers and payday loan merchants, he’s welcome.

But the ICO should not listen to their concerns, or open friendly channels of communication with the sector. They should apply the law firmly and regularly until the charities get the message. If this results in more enforcement against charities than other sectors, that will be only because the big charities are among the worst offenders and have not put their houses in order. If charity giving suffers as a result, even amongst the many charities that have not transgressed, the sector should stop blaming others and look to its fundraisers, its colleagues and itself.

A bridge too far

June is a significant time for Data Protection in the UK. At the end of the month, we have the EU vote (where a vote to leave will throw at least the timetable for implementation of the new General Data Protection Regulation into disarray), and Christopher Graham steps down as Information Commissioner, to be replaced by Elizabeth Denham. There are several reasons to be optimistic about Denham’s appointment – she is the first Information Commissioner to have previous experience of privacy and FOI work, she has already taken on big corporate interests in Canada, and she isn’t Richard Thomas.

However, Denham inherits a series of headaches as she begins her reign as Elizabeth II, and it’s difficult to know which of them will be the hardest to shake off. There is the GDPR implementation, which would be a challenge even without the uncertainty that Brexit will create. She also has to tackle the ICO’s lack of independence from Government, which results in scandalous outcomes like the admission in an FOI response that Wilmslow takes orders from its sponsor department (see answer 3 here). But perhaps biggest of all is the ICO’s approach to enforcement.

On FOI, the ICO has no real approach to enforcement – it does pointless monitoring and audits without any evidence of success, and the major government departments use the ICO as their internal review, sometimes not bothering to answer requests unless ordered to do so by an ICO case officer. The sole enforcement notice in the past five years wasn’t even publicised by the office, because the now departed Deputy Commissioner Graham Smith didn’t want to draw attention to the failure to tackle Whitehall’s FOI abuses.

On Data Protection, the approach is to enforce against self-reported security breaches. There is nothing wrong with lots of enforcement on security – it’s a significant requirement of the legislation and many people are concerned about it. The problem is that Wilmslow doesn’t enforce on anything else, despite breaches of the other principles being widespread and obvious. Unless I missed one, the ICO has issued 61 Data Protection monetary penalties since getting the power to do so. Two have been for non-security breaches: Pharmacy 2U (1st principle data sharing without consent) and Prudential (accuracy). The overwhelming majority of enforcement notices (and undertakings, if you count them, which you shouldn’t) are on security matters. This is despite the fact that the UK has a massive culture of unlawful data sharing, over-retention, flouted subject access and, perhaps most obviously, rampant, damaging inaccuracy. The ICO does nothing about it.

A classic example is a story reported in the Observer about the Dartford Crossing between Kent and Essex. Automatic Number Plate Recognition is used by Highways England to issue penalty charges to drivers who use the crossings without paying by phone or web within a fixed period of time. The only problem is that drivers who have never used the crossing are getting the penalties, but it is more or less inconceivable that the ICO will take action.

Having used the crossing myself, I can confirm that there are some Data Protection issues with the signage around the bridge / tunnel – the Observer article explains well how the signs can easily be confused with those for the London congestion charge, which works entirely differently. This is, in itself, a potential data protection breach, as personal data needs to be obtained fairly, especially when the data being obtained (the licence plate) will not only be used to levy a charge, but may also lead to court action for non-payment.

One person is quoted in the article as having been charged because the system misread a ‘C’ as a ‘G’. The Observer also reports that hire car users sometimes find penalties aimed at the wrong person because Highways England don’t specify the date to which the charge applies. In another case, the person receiving the charge had sold the car in question, and had a letter from DVLA to prove it. As with most of these situations, terrible customer service and inflexible processes mean that even when a charge is applied to the wrong person, nobody in the food chain has the authority or the inclination to sort things out. Both of the individuals cited in detail by the Observer were headed for the bailiffs until the Observer got involved, and all action was terminated. Research by Auto Express notes that only 1 in 25 people appeal their penalty, but 80% of those that do are successful.

Every time Highways England / Dart Charge issues a penalty against the wrong person, it is a breach of the fourth Data Protection principle, which states that “Personal data shall be accurate and, where necessary, kept up to date”. Note the lack of any qualification or context here – data is accurate, or it’s a breach. Clearly, this means that most organisations breach DP every minute of every day simply because of typos, but even adopting a flexible approach, there can be no doubt that demanding money and threatening court action is a situation where the Data Controller must be certain that the data is accurate, and if they get the wrong person, it’s a breach. The security principle talks about “appropriate measures” to prevent incidents, but the fourth principle doesn’t: it’s absolute.

Highways England / Dart Charge have breached the DPA, but would it be possible for the ICO to take action? In order to issue a monetary penalty, the ICO has to meet a series of tests.

1. The breach is serious

Dart Charge are pursuing people for debts they don’t owe. It’s serious.

2. The breach is deliberate

This one is potentially tricky, as we would need evidence that Highways England know that they are operating on the basis of inaccurate information in order for the breach to be deliberate. I can’t prove that Highways England are deliberately pursuing people, knowing that they are the wrong targets, although one of the Observer readers quoted gives clear evidence that they might be: “I spent 20 minutes trying to get through to someone who kept telling me I had to pay, even though he could see the problem”. However, we don’t need a deliberate breach if we have:

3. The Data Controller knew or ought to have known about the risk and failed to take steps to prevent it

This test is clearly met – Highways England know that most of the penalty charges that are appealed are overturned, they know that their system misreads licence plate characters and fails to properly distinguish dates, and they know that people contact them multiple times with evidence that the charge is wrong, but they ignore this evidence until they are embarrassed into action by a national newspaper. The breaches are still happening.

4. The breach is likely to cause damage or distress

Innocent individuals who have not used the Dartford Crossing are being pursued and threatened with legal action if they do not pay money that they do not owe. The breach is causing damage and distress, and is highly likely to continue to do so.

The ICO does not enforce on accuracy and they won’t touch this case. If I tried to report it to them, they would ignore my complaint because I have not been affected (if an affected person complained, they would do an unenforceable assessment). They do not ask Data Controllers to report incidents of damaging inaccuracy, and they do not even advocate investigating incidents of inaccuracy in the way that they do for security. This is despite the fact that inaccuracy leads to the wrong medical treatment being given, innocent people’s houses being raided by the police, and old men nearly drowning in canals. The ICO took no enforcement action in any of these cases, despite them being in the public domain. I have dozens of others. Meanwhile, the Commissioner chunters on about a series of accidents and mishaps without any direct evidence of harm (ironically, even the pace of security enforcement has slowed, with only three DP monetary penalties so far this year).

Whatever Ms Denham’s priorities might be, she cannot ignore this. The ICO has shirked its responsibilities on the other principles for too long. A quick glance at the articles relevant to enforcement shows that the GDPR is specifically designed to give breaches of the principles the higher maximum penalty. It’s a riposte to the ICO’s enforcement priorities since the HMRC lost discs incident in 2007, and it’s a bridge that the new Commissioner must be willing to cross.

Wanted

Many of today’s newspapers report (once again) that police forces are refusing to name wanted suspects because of Data Protection and Human Rights. It’s tempting to assume that by now, everyone knows that the Data Protection Act does not prevent the disclosure of wanted suspects’ names and photos, so when another newspaper makes an FOI request for the most wanted, the inevitably craven and risk-averse responses don’t really need to be debunked. Surely we all know that the cops either don’t want to get into nuanced conversations about the operational reasons not to name the suspects, are too cowardly to use Data Protection to justify disclosure, or just plain don’t understand the process? Is it really worth pointing out why the decision is so knuckle-headed?

Admittedly, without seeing all of the responses, I can’t be certain how bad they really are – all we have are selected quotes. I must also acknowledge that my judgement is clouded by having recently made FOI requests to a number of police forces, an experience that makes me assume that everything these forces have done is wrong. Nevertheless, it doesn’t look good – Humberside Police apparently told the Daily Mail that it wasn’t in the public interest to disclose sensitive personal data, despite the DP exemption in FOI not having a public interest test. Meanwhile, Leicestershire Police claimed a suspected murderer and rapist could not be named because it went against the ‘principles of fairness’, while Staffordshire said its response was “processed in line with individuals’ rights”, which means either that Staffordshire have received a valid Section 10 notice from each of the suspects in question, or they don’t know what they are talking about. Eighteen other forces are cited by the Mail as having claimed that Data Protection prevents disclosure.

The only force who appear to have a leg to stand on are Nottinghamshire, who used Section 30(1) of FOI. S30 applies to investigations, so presumably Nottinghamshire are arguing that if they haven’t already named the suspects, it isn’t in the public interest to release them in response to an FOI. I can’t say for certain if this decision is correct, but the use of S30 suggests that Nottinghamshire’s decision is based on operational reasons related to their ongoing investigation. On that basis alone, they deserve the benefit of the doubt in a way that any force using S40 does not.

Rather than spend another 500 words calling police FOI and DP decision makers an assortment of rude names (which was my original plan for this blog), permit me to explain exactly why the use of Data Protection is always nonsense in these situations.

HOW DOES SECTION 40 WORK?

Section 40 of FOI defers entirely to the Data Protection Act when the request is for personal data about someone else. Essentially, the data is exempt if disclosure would breach any of the Data Protection principles, if it would breach a valid Section 10 notice issued by the data subject, or if it would be exempt from subject access (i.e. the subject would not receive it themselves if they asked for it). In practice, the Information Commissioner considers that if the disclosure will not breach the first Data Protection principle, S40 is not a barrier. The forces must be arguing that disclosure of the wanted suspects’ data breaches the first principle.

HOW DOES THE FIRST PRINCIPLE WORK?

The first principle says that the processing of data – here, the disclosure – must be FAIR and LAWFUL, and must MEET AT LEAST ONE OF A SET OF CONDITIONS.

FAIR

Fair means what it says in the dictionary, and it also means that the data subject must be informed of how their data will be used. The ICO is fond of the notion of ‘reasonable expectations’ – you don’t need to tell people how their data will be used if it’s obvious. This would plainly apply in these circumstances; a suspect cannot expect that their data will be suppressed while they are being hunted. In any case, S29 of Data Protection removes the requirement to use data fairly in any situation where complying with it would prejudice the apprehension or prosecution of offenders. Therefore, if disclosure of the suspects’ identities would assist in their capture, fairness is no barrier. If disclosure would instead prejudice attempts to apprehend them, the FOI S30 exemption used by Nottinghamshire is the right exemption. The problem that would motivate the police is the effect on their investigation rather than the personal data issue.

LAWFUL

Lawful means that police forces cannot breach *other* laws by the processing of personal data. This could be why Human Rights were cited by some of the forces. If disclosure of the personal data would breach a suspect’s Article 8 rights to privacy, the disclosure would be unlawful, and so DP would be a barrier. But this is nonsense. The right to privacy is not an absolute right, and can be interfered with in a variety of circumstances, including where it is necessary in the interests of national security, public safety, for the prevention of disorder or crime. You can, if you like, argue that naming the suspects interferes with their privacy (I don’t think it does) but even if it does, if publication of the names will assist in their capture, the interference would clearly be necessary to protect public safety or prevent crime. It’s lawful, unless the police argue that disclosure will impair their investigation. If they thought that, they would use Section 30 of FOI.

CONDITIONS

The data in question is sensitive personal data, as it relates to the alleged commission of crime. This means that each force has to meet two conditions in order to disclose: one from Schedule 2 and one from Schedule 3.

Schedule 2 is easy – we can pick from condition 5 (the processing is necessary for the administration of justice, or for the exercise of public functions in the public interest) or condition 6 (the processing is necessary for legitimate interests that do not cause unwarranted prejudice to the rights and freedoms or interests of the subject). The first two options might be preferable to the balancing exercise required by condition 6, but if you really think that disclosing the name of a wanted man causes unwarranted prejudice to his rights, you are a moron.

Schedule 3(7)(1)(a) gives us the administration of justice again, while 3(7)(1)(b) gives us the exercise of functions conferred on any person by or under an enactment. An order made under the DPA in 2000 also allows the disclosure of sensitive data where it is necessary for the prevention or detection of any unlawful act.

The only problem here would be if the force believed that disclosure would prejudice their ability to catch the wanted suspects. For the third time, if this is the case, Data Protection is not what they are worried about. They may have good operational reasons not to want to disclose, but they are choosing instead to hide behind Data Protection, which has the dual problem of making them look like politically correct idiots, and damaging the reputation of Data Protection which, as I have demonstrated, can easily be used to justify the disclosure. It took me 30 minutes to write this, and I would happily use it as a justification to disclose personal data; the only reason not to would be an operational reason, and FOI provides much better exemptions to protect the integrity and effectiveness of police investigations.

The only possible explanation I can think of for why the police cling to this idea that DP is a barrier to disclosure is that someone is feeding them terrible advice and guidance about how DP really works, and nobody is willing to stick their necks out and question it. This paints a terrible picture of the information rights culture in policing, and someone needs to lay down the law as a matter of urgency.

 

The Gamekeeper’s Fear of the Penalty

Amongst the hype over the end of negotiations on the new EU Data Protection Regulation, one theme kept emerging again and again: Big Penalties. It’s understandable that people might want to focus on it. The UK goes from a maximum possible penalty of £500,000 to one of just under £15,000,000 (at today’s Euro conversion rate), or even 4% of a private enterprise’s annual worldwide turnover. Only a fool would say that it wasn’t worth talking about. It’s much more interesting than the bit about Codes of Practice, and it’s easier to explain than the section about certification bodies.
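For anyone who wants to see where those figures come from, here is a minimal sketch of the arithmetic. The exchange rate and the example turnover are my own assumptions for illustration, not figures taken from the Regulation or the ICO:

```python
# A rough sketch of the jump in maximum penalties. The exchange rate and
# the example turnover below are assumptions for illustration only.
EUR_TO_GBP = 0.73                        # assumed euro-to-sterling rate

current_uk_maximum = 500_000             # £500,000 under the current regime
new_maximum_eur = 20_000_000             # upper tier of the new Regulation
new_maximum_gbp = new_maximum_eur * EUR_TO_GBP

print(f"Current UK maximum: £{current_uk_maximum:,}")
print(f"New fixed maximum:  £{new_maximum_gbp:,.0f}")    # ~£14.6m, "just under £15,000,000"

# For a large private enterprise, 4% of worldwide turnover can be higher still:
assumed_turnover_gbp = 1_000_000_000     # a hypothetical £1bn annual worldwide turnover
print(f"4% of turnover:     £{assumed_turnover_gbp * 0.04:,.0f}")  # £40,000,000
```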

It would be equally foolish to assume, however, that penalties on this scale will rain down from Wilmslow like thunderbolts from Zeus. At the same time as many were talking up the future, the Information Commissioner issued two monetary penalties under the current regime, one under Data Protection (£250 for the Bloomsbury Patient Network) and one under the Privacy and Electronic Communications Regulations (£30,000 for the Daily Telegraph). The £250 penalty is the lowest the ICO has ever issued for anything, while the PECR one is the lowest for a breach of the marketing rules, notwithstanding that the Daily Telegraph is probably the richest PECR target at which the ICO has taken aim.

You could argue that the embarrassment caused to the Telegraph carries an added sting (the ICO has never before taken enforcement action against a newspaper). It’s equally likely that the oligarchs who own the paper will consider £30,000 (£24,000 if they pay up in 35 days) to be a price worth paying if it had the desired effect on the outcome of a very close election. They’ll probably do it again.

In any case, the Bloomsbury Patient Network CMP is much worse. The Regulation calls for monetary penalties to be effective, proportionate and dissuasive, and yet everybody at the ICO thought that a £250 penalty, split between three people, was action worth taking and promoting. The Commissioner himself, Christopher Graham, told the DMA in March 2015 that the ICO was not a ‘traffic warden’, but if the Bloomsbury Three pay up on time, the £66.67 penalty they each face is no worse than a parking ticket you didn’t pay in the first fortnight.
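To spell the arithmetic out: the 20% reduction for prompt payment is the ICO’s standard early payment discount, and applying it here, with an equal three-way split, is my assumption about how the £66.67 figure is reached.

```python
# A minimal sketch of the arithmetic behind the figures above. The 20%
# early payment reduction and the equal three-way split are assumptions.
EARLY_PAYMENT_DISCOUNT = 0.2

bloomsbury_penalty = 250
paid_on_time = bloomsbury_penalty * (1 - EARLY_PAYMENT_DISCOUNT)   # £200
per_person = paid_on_time / 3                                      # the "Bloomsbury Three"
print(f"£{per_person:.2f} each")                                   # £66.67 each

telegraph_penalty = 30_000
print(f"£{telegraph_penalty * (1 - EARLY_PAYMENT_DISCOUNT):,.0f}") # £24,000 within 35 days
```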

The ICO’s press release claims that the penalty would have been much higher if the data controller had not been an ‘unincorporated association’, but this is irrelevant. The ICO issued a £440,000 PECR penalty against two individuals (Chris Niebel and Gary McNeish) in 2012, while the Claims Management Regulator recently issued a whopping £850,000 penalty against Zahier Hussain for cold calling and similar dodgy practices. The approach on PECR and marketing is positively steely. The problem clearly lies in Data Protection enforcement, and that is what the Regulation is concerned with.

The size and resources of the offending data controller are a secondary consideration; the test is whether the penalty will cause undue financial hardship. The ICO could bankrupt someone or kill their business if they deserved it. The Bloomsbury Patient Network’s handling of the most sensitive personal data was sloppy and incompetent, and had already led to breaches of confidentiality before the incident that gave rise to the penalty. Enforcement action at a serious level was clearly justified. Even if the level of the penalty was high enough to deter well-meaning amateurs from processing incredibly sensitive data, this would be a good thing. If you’re not capable of handling data about a person’s HIV status with an appropriate level of security, you have absolutely no business doing so at all, no matter how good your intentions are. Donate to the Terrence Higgins Trust by all means, but do not touch anyone’s data. If the ICO lacks the guts to issue a serious penalty, it would be better to do nothing at all and keep quiet, rather than display their gutlessness to the world.

Whoever made this decision cannot have considered what message it would send to organisations large and small who already think of Data Protection as pettifogging red tape, low on the agenda. Is there an organisation anywhere in the country that would consider the slim chance of being fined £66.67 to be a deterrent against anything? A fine is a punishment (it has to cause pain to those who pay it) and it is a lesson to others (it has to look painful to the wider world). The Bloomsbury Patient Network CMP is neither.

Despite the increased expectations raised by the GDPR, the ICO is actually losing its appetite for DP enforcement, with 13 Data Protection CMPs in 2013, but only 6 in 2014 and 7 in 2015. Meanwhile, there have been 24 unenforceable DP undertakings in 2015 alone, including one against Google which you’re welcome to explain the point of, and another (Flybe) which revealed endemic procedural and training problems in the airline which are more significant than the moronic cock-ups that went on at the Bloomsbury Patient Network. Wilmslow is so inert that two different organisations have told me this year that ICO staff asked them to go through the motions of self-reporting incidents that ICO already knew about, because the only way the enforcement wheels could possibly begin to turn was if an incident was self-reported. ICO staff actually knowing that something had happened wasn’t enough. It’s these same timid people who will be wielding the new powers in 2018.

Admittedly, there will be a new Commissioner, and it’s possible that the Government will pick a fearsome enforcement fiend to go after Data Protection like a dog in a sausage factory. You’ll forgive me if I don’t hold my breath. Nevertheless, something in Wilmslow has to change, because the General Data Protection Regulation represents a clear rebuke to the ICO’s DP enforcement approach.

Most obviously, in the long list of tasks in Article 52 that each Data Protection Authority must carry out, the first is very powerful: they must “monitor and enforce” (my emphasis) the application of the Regulation. Someone recently said that in certain circumstances, some organisations require a ‘regulatory nudge’, but the Regulation is much more emphatic than that. The ICO’s preference for hand-holding, nuzzling and persuading stakeholders (especially those where former ICO colleagues have gone to work) is a world away from an enforcement-led approach.

The huge increase in penalties throws down the gauntlet, especially when the ICO has rarely approached the current, comparatively low UK maximum. But the ICO should also pay close attention to the detail of Article 79 of the Regulation, where the new penalties are laid out. Of the 59 ICO monetary penalties, 57 have been for breaches of the 7th principle (security). The Regulation has two levels of penalty, the lower with a maximum of €10,000,000 (or 2% of annual turnover), and the higher with a maximum of €20,000,000 (or 4% of annual turnover). Breaches of Article 30, a very close analogue to Principle 7, sit in the lower tier.
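Read that way, the two tiers work roughly as sketched below. This is my reading that each maximum is the fixed sum or the percentage of worldwide annual turnover, whichever is higher, and the turnover figure is hypothetical:

```python
# A sketch of the Regulation's two penalty tiers, on the reading that the
# maximum is the fixed cap or the percentage of worldwide annual turnover,
# whichever is higher. The example turnover is hypothetical.
def max_penalty_eur(turnover_eur: float, higher_tier: bool) -> float:
    """Return the maximum available penalty for an undertaking."""
    fixed_cap = 20_000_000 if higher_tier else 10_000_000
    percentage = 0.04 if higher_tier else 0.02
    return max(fixed_cap, turnover_eur * percentage)

turnover = 2_000_000_000  # a hypothetical undertaking with €2bn worldwide turnover
print(f"Lower tier maximum:  €{max_penalty_eur(turnover, higher_tier=False):,.0f}")  # €40,000,000
print(f"Higher tier maximum: €{max_penalty_eur(turnover, higher_tier=True):,.0f}")   # €80,000,000
```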

Admittedly, the higher penalty applies to all of the principles in Article 5 (which in a somewhat circular fashion includes security), but it explicitly covers “conditions for consent”, “data subject rights” and infringements involving transfers to third countries, areas untouched by the ICO’s DP penalty regime. The Regulation envisages monetary penalties at the higher level for processing without a condition, inaccuracy, poor retention and breaches of subject access, as well as of new rights like the right to be forgotten or the right to object. The ICO has issued a solitary penalty on fairness, and just one on accuracy – it has never fined on subject access, despite that being the largest single cause of data subject complaints.

The Regulation bites hard on the use of consent and legitimate interest, and misuse of data when relying on them would again carry the higher penalty. Most organisations that rely on consent or legitimate interest are outside the public sector, which relies more on legal obligations and powers. Indeed, the Regulation even allows for the public sector to be excluded from monetary penalties altogether if a member state wishes it. Nevertheless, since they got the power to issue them, only 24% of the ICO’s civil monetary penalties have been served on organisations outside the public sector (2 against charities and 12 against the private sector).
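For what it’s worth, the 24% figure squares with the 59 penalties mentioned above, assuming the 2 charity and 12 private sector penalties are the complete set outside the public sector:

```python
# A quick check of the 24% figure against the 59 penalties cited above.
total_penalties = 59
charities, private_sector = 2, 12
outside_public_sector = charities + private_sector          # 14
print(f"{outside_public_sector / total_penalties:.0%}")     # 24%
```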

I doubt the ICO is ready for what the Regulation demands, and what data subjects will naturally expect from such a deliberate attempt to shape the enforcement of privacy rights. The penalties are too low. The dwindling amount of DP enforcement is based almost exclusively on self-reported security breaches. While the Regulation might feed a few private sector cases onto the conveyor belt by way of mandatory reporting of security breaches, it will do nothing for the ICO’s ability to identify suitable cases for anything else. Few ICO CMPs spring from data subject complaints, and anyone who has ever tried to alert Wilmslow to an ongoing breach when they are not directly affected knows how painful a process that can be. The ICO has not enforced on most of the principles.

It’s been my habit whenever talking about the Regulation to people I’m working for to emphasise the period we’re about to enter. There are two years before the Regulation comes into force; two years to get ready, to look at practice and procedure, two years to tighten up. The need to adapt to the future goes double for the Information Commissioner’s Office. Instead of canoodling with stakeholders and issuing wishy-washy guidance, wringing its hands and promising to be an ‘enabler’, the ICO should take a long hard look in the mirror. Its job is to enforce the law; everything else is an optional extra. It’s wise to assume that the wish for total DP harmonisation will probably be a pipe dream; it’s equally obvious that the Regulation will allow for much easier comparisons between EU member states, and the ICO’s lightest of light touches will be found wanting.

Consenting adults

Around two months ago, the Etherington Review into charity fundraising and governance published a series of recommendations about the way the sector should be run. The most eye-catching and ridiculous is the Fundraising Preference Service, which I wrote about at the time. The reaction to the FPS from charities has been almost universally negative, with a series of articles appearing in charity publications and on charity websites, all condemning the idea that the public should be able to stop communications from charities.

There is nothing in Data Protection, the Privacy and Electronic Communications Regulations (PECR) in general or the Telephone Preference Service (TPS) provisions in particular that stops a charity from contacting a person who wants to be contacted. The FPS is non-statutory, and so cannot change that. Since 1995, Data Protection law has been built on a requirement that any contact based on consent requires a freely given, specific and informed indication of the subject’s wishes. That’s what the Directive says, so any claim that somehow the upcoming DP Regulation represents a significant shift in how consent works is exaggerated. The problem for some charities is that they have ignored this. When I make a donation, that is a freely given, specific and informed indication of my wish to make that donation. If the charity wants to call me or text me and rely on consent, they need a freely given, specific and informed indication that I want to be called or texted.

The current practice of charity posters that ask for a quick £3 or £5 text donation for a specific cause is a classic example of how this doesn’t work. Yes, there is minuscule small print on the poster indicating that further calls or texts will be made and that I can opt out, but unless one has carried a magnifying glass onto the Tube or into the toilet cubicle, the text is impossible to read and easy to overlook. Many charities using the one-off donation technique seem to be doing so to harvest mobile numbers for fundraising calls. In Data Protection terms, this is unfair and does not represent consent (a breach of the 1st principle); in PECR terms, if the number is on the TPS, the charity has not obtained consent and any calls made to a TPS-registered number harvested in this way will be unlawful.

An article in Civil Society published shortly after the FPS proposals were first mooted contains this key quote:

The idea is that members of the public would be able to simply and easily add their names to a “suppression list” so they would not be contacted by fundraisers. Rather than rely on charities using the existing mail and Telephone Preference Services, the FPS would allow you to put a stop to all contact with charities.

The TPS already allows you to put a stop to all contact with charities by phone, along with everyone else. Charities are not unfairly discriminated against by the TPS, any more than any other sector might be. The TPS is a blunt instrument, but it is a fair one. The fact that charities see the FPS as being a problem suggests to me that they either don’t understand the TPS (they believe the donation = consent nonsense), or they think they can ignore it. Civil Society reported at the end of October that the Institute of Fundraising (which represents, remember, organisations that make money out of fundraising, rather than charities themselves) was changing its guidance in line with the expectations of the Information Commissioner’s Office. The IoF nevertheless claims that this change (i.e. complying with PECR) “unduly” restricts the ability of charities to “maintain relationships with their supporters“.

Donation = consent isn’t the only myth that has been propagated. Civil Society’s David Ainsworth claimed a few weeks ago that all the blame lies at the door of the ICO (and that’s often a valid argument). The problem is, the story isn’t true. Ainsworth said “In 2010 David Evans, a senior data protection manager at the ICO, explicitly told charities they were allowed to call people registered on the TPS, so long as they received no complaints. Just in case there was any doubt, this was followed up with official guidance which effectively said that the ICO did not intend to apply the law to charities.” I asked Ainsworth on Twitter if he could provide evidence that this is what the ICO said. All he could provide was a note written by the Institute of Fundraising, who are hardly objective. But even that note contradicts Ainsworth’s article, stating the TPS position clearly, with only a little bit of nuance.

TPS regulations ‐ any person registered on the telephone preference service (TPS) cannot be called unless they have advised the calling party that they are happy to receive calls. In practice, a charity might judge that, given the nature of the relationship between them and the supporter, they might be able to make a marketing call to that subscriber despite TPS registration.

In truth, what Evans said is a line I have heard many times from different ICO people – if a data controller thinks it has consent, acts on that consent, and crucially, the ICO doesn’t receive any complaints, then they probably had consent. In other words, the ICO won’t act on complaints it hasn’t received. The ICO did not give charities an exception. Should any charity have bothered to investigate, they would have found that the ICO has no power to do so. The problem was, as Christopher Graham told Parliament last month, there were thousands of complaints about charity direct marketing, but they were all going to the Fundraising Standards Board, a self-regulatory body that regulates the Institute of Fundraising’s code. The FRSB did not pass any of the complaints on to the Information Commissioner.

**UPDATE: originally, this blog said that the Fundraising Standards Board was ‘run by‘ the Institute of Fundraising, which was poorly worded shorthand, treating the IoF as if they are the embodiment of fundraisers and charities. The FRSB is a membership body, paid for by its members (who are charities and fundraisers), and its role is to act as a self-regulator for the Code of Fundraising Practice drawn up by the IoF. I don’t believe that the FRSB is properly independent of the Institute of Fundraising, not least because they ‘enforce’ a code written by the IoF, and one which was legally inadequate. I’m not the only person who thinks this: post-Etherington, the FRSB is being abolished, and responsibility for the Fundraising Code is being transferred to a new regulator. The IoF’s Chief Executive welcomed the new regulator’s creation (tacitly welcoming the abolition of the FRSB), and recognised that moving the Code from the IoF to the new regulator was necessary to avoid the perception of a ‘conflict of interest‘.**

The biggest barrier to charities accepting legal reality – either by complying with the TPS, or with some workable version of the FPS if such a thing is possible – may be the fact that some in the sector don’t really believe in consent at all. Matthew Sherrington, a consultant writing in Third Sector this week, wasn’t exactly subtle: “The awkward truth, which is difficult for charities to argue publicly, is that the generous public (the UK is the most generous in Europe, as it happens) do not give off their own bat, but need to be asked” (my emphasis). The same argument was made by Ian MacQuillin, blogging on behalf of Rogare, a fundraising think tank: “Everyone knows that most people give because they are asked to do so” and later on “I suspect that the FPS would be used not just by people who really are on the receiving end of such a deluge of fundraising material that it was making their lives a misery; but more by people who want to spare themselves the difficult choice of deciding how to respond to a donation request, and the guilt and cognitive dissonance that results when they say no“. The thinking that runs through both articles, and others, is that fundraisers must be able to ask, and that the potential donor / prospect / target (which is what we all are to the fundraiser) should not be allowed to opt out of being asked. We should have to listen to the pitch, and be forced into the awkward, embarrassing or (in MacQuillin’s word) guilt-ridden option of saying no. There is, in this world, something inappropriate, even immoral, in having a choice about whether to be approached in the first place.

**UPDATE: I have had a long Twitter conversation with Matthew Sherrington. He hasn’t put a comment on the blog (which he and anyone is welcome to do) but he thinks I have misrepresented what he said about consent and marketing, and I think that I should mention this. I stand by my comments above, but I’m linking to his article again here so you can read it and make up your own mind about what he says.**

It’s possible that fundraisers and consultants genuinely don’t understand the TPS, don’t understand that it’s already supposed to be possible to opt out of every marketing phone call, or that texts and emails are opt-in in the first place. Fundraisers see widespread abuse of PECR and Data Protection, so assume that it’s all fine and that daft proposals like the FPS represent unfair singling out of the charity sector. At this point, it is fair to criticise the Information Commissioner for their generally insipid enforcement. I think there is also a sense of entitlement among charities (which is one thing, as most charities have a clear public interest objective), and among fundraisers (who are, in the main, just private businesses making a profit). There are no exemptions. There is no charity carve-out or defence. The European Data Protection Directive, from which everything in UK DP and PECR law is derived, makes clear that charities are included along with everyone else. It’s in recital 30, if you’d like to check.

In amongst all of the anger and self-justification available in the charity press, one article in Civil Society also caught my eye: “Trust in charities is at its lowest point since 2007, with charities now less trusted than supermarkets“, according to a survey carried out by nfpSynergy. Some might blame the Daily Mail and Camila Batmanghelidjh, but purely anecdotally, on every training course about direct marketing that I have run in the past five years, the main examples people come up with for poor quality, persistent, sometimes rude marketing calls are either PPI or charities. Fundraisers and charities alike need to ask themselves if they want to be in company with spivs and spammers. Rather than try to rewrite history, or the law, or continue to adopt an approach based on pestering and guilt, perhaps the big charities should look at a business model that is bringing them into disrepute. There is a real question about how they raise funds without making marketing calls and other contacts to people who don’t want to receive them. The only answer to that would be to get PECR and the DPA amended to remove charities from the marketing requirements, but as this would deprive the public of their existing rights and put the UK in direct breach of EU law, I doubt they’ll get very far. I still think the Fundraising Preference Service is unnecessary in the light of existing provisions, but if it is implemented in some meaningful form, and finally gets the message across to the most unrepentant of charity spammers, maybe I’m wrong.

King Canute famously stood in the waves and ordered back the sea, but only to show that his powers were limited. Some charities and fundraisers are up to their necks in water, but think that they have the ability and the right to turn the tide of history. If they don’t wise up, they will drown.