Brand new key

Parents at schools in Suffolk recently received an interesting piece of correspondence about an exciting initiative called ‘Suffolk SAFEKey’, offered by Suffolk Police. For as little as £1 a month, subscribers to the service receive a special key fob with a reference number on it. Once registered, if the keys are lost, whoever finds them can use the reference number to contact Suffolk Police’s commercial partner (Keycare Limited) to get keys and owner reunited, incentivised by a £10 reward.

Alerted to this by a concerned citizen, I made an FOI request to Suffolk Police to find out more about the scheme, the arrangement with Keycare Limited, and how the email came to be sent. Suffolk Police told me that they contacted all 18 secondary schools in the county (by phone, so I don’t know how the request was couched), and of those, 8 forwarded the invitation to join SAFEKey to all parents. The force were unhelpfully vague about who else had been approached. I asked who they had contacted, and their answer conflated those they approached and those they claim had approached them. This means I know that those involved are charities (Suffolk Community Foundation / Age UK), “advocacy groups” (whatever that means), Neighbourhood Watch, the University of Suffolk and “lunch clubs and other such groups”, but I don’t know who contacted who.

On one issue, Suffolk Police were admirably clear. I asked them how they had obtained consent to send the email. This was their reply:

The parentmail service is not controlled by the Constabulary and the information provided is not personal data and as such, there is no requirement for us to obtain consent from those third party recipients.

Regulation 22 of the Privacy and Electronic Communications Regulations 2003 (AKA PECR) applies to emails and texts, and it is remarkably unambiguous, despite all the dodgy marketers and list brokers who purport not to understand it.

a person shall neither transmit, nor instigate the transmission of, unsolicited communications for the purposes of direct marketing by means of electronic mail unless the recipient of the electronic mail has previously notified the sender that he consents for the time being to such communications being sent by, or at the instigation of, the sender

Suffolk Police instigated the sending of the email to parents by making an unsolicited approach to schools, asking them to send it. The email would not have been sent unless they had asked for it to be sent. Regulation 22 does not require them to be the sender. Should there be any doubt about this, the ICO asked Better Together to sign an undertaking following their misbegotten texts during the Scottish independence campaign. Better Together used an agency – they never held the data and they didn’t send the texts. This is exactly the same situation. There are only two ways that marketing emails could lawfully be sent like this: either parents would have to give consent directly to Suffolk Police, or give consent to the school to receive marketing from the force. The second possibility is one the ICO is keen to play down, as their Direct Marketing Guidance makes clear:

Indirect consent may therefore be valid if that organisation was specifically named. But if the consent was more general (eg marketing ‘from selected third parties’) this will not demonstrate valid consent to marketing calls, texts or emails.

Of course, as the senders of the emails, the schools have also breached PECR. And taking it one stage further, you could argue that Suffolk Police have also breached the Data Protection Act by processing personal data unfairly and unlawfully. If they don’t have a data processor contract with the schools, they may even have breached the seventh principle.

Many public bodies and charities struggle with PECR because they perceive ‘marketing’ as a purely commercial activity. This means that they think the messages they send are somehow not marketing, and are surprised when PECR bites. Suffolk Police can be under no such illusion. SAFEKey is not a policing activity; it is a wholly commercial venture, with the income split 50/50 between the force and Keycare Ltd. Moreover, there is an argument that the force is exploiting its position as a law enforcement body to promote its commercial activities – it’s unlikely that secondary schools would forward information about double glazing or PPI. The force might want this to seem like an aspect of their crime prevention work, but it isn’t. No public body – least of all the police – should exploit its position as a partner with other, smaller public bodies to plug its commercial activities.

There are other concerns. The force didn’t carry out a Privacy Impact Assessment before launching the SAFEKey scheme, which is surprising, as the project involves the force gathering personal data it does not need to carry out its legal functions, purely for the purpose of a commercial venture, using a variety of unrelated bodies as a conduit for the data and transmitting it to a commercial partner. At the very least, you would expect them to consider the risks. Moreover, although the extract I received from the contract between Keycare and Suffolk Police does make it clear that Keycare cannot use or share the personal data they receive for their own purposes, the security demands made by the police are relentlessly generic.

I don’t think the police should exploit the significant position of trust they enjoy to flog commercial services at all. But even if you disagree, there can be no question that when they do, the police should at all times obey the law. They haven’t done so here, and the ICO should investigate. As I did not receive one of the emails, they would ignore any complaint that I made, but they should intervene to make clear to all public bodies how PECR works.


Less than ideal

Last week, Stephen Lee, an academic and former fundraiser, was reported to have attacked the Information Commissioner’s Office at a fundraising conference over its interpretation of direct marketing. It was, he said, “outrageous” that the Commissioner’s direct marketing guidance stated that any advertising or marketing material that promoted the aims and ideals of a not-for-profit organisation was covered by Data Protection. According to Lee, only fundraising activities should be considered to be marketing.

[NB: Third Sector articles are sometimes open to all and sometimes limited to subscribers. If the links don’t work, please accept my apologies!]

He is quoted as saying “Who says that’s right? Just the ICO. Who did it consult? No one.” and went on to say “Why and how and in what way should we be compelled to comply with that proposition?”

Who says that’s right? Who did the ICO consult? Well, let me see now.

1) The Council of Europe

In 1985, the Council of Europe issued a Recommendation on the protection of personal data used for the purposes of direct marketing. The definition of direct marketing includes both the offer of goods or services and “any other messages” to a segment of the population. The recommendation predates the guidance Mr Lee disparages by more than 30 years.

2) The 1995 Data Protection Directive

The Directive makes clear that direct marketing rules apply to charitable organisations and political parties just as they do to commercial organisations, and emphasises the need for people to be able to opt out of direct marketing. By redrawing the definition, Mr Lee would contradict this fundamental right.

3) The Data Protection Act 1998

Given that Mr Lee feels qualified to make bold statements about the interpretation of the Data Protection Act, it’s odd that he doesn’t seem to have taken the time to read it. Section 11 of the Act defines direct marketing as “the communication (by whatever means) of any advertising or marketing material which is directed at particular individuals”. The important word there is “any” – organisations do not get to pick and choose which of their promotional messages are covered and which are not.

4) The Privacy and Electronic Communications Regulations 2003

PECR sets up the rules for consent over electronic direct marketing (consent for automated calls, opt-out and TPS for live calls, consent for emails and texts). It does not define direct marketing, but instead says this: “Expressions used in these Regulations that are not defined in paragraph (1) and are defined in the Data Protection Act 1998 shall have the same meaning as in that Act”. Therefore, the DPA definition applies to PECR.

5) The Information Tribunal (now the First-tier Tribunal)

In 2005, the Information Commissioner served an Enforcement Notice on the Scottish National Party after they repeatedly and unrepentantly used automated calls featuring Sean Connery to promote the party in the General Election. The SNP appealed, and in 2006, the Information Tribunal considered the issue. One of the main elements of the SNP appeal was against the ICO’s definition of direct marketing. Although the case is about a political party, the ICO’s submissions are based on the proposition that charities as well as political parties are covered by the definition of direct marketing, and that the definition cannot be restricted to fundraising alone. The Tribunal accepted the ICO’s view in full, and dismissed the appeal.

6) The charity sector and anyone else who wanted to be consulted

The ICO may have issued guidance on the definition of direct marketing as far back as the 1980s or 1990s, but the idea that it covers the promotion of aims and ideals has been their stated view since at least 1999. In guidance issued on the precursor to PECR, the ICO stated clearly that direct marketing applies “not just to the offer for sale of goods or services, but also the promotion of an organisation’s aims and ideals”. They specifically mentioned charities, as they have ever since. Virtually every iteration of the ICO’s guidance on PECR and direct marketing has been subject to public consultation – including the very guidance Lee is complaining about.

Here’s the problem. Lee is an Honorary Fellow of the Institute of Fundraising, and has a long association with it. The IoF has been the most consistently pernicious influence on the charity sector’s compliance with data protection and privacy law in the past ten years. Their guidance and public utterances on data protection are often misleading, and they recently had to change their own Code of Practice because it was legally incorrect. At best, they haven’t noticed the ICO position on charities and direct marketing for more than 15 years. At worst, they deliberately ignored it in favour of an interpretation that largely suits fundraisers. Lee complained at the conference about the “appalling” communication between the ICO and charity umbrella bodies, but Richard Marbrow of the ICO summed the problem up all too well:

“One of the things the sector asked for was clarity, and I will try and bring you that. The trouble is, if you then say ‘we don’t like that clarity, could we have some different clarity please?’, we’re not going to get on very well.”

The most important thing about Lee’s outburst is the subtext – if a form of communication is not covered by the definition of direct marketing, then your consent is not required in the first place and you have no right to stop receiving it. His interpretation is nonsense, but it is also ethically unsound. At its most basic level, privacy means the right to be left alone, the right to have an area of your life which is yours, which others can’t intrude into. Lee seems to want to erode that right. If his view were correct (it’s not), charities could bombard people with phone calls, texts or emails to tell them how marvellous they are, how important their work is, how vital they are for society. As long as they don’t ask for money, the logic of his argument is that people wouldn’t be able to stop them.

Lee’s other question (“Why and how and in what way should we be compelled to comply with that proposition?”) has an easy answer. Ignore it. Carry on breaching the law, ignoring the rules. I went to the cinema last night and saw adverts for two different charities that plainly breached PECR, so that seems to be the plan. Given that the furore over charities began with an innocent person bombarded with unwanted correspondence, it’s remarkable that senior figures in the charity sector are ready for another go, but if Mr Lee wants to drag charities’ reputations deeper into a swamp that they share with PPI scammers and payday loan merchants, he’s welcome.

But the ICO should not listen to their concerns, or open friendly channels of communication with the sector. They should apply the law firmly and regularly until the charities get the message. If this results in more enforcement against charities than other sectors, that will be only because the big charities are among the worst offenders and they haven’t put their houses in order. If charity giving suffers as a result, even amongst the many charities that have not transgressed, they should stop blaming others and look to their fundraisers, their colleagues and themselves.

The Bad Samaritan

The Samaritans have launched a new tool for the persecution of the vulnerable… Sorry, a nannyish attempt to spy on your friends… No, I mean, they’re trying to use technology to do what real friends would be doing anyway… I’ll try this again. There’s this app they have. You’ve probably heard of it; it runs in the background, monitoring the tweets of those you follow on Twitter, and analyses them to look for indications that a person may be in need of support. The Samaritans are convinced it’s marvellous and has no Data Protection or privacy implications.
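To make the questions that follow concrete, here is a minimal sketch of the kind of processing such an app involves. The keyword heuristic, the data structures and the alert wording are all my assumptions for illustration – the Samaritans have not published how their analysis actually works:

```python
# A minimal sketch (NOT the Samaritans' actual code) of the kind of
# processing the app appears to perform: scan the tweets of the accounts
# a subscriber follows, flag those matching crude 'distress' indicators,
# and generate an alert for the subscriber. The phrase list and alert
# format are hypothetical assumptions for illustration only.

from dataclasses import dataclass

# Hypothetical indicators; the app's real lexicon or model is not public.
DISTRESS_PHRASES = ["want to give up", "can't go on", "no way out"]

@dataclass
class Tweet:
    author: str  # the data subject whose state of mind is being analysed
    text: str

def flags_distress(tweet: Tweet) -> bool:
    """Crude keyword match standing in for whatever analysis the app uses."""
    text = tweet.text.lower()
    return any(phrase in text for phrase in DISTRESS_PHRASES)

def alerts_for(subscriber: str, timeline: list[Tweet]) -> list[str]:
    """Build the alerts a subscriber would receive about the people they follow."""
    return [
        f"Alert for {subscriber}: @{t.author} may need support: {t.text!r}"
        for t in timeline
        if flags_distress(t)
    ]

if __name__ == "__main__":
    timeline = [
        Tweet("alice", "Great gig last night!"),
        Tweet("bob", "Honestly feel like I can't go on"),
    ]
    for alert in alerts_for("carol", timeline):
        print(alert)
```

Note what even this toy version makes plain: the subscriber registers and receives the alerts, but the author of the flagged tweet is never asked or told anything. That asymmetry is the nub of the questions below.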

The Data Protection Act 1998 applies to the processing of any personal data, anywhere, by any person. Certain areas are carved out – the use of personal data for national security purposes is inevitably and depressingly exempt, as is the use of data for purely personal, domestic reasons and, to an extent, the use of data for journalism. Beyond that, although the Data Protection principles are flexible, they apply to all uses of personal data.

At no point in the text of the Data Protection Act does it say that personal data that is public or published is exempt from the Act’s provisions. There is no section that says that, and no section that can be interpreted as meaning that. Moreover, the Information Commissioner’s Code of Practice on online data, which I quoted in my last blog about the monitoring of blogs, applies equally here:

“If you collect information from the internet and use it in a way that’s unfair or breaches the other data protection principles, you could still be subject to enforcement action under the DPA even though the information was obtained from a publicly available source.”

And “You should only use their information in a way they are likely to expect and to be comfortable with.”

As the Samaritans have claimed that their app is entirely legal and has no Data Protection implications, I am certain that they will have no problem answering the following questions:

Principle 1:

  • No consent is being obtained; which data protection conditions allow the Samaritans to monitor and – crucially – to analyse and interpret the state of mind of Twitter users without consent?
  • How are data subjects to be informed that their tweets are being monitored and – crucially – analysed, with notifications sent to any third party who chooses to register?
  • The first principle requires the processing of data to be ‘fair’: what steps have the Samaritans taken to ensure that those registering to receive notifications via the app have no malicious intentions towards the subject and will not use the notification for malicious purposes?

Principle 2:

  • What assessment has been carried out to ensure that the processing (i.e. attempting to identify the subject’s state of mind in order to secretly notify a third party of it) is compatible with the subject’s original purpose in publication? How is that original purpose identified?

Principle 3:

  • How have the Samaritans established that their gathering of data and analysis of Twitter users’ state of mind is relevant and not excessive?

Principle 4:

  • Principle 4 states that personal data ‘shall’ be accurate for the purpose – there is no qualification to this. How have the Samaritans ensured that the analysis of a Twitter user’s state of mind is accurate when alerting a third party to it?

Principle 6:

  • What provisions have the Samaritans in place to provide the following:
  • Subject Access: data subjects are entitled to know what data is held about them, and who has received it. Will data subjects be told who has received alerts about them if they ask? If not, which exemption applies?
  • Section 10 Right to object to damaging / distressing processing: data subjects have a right to object to damaging processing – will such requests be honoured? If not, why not?
  • Section 12: data subjects have a right to require that decisions about them are not taken solely by automated means, and to have any such processing reconsidered by a human being. Will Section 12 requests be honoured, and if not, why not? How many members of Samaritans staff are available to carry out the analysis?

Principle 7:

  • What technical and organisational security measures are in place to protect the analysis of Twitter users’ state of mind (potentially sensitive personal data relating to health, as defined by the Act)?

Principle 8:

  • How have the Samaritans ensured that the sharing of personal data about Twitter users’ state of mind is restricted to the European Economic Area? If it is not, how is the transfer of that information outside the EEA justified under Principle 8?

For the record, I think the 30-day retention period for the data (Principle 5) may be OK.

Crazy Naked Girls

There’s little to like about the voyeuristic coverage of the theft of images of famous women. Whether it is the feverish frottage of the mainstream press (which largely boils down to LOOK AT ‘EM ALL, IMAGINE ‘EM ALL NAKED, NNNNNNNNGGGGGG!!!!!) or the inevitably crass victim blaming (thank you, Ricky Gervais, for The Office and for absolutely nothing else), it’s all depressing.

The data protection strand in all this hasn’t been much better. Mobile devices are not a safe place to store sensitive data (true). The cloud is – in Graham Cluley’s immaculate phrase – just someone else’s computer (true). But too many security commentators have, perhaps unwittingly, aligned themselves with a ‘They asked for it’ line of thinking. A popular analogy is the one about burglary or car theft (this is an example from 2011). Apparently, you can’t complain if you leave your valuables on the front seat of your car and somebody steals them, and the same goes for pictures of your bits and the internet. In other words, the thinking is more or less that if Jennifer Lawrence is silly enough to take pictures of herself naked, she was basically asking for them to be stolen. For me, this is too close to the mentality that blames rape victims for being drunk, rather than rapists for being rapists. Friends, I blame the rapists.

Taking pictures of oneself is normal for most people, not just actresses – I am odd because I don’t do it, but if I was good looking, I probably would, all the time. It must be great to be extraordinary, and to enjoy being extraordinary. It’s too easy to be holier-than-thou and say that the violated only have themselves to blame. The victims made these images for themselves or they made them for someone else specific. They did not make the images for the media, or for the voyeurs who stole, sold or search for them. Anyone who handles or seeks them out violates the subject’s privacy, is a criminal and should be treated as such. The victims did nothing remotely scandalous or reprehensible – indeed, they did nothing that is anyone else’s business but their own. They probably didn’t do a privacy impact assessment before taking the pics, but that’s because they’re human beings and not data controllers.

The car analogy doesn’t work because mobile phones and the internet are not immediately understandable physical objects and spaces. When you leave your laptop on the passenger seat of your car, you can turn around and see the laptop sitting there. The risk is apparent and obvious. There’s a striking moment in Luc Besson’s current film ‘Lucy’ where Scarlett Johansson can see data streams soaring out of mobile phones across Paris, and navigates her way through them. We don’t see data like this. Few understand how the internet actually works (I’ve met a lot of people who think cloud storage means that data is floating in the air like a gas). We don’t see the data flowing or spot the footprint it leaves behind. We don’t know where the data ends up, and the companies we use don’t tell us. We use unhelpful misnomers like ‘the cloud’ when we mean ‘a server in a foreign land’. Many people don’t know how their phones work, where their data is stored, how it is copied or protected, or who can get access to it. This is the problem the photo hack should alert us to.

It’s possible that some people would change the way they used technology if they fully understood how it works, but that should be their choice, based on clear information provided by the manufacturers. At least one of those affected has confirmed that the images of her are quite old, so we can’t even judge the situation on what we know now. If taking the pics was a mistake (and I don’t think I’m entitled to say it was), it was a mistake made possibly years ago.

I don’t think people understand where their data is or how it is stored. Rather than wagging our fingers at the victims of a sex crime, anyone involved in data protection and security should concentrate on educating the world about the risks. I think the big tech companies like Google, Apple and Facebook would be uncomfortable with this idea, which is why security and sharing are presented as such tedious, impenetrable topics. They don’t want more informed use of their services, they just want the data like everyone else. The defaults for sharing and online storage, for location and tracking, for a whole variety of privacy invasive settings should be set to OFF. Activities involving risk should be a conscious choice, not an accidental side effect of living in the 21st century.

Peeping Tom

There have already been excellent articles about the antics of contributors to the Facebook page ‘Women Who Eat on Tubes’, including one by Sophie Wilkinson, one of the women who has been targeted, another by the Irish data protection lawyer Fergal Crehan, and a surprisingly sympathetic interview in the Telegraph with the page’s progenitor, Tony Burke, in which he insisted he is not a weird deviant. Burke made an unsympathetic and discourteous defence of his activities on Radio 4’s Today programme this morning, claiming that his was an artistic project for the cognoscenti and, in any case, that there is no right to privacy in a public place. There has been some measure of privacy in public places at least since Mr Peck won his Human Rights case in 2003, so perhaps Mr Burke hasn’t been keeping up with recent events.

If you haven’t seen the Facebook or Tumblr pages (I understand the Facebook page is gone), they comprised pictures of women eating on the Tube, taken without their consent by men. These women put things in their mouths, know worra mean, eh? The nudge-nudge-wink-wink voyeurism of WWEOT isn’t hard to spot, but I have another point. WWEOT breached the law.

Long-standing readers of this blog (and DP anoraks) will know that Data Protection offers an exemption in Section 36 for ‘personal, family and household’ uses of personal data. Private citizens are entitled to take photographs in public places, have CCTV on their houses, keep personal diaries about their neighbours, indeed anything they feel like doing. The DPA is not intended to regulate the ordinary person’s activities. However, the exemption has limits. In 2003, the EU Court of Justice found that a Swedish citizen, Mrs Lindqvist, who published admittedly trivial details about her fellow churchgoers on a church website, had breached Swedish data protection law: her publication of the information on the internet robbed her of the domestic exemption, the Swedish equivalent of our Section 36. As I have written before, the UK Information Commissioner has always studiously pretended that the Lindqvist decision didn’t happen, but the UK Courts have on at least one occasion described this position as ‘absurd’.

The effect of Lindqvist – properly implemented – would be significant, particularly on social media. Any person who wished to publish an image of another person (or other personal data) on the internet would have to comply with the Data Protection Act. They would probably have to use such data fairly, they would need the consent of the individual (or some other condition like legitimate interests), and they would need to respond to subject access requests and Section 10 notices (which prevent processing that causes damage in some circumstances). There’s no question that, for those who want to put images of other people onto the internet, it would be a huge inconvenience. Lindqvist would also drag the Information Commissioner into an endless cycle of domestic disputes.

Two things here: first, the law is the law. The ICO has no serious argument that Lindqvist is not an accurate reflection of European DP law, and should do its job properly. Second, in my opinion, a person should have the right to sit on the Tube eating a banana, dance badly in a nightclub, fall asleep on a train or wear red trousers and be posh without their fellow citizens taking their photos and sneering at them online. If you want to put your own data onto the internet, a proper implementation of Lindqvist would be no hindrance. But if you can explain to me why other people (anyone, female or male) should lose their right to be left alone because you have a smartphone and you want to use it, feel free to drop me a comment.

And so back to WWEOT. As I understand Lindqvist, Mr Burke, as the creator (or – no doubt – curator) of the group, lost the Section 36 exemption merely because of the publication, as have all of his contributors. At the very least, he should be making a Data Protection notification and paying his annual £35 to the Information Commissioner. Of course, this would mean that every Facebook user and blogger who publishes data about a third party should do the same. Let me be clear: I think they should. However, there is a more concrete reason why WWEOT fails the domestic purposes test. Mr Burke is clear that he sees Women Who Eat on Tubes as an artistic endeavour. So that’s domestic purposes thrown out of the window and bouncing down the road in the rear-view mirror. He’s covered by the DPA.

There is another exemption that Burke and his compadres may be able to use. Section 32 of the Data Protection Act allows a fairly broad exemption from much of Data Protection if the data (i.e. the photographs) is being processed for the special purposes – journalism, literature and art. Using Section 32 requires the Data Controller (Mr Burke) to ‘reasonably believe’ that compliance with any of the DP principles is incompatible with the special purposes. It’s possible that Burke might be able to argue that the public interest in the publication of his non-consensual images means the principles don’t apply, but I think he should be made to do so.

But if the DPA applied to his images, every woman featured on the site would be – and should be – able to test his arguments, and force him to justify the overriding public / artistic interest either to the Information Commissioner (who might run a mile) or the Courts (who would probably make a sensible decision). This should not just be a matter of public debate about whether WWEOT is art or reportage. There is a legal method to test Burke’s assertions that the women could have used – and, if an incarnation of WWEOT still exists, still can use.

There is a much wider point to be made about WWEOT and the attitudes that lie behind it that is probably another blog entirely. Nevertheless, the casual intrusion into people’s ordinary daily lives that the internet and smartphones have made possible allows all manner of bullying, stranger-shaming and plain old-fashioned voyeurism. Until we stop looking at what we can do on the internet, and start deciding what we should do, I think clunky tools like the DPA should be employed far more aggressively by the people who find themselves unwillingly in the camera’s lens.