The Nudatrons* are back in the news again, as the Equality and Human Rights Commission has warned that their use in UK airports may be illegal, with a primary concern about how individuals might be selected for scanning as opposed to a traditional metal detector scan. In response, the government has stated it is carrying out an equalities impact assessment, and the Department for Transport has published an interim Code of Practice for body scanning.
The debate over the acceptability of body scanners, which were introduced without public consultation after a passenger’s failed attempt to explode a package hidden in his underpants at the end of last year, has fudged together a number of separate and complex security and privacy issues, including:
- whether security checks should be applied equally to every passenger, or just to a selection of those passengers;
- whether or not it is acceptable to select individuals for security checks based upon their profile (age, gender, ethnicity, religion, behaviour, journey, travel documents);
- whether it is acceptable to scan children or others who might not be able to give valid consent to the process;
- whether individuals should be able to select the primary security check mechanism (scan, pat down);
- which scanning technologies are acceptable for use, and what privacy-enhancing controls might be placed over them.
The acceptability of profiling in this context is a very sensitive issue. Common sense dictates that an elderly lady might be less of a threat than a young man, and therefore less likely to be worthy of a full scan. This enables guards to expedite both her transit through security, and that of the other people who no longer have to wait for her to pass through the scanner. The problem is that the topic is much more subtle than that, and as soon as we introduce ethnicity, religion etc then there is a reasonable likelihood of discrimination issues arising. Some have argued that this could be countered by creating a scanner ‘fast track’ – in other words if a passenger is selected for scanning then they actually transit security faster than others – but that’s not going to solve the problem.
This isn’t an issue that will be resolved easily, and by confusing it with the other points above, everybody loses. The Department for Transport’s interim guidance sidesteps the whole problem by dictating that individuals must be selected at random, rather than as a result of profiling, but then omits to deal with opt-out or privacy-enhancing technologies (PETs):
- it seems reasonable that a passenger should, under the terms of the Data Protection Act, be able to select their preferred primary security check. If a passenger prefers to be scanned rather than patted down, so be it; likewise, if they want a pat-down in place of a scan, that seems reasonable. If guards feel that there is a reason to demand a secondary security check, then of course they should be able to do so. Likewise, if a parent or legal guardian does not wish to have their child scanned, then they should be able to opt out on their behalf.
- scanners are now available with a range of PETs that remove body detail, thus preserving privacy, but the guidance does not mandate that these technologies be used in favour of more invasive displays.
With the ever-increasing hysteria about terrorist threats (which will only increase as the 2012 Olympics approach), we are likely to start seeing these scanners outside of airports, and the DfT has a duty to revise its guidance to resolve the issues around opt-out and PETs – there is no reason why that can’t happen immediately. Privacy is essentially resolvable, and should not be mixed up with equality, otherwise both rights are eroded in much the same way that privacy is being eroded when mixed with security needs.
The most important issue upon which we must focus is that of profiling, and this needs a much more informed, facilitated public debate before the equality problems can be properly understood and hence resolved.
* [A term I first heard used by Charlie Brooker]
The project is based upon information theory – using entropy to singulate an individual from a wider population by combining facts about smaller groups. For example, a postcode does not disclose a single individual, but coupled with a gender and a month of birth it’s quite possible to pick out the subject. This singulation is measured in entropy, where 1 ‘bit’ of entropy indicates 2 possibles; 2 bits of entropy indicates 4 possibles; 3 bits of entropy indicates 8 possibles, and so on. As EFF point out, with approximately 7bn humans on the planet, an entropy of 33 bits should suffice to identify one individual in that population.
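The arithmetic behind that claim is simple to sketch. A minimal Python illustration, in which the attribute cardinalities (the postcode count in particular) are rough assumptions for the sake of the example rather than real demographic data:

```python
import math

# Each independent, roughly uniform fact contributes
# log2(number of possible values) bits towards singling someone out.
facts = {
    "gender": 2,                 # 1 bit
    "month of birth": 12,        # ~3.6 bits
    "full postcode": 1_700_000,  # ~20.7 bits (rough UK postcode count, assumed)
}
total = sum(math.log2(n) for n in facts.values())
print(round(total, 1))  # ~25.3 bits, close to the ~25.9 needed for one in 62m

# And the EFF's headline figure: ~33 bits picks one human out of ~7bn.
print(round(math.log2(7_000_000_000), 1))  # 32.7
```

The bits simply add because each fact divides the remaining candidate pool, which is why a handful of individually innocuous attributes can together be uniquely identifying.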
So, pretend you’ve set up your browser to be relatively anonymous – you’ve tied down the cookies, you have a dynamic IP address, you don’t enter personal data unnecessarily – you’d imagine that your browsing is anonymous? Not necessarily. EFF’s Panopticlick test tool looks at a number of qualities of your browser to estimate its entropy, including:
- revealed browser ID (IE, Safari, Firefox, Chrome etc);
- the plugins available within the browser (Flash, QuickTime etc);
- time zone;
- screen size and color depth;
- installed fonts;
- cookies enabled.
The Panopticlick engine scores the browser based on these and other attributes, and calculates an entropy. So, my Safari browser holds at least 19 bits of identifying information, and that is sufficient to render it unique among the 0.5m+ browsers that have visited the site so far.
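The scoring behind that figure is straightforward self-information: the rarer your combination of attribute values among visitors, the more bits it contributes. A Python sketch, with frequencies invented purely for illustration (Panopticlick derives the real ones from its visitor logs):

```python
import math

def surprisal_bits(frequency):
    """Self-information of an attribute value, given the fraction
    of visitors who share it: rarer values carry more bits."""
    return -math.log2(frequency)

# Frequencies below are invented for illustration, not measured data.
observed = {
    "user agent string": 1 / 300,
    "plugin list": 1 / 150,
    "timezone": 1 / 8,
    "screen size and colour depth": 1 / 20,
    "cookies enabled": 0.85,
}
total = sum(surprisal_bits(f) for f in observed.values())
# n bits of identifying information makes a browser unique among
# roughly 2**n visitors, assuming the attributes vary independently.
print(f"{total:.1f} bits, one in roughly {2 ** total:,.0f}")
```

Note the independence caveat: attributes such as user agent and plugin list are correlated in practice, so naive addition overstates the true entropy somewhat.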
The implications are important: a site utilising this approach can, to a reasonable level of probability, identify continuity of relationship between site visits, even without planting a cookie on the user’s machine. Once the user is known, the site can track that user on return visits, even if they refuse to disclose who they are. Users who wish to protect themselves need to use ‘non-rare’ browsers (e.g. Firefox on Windows), with a minimum of plug-ins, and exploit anonymisation tools such as TorButton.
There are of course some flaws in this approach; fonts and plug-ins may change, and the engine appears to think that the browser is unique regardless of how many times it returns (thanks to Jerry for pointing that out). But either way, there’s a fresh level of anonymisation needed for users who wish to browse without being tracked.
The BBC reports that the magistrates’ yearbook for Norfolk was accidentally sent to a Category D prison for printing, thereby potentially exposing information about all of Norfolk’s magistrates to the prisoners. Whilst the details do not include home addresses, one magistrate is quoted as saying:
“It doesn’t particularly concern me as anyone can look in the phone book and find my name, but some magistrates could be really worried – there will be those who will be terrified,” he said.
Magistrates aren’t to be confused with judges – most are lay people who have volunteered to take on the role part-time, have been selected by their peers, and undergone training about their duties, but are not themselves from a legal background. They tend to deal with small cases, but will often have seen no end of minor crimes against individuals and property pass before them during their careers. What shocks me here is the response of the Court Service:
“All hard copies and electronic copies of the document at the prison have now been destroyed. The screening process for documents to be printed at the prison workshop has been made more secure to ensure this does not happen again.”
Has it not occurred to the Court Service that maybe the real problem here is the existence of a book of details in the first place? That a new-fangled piece of technology called the Internet might allow authorised users to access only the information they require, without distributing books of it just in case it comes in handy? With technology attitudes such as this, it’s little wonder that the likes of the Libra project suffered so many problems…
The Information Commissioner has confirmed that from April he will have new powers to fine organisations up to £500,000 for wilful or reckless misuse of personal information. Actions likely to incur the full wrath of the ICO will include:
- Where an individual becomes the victim of identity fraud following a security breach of financial data by a data controller
- Where an individual suffers worry and anxiety that his sensitive personal data will be made public even if his concerns do not materialise following a security breach of his medical record by a data controller
- Where a marketing company collects personal data for one purpose and then, without the individual’s knowledge or consent, knowingly discloses the data to a third party for another purpose.
It’s good to see that the ICO finally has powers commensurate with the importance of the role. To date, the relatively small penalties available to the Information Commissioner have had little deterrent effect upon companies that are negligent or wilfully abuse personal information, and in the most serious cases it has fallen to other regulators such as the Financial Services Authority to enforce appropriate sanctions. Hopefully the new powers will engender a fresh respect for the Information Commissioner and the importance of protecting personal information.
There is however one particular area in which these new powers might be inappropriate: there can be only limited value in fining organisations that are funded by the public purse, since this simply impacts the final service delivery from that organisation. Where incidents occur in public authorities, penalties should be focussed upon the responsible individuals rather than the organisation as a whole.
The attempted Y-Front bombing of a US-bound flight by Umar Farouk Abdulmutallab on Christmas Day has been the dominant international news story of the past few weeks. The repercussions for privacy are significant: President Obama has been scathing about security controls and demanded reforms. Around the world, governments are rushing to purchase full-body scanners that effectively undress passengers. As with any other spectacularly disproportionate and invasive new technology, we are assured that they are a panacea for all our security concerns, and that privacy controls will be adequate. But this time people are being taken in by the rhetoric.
I’ve noticed a somewhat grumpy trend in my blogs over recent months: the problem is that there’s so much to grumble about. The public sector appears to be in meltdown with accusations of ministers pursuing a ‘scorched earth’ policy in the build-up to the election, vendors milking the last drops they can before the government finally notices the money’s run out, and the Chancellor trying to paper over the cracks with tiny little budget cuts here and there. Outside of government we also see privacy, security and identity failures, but market forces tend to correct them pretty quickly. All these problems make writing ‘bad news’ stories about as tough as shooting really big fish in a really small barrel.
So, my resolution for the New Year will be to run a week of ‘good news’ stories. I want to celebrate the unsung successes in delivering user-centric, privacy-friendly, secure systems that arrive on-time, on budget, and meet everyone’s expectations. Heck, I may even gift a pair of Xmas socks to the winner.
Please could you nominate winners in the comments below? I’m not interested in off-the-shelf vendor products that happen to do the job well, but public-sector or bespoke commercial systems (no matter how big or small) that have understood and upheld the users’ wishes in the handling of personal information. Hopefully we’ll get sufficient nominations to make this into a regular thing – and to give you a break from the normal irritable service…
The Information Assurance Advisory Council has been quoted as threatening government with a refusal to change existing contracts in order to comply with the requirements of the Hannigan review of Data Handling. In a statement bizarrely reminiscent of the public spat between the Tories’ David Davis and Intellect’s John Higgins…
Neil Fisher, vice chair of the Information Assurance Advisory Council, a think tank of government, industry and security bodies, said it is unrealistic of the government to attempt to change terms on contracts that still have years to run. Changing contracts stirs up the difficult issue of who carries the risk of something going wrong, he said.
“Industry will be quite robust about this,” he said. “They are not a charity. They do work for payment and they do it against clear instructions from the client, and that is the way these relationships work. By doing so, they limit their liability.
“What the government would like is for industry to try to meet it half way somewhere. I’m not sure that’s a clever way of doing it or one which contractually will be enforceable,” he added.
This ridiculous situation where major System Integrators who have profited massively from public sector contracts deny any moral or legal responsibility for data security has to come to an end. CESG, the Cabinet Office and the MoD have been issuing mandatory security instructions for many years, so failure to implement is not because they had nothing to comply with. All the major SIs are represented in the various CLAS and LIST X directories, so they cannot claim to be unaware of their duties.
That leaves us with three main ways that this mess could have arisen in the first place:
- government failed to specify the need for security controls (and the SIs chose not to enlighten them during the procurement process);
- senior civil servants bypassed the accreditation process by signing off the system without sight of an accreditation documentation set;
- the oligopoly of SIs have collectively failed to represent security needs properly in their bid documents in order to present the ‘lowest possible’ price to government.
Regardless of which circumstance created this situation – and it is most likely a combination of the three – it has to be fixed. Where civil servants are shown to have been negligent by not specifying proper security controls, clearly it is the government’s responsibility to accept reasonable costs to remedy the problem. Where the SIs simply failed to implement controls that they should have known were needed, then the government will have to crack out the lawyers and knuckle down for a fight.
And to stop this mess arising again? We require more prescriptive security controls issued by a single national authority for information assurance – controls that cannot be signed away by a Whitehall mandarin. We need to completely reform government procurement to break the stranglehold that half a dozen vendors have over the lion’s share of public ICT spending. And we need to recognise that these so-called industry representative bodies do not speak for all of the UK ICT vendors – just those that are doing very nicely out of the current protection racket where nobody gets protection at all.
It had to happen sooner or later – the police have been accused of arresting people solely for the purposes of adding further ‘suspects’ to the National DNA Database. It’s time to revisit why the current approach threatens to undermine the whole worth of DNA as an evidential tool.
I wrote on the topic of the appointment of the new Identity Commissioner, and since then a comment has literally flooded in. I’m indebted to a spokesman for the Home Office, who writes:
Today’s Computer Weekly blog entitled ‘The Identity Commissioner speaks – and drops the Home Office in it’ does not contain a comment from the Home Office.
Please could you include the following comment.
A spokesman for the Identity and Passport Service said: “The appointment of the Identity Commissioner was conducted in accordance with the principles and standards for public appointments which included public advertisement and the use of an executive recruitment agency.”
That’s certainly put my mind at rest, and I’m of course happy to correct my original article. I’m delighted to know that the rules were followed to the letter in the same way as they were when MPs claimed their expenses, HMRC tried to send CDs to DWP, and we went to war with Iraq.
Chris Williams of the Register has written an excellent analysis of the recent jailing of a schizophrenic man for refusal to release cryptographic keys to the police. This was a scenario foreseen when RIPA was introduced, but is the first high-profile example of it being used to imprison an individual who demands the right to privacy.
I won’t attempt to recount the detail of his piece, but in short, the man (referred to as JFL) was arrested when returning to the UK from France because he was carrying a hobbyist rocket (without motor). A subsequent investigation revealed that he had been released without charge after attempting to enter Canada, and that he was returning to the UK to meet with Customs officials following a missed bail appointment. Forensics found trace amounts of explosive on him (9 nanograms, where 5 is the normal threshold below which evidence is ignored). He’d also had a number of hard drives, USB thumb drives and other materials delivered to addresses in the UK. So far, so good – there’s at least something to suggest suspicion.
However, the man refused to release the PGP keys to decrypt files on the drives, and so police sought a section 49 notice under RIPA Part III to give him a limited amount of time to release the keys. He did not attend his bail, and instead moved around until police caught up with him in a raid several weeks later. At this point, still insisting that he would not release the keys, which he claimed protected information relevant to business interests, he was charged with 10 offences under RIPA Part III. Police argued the files “could be child pornography, there could be bomb-making recipes.” JFL maintained his silence as a matter of principle.
JFL was tried and sentenced, although the suspicion of terrorist activity was dropped long before the trial, and no pre-sentence report on his mental health was requested by the court. He was sentenced to 9 months’ imprisonment and is currently in a secure mental unit after being sectioned whilst serving his sentence.
This is the British legal system at its very worst. Whilst there may have been grounds for an initial investigation, once it became clear that the only person JFL might be a danger to is himself, the police chose to pursue him under the catch-all of national security/child protection that is used to justify most failing and botched policies and central government projects. Even the lay person would recognise that a schizophrenia sufferer would probably misinterpret the authorities’ intentions and respond in a seemingly irrational manner to surveillance and investigation. This all smacks of saving face and hitting targets, and I trust that someone out there will use FoI rights to demand details of the costs involved in this prosecution.
In the meantime, remember that if you encrypt files and then lose the keys, this could happen to you too – the law won’t differentiate between refusal to disclose and failure to disclose. Welcome to the age of transparency.