When IT Meets Politics

Jun 29 2012   8:16PM GMT

Libor also broke the first rule of Information Governance

Philip Virgo

Tags:
Barclays
Data quality
Decimalisation
Eurim
Libor
Open data

Until two days ago I was among those who believed that “Libor” was an authoritative index based on actual transactions, not on unchecked estimates collated once a day from traders with a vested interest, personal as well as organisational.

I did my original systems analysis training during the run-up to Decimalisation in 1971: the first nation-wide opportunity for large-scale computer-assisted fraud.

We spent more time learning about how to ensure that the data going into the system was accurate and that what was reported was fit for purpose than about the technologies we would use to process it.

The first “rule” was that unless the data was provided by those who had both a vested interest in its accuracy and the knowledge and opportunity to check that it was indeed accurate, it was likely to be at best full of random errors and at worst systemically misleading.

I had already (in 1969) had occasion to see the truth of that statement with regard to the statistics used by the government of the day, when a blip caused by the inclusion of three years' exports for my previous employer in a single month's balance of payments figures led the then Chancellor of the Exchequer to say, erroneously, that Britain had “turned a corner”.

If it is correct that Libor was indeed based on subjective inputs from traders as to what they would like to have seen, as opposed to being an objective by-product of processing actual transactions, then the real question is “how on earth was that allowed to happen at all, let alone go on for so long?”
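To see why the distinction matters, here is a minimal Python sketch of the two approaches: a trimmed mean of self-reported quotes (the widely reported Libor process discarded the highest and lowest quartiles of submissions and averaged the rest) set against a volume-weighted average of rates on actual transactions. The function names and figures are invented purely for illustration; this is not the official methodology, just the shape of the difference.

```python
from statistics import mean

def quote_based_fix(submissions):
    # Libor-style: rank the self-reported quotes, discard the top and
    # bottom quartiles, and average what is left. The number rests on
    # what each trader says, not on what actually traded.
    ranked = sorted(submissions)
    q = len(ranked) // 4
    return mean(ranked[q:len(ranked) - q])

def transaction_based_fix(trades):
    # Alternative: a volume-weighted average of rates on real transactions,
    # so the benchmark is an objective by-product of actual business.
    total_volume = sum(volume for _, volume in trades)
    return sum(rate * volume for rate, volume in trades) / total_volume

# Invented figures, purely for illustration.
submissions = [3.10, 3.12, 3.13, 3.14, 3.15, 3.16, 3.18, 3.20]          # percent
trades = [(3.14, 50_000_000), (3.15, 120_000_000), (3.13, 80_000_000)]  # (rate %, volume)

print(f"Quote-based fix:       {quote_based_fix(submissions):.4f}%")
print(f"Transaction-based fix: {transaction_based_fix(trades):.4f}%")
```

A trimmed mean filters out a single wild outlier, but it cannot manufacture objectivity: if the submissions themselves are guesses, or worse, then the average of guesses is still a guess.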

The honesty or otherwise of those involved in the process is almost irrelevant by comparison with the cavalier attitude to information governance. We can see that elsewhere, with regulators obsessing over data protection as opposed to accuracy and integrity. It is as though the errors in the patient record that led to the treatment that caused your death did not matter, provided the Caldicott Guardian was happy with its security.

Some of the assumptions underlying the Open Data White Paper mean this is not an academic, “post-mortem” question. Accurate and timely data is of great value. But a mash-up of garbage is toxic sludge.

Unless we once again take seriously the issues of data quality (and the disciplines of information governance) we are in danger of building a future based not merely on sand (silicon) but on quicksand (silicon-processed sludge).

Hence the critical importance of the recommendations in the report by EURIM on “Improving the Evidence Base: the Quality of Information”, published this time last year.



1 Comment on this Post

 
  • Adrian Seccombe
    Sadly my experience is that a large amount of the data that senior leaders receive will have been "massaged" to ensure that it matches their expectations. I suspect this is likely to be just as true in government. Consider the instance of the bucket of foul-smelling excreta delivered to the door of the headquarters building, and the evolving message as the story of the "manure" delivery was passed floor by floor to the top floor. By the time the message arrived in the board room, the company had taken delivery of a powerful growth agent! The top floor were, however, clearly to blame: any less-than-positive stories, of even the most horrendous events, seemed always to end in the shooting of the messenger. Messengers are quick learners!

    While I concur with your observation that massaged data is toxic data, the real crime occurs when recipients accept positive messages more enthusiastically than negative ones. We must protect the FULL security of our Information Assets, especially as the opportunity is most often in the negative story.

    This is one of the key reasons that the Jericho Forum has encouraged the development of Microperimeterisation. As you rightly say, security is not just about maintaining data confidentiality; indeed there are a number of cases where the integrity or availability of the data is far more important than protecting its confidentiality. Transparency of both the data and the processes that touch it is often key to the effective management of Information Assets.

    The primary data management principle, "Capture Data Once, at Source", is sadly too often ignored.

    We must however remember that the very act of measuring will always impact accuracy and, worse, can negatively impact behaviour, as this blog points out.

    For me we have a values-based problem that is best overcome by an effective mix of Transparency, Agency and Auditability. It is not the fault of the computer, as you seem to imply in your "silicon processed sludge" analogy. Human failings are at the heart of the issue. One of the most powerful values principles of a previous employer was simply stated as: "If you are happy for your family to read of your actions in a front page headline, you are likely doing the right thing."

    Transparency is indeed a powerful modifier of human behaviour. In this context, how can we use IT to enhance our societal values, rather than, as is seemingly happening, continuing to allow the use of IT to erode the values of our society?

    (To be clear, I am not in favour of large state-based registers of citizen behaviour, available only to the state; that would be far too 1984!)

