When IT Meets Politics


February 16, 2015  12:32 PM

Is VATMOSS a serious VATmess or a storm in a teacup?

Philip Virgo
.gov.uk, E-commerce, FSB, VAT, VATMOSS

I have received a number of e-mails asking me to blog on the issues or “do something”, but none from those actually engaged in running the businesses supposedly affected. Moreover, all the lobbying appears to be behind closed doors. The guidance on Gov.uk indicates that those who are already registered for VAT and use a payment service need do very little. That from the FSB reinforces this message.

Is it correct, however, that even micro-businesses which use the same “legal identity” for sales of on-line products and services in the UK as for overseas will now have to register and charge VAT to their domestic customers, thus putting prices up by 20%?
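If that reading is right, the arithmetic is simple enough to sketch. The snippet below is purely illustrative: the 20% standard rate comes from the question above, while the £5 product and the choice between passing the tax on and absorbing it are my own assumptions, not anything taken from the guidance.

# Illustrative only: the effect of charging 20% UK VAT on sales that a
# previously unregistered micro-business made VAT-free. The £5 price is hypothetical.
UK_VAT_RATE = 0.20

def price_with_vat(net_price, rate=UK_VAT_RATE):
    """Price the customer pays if the new VAT is passed on in full."""
    return net_price * (1 + rate)

def net_if_absorbed(gross_price, rate=UK_VAT_RATE):
    """What the seller keeps if the sticker price is held constant."""
    return gross_price / (1 + rate)

ebook = 5.00  # hypothetical £5 download
print(f"Passed on: customer pays £{price_with_vat(ebook):.2f}")  # £6.00
print(f"Absorbed:  seller keeps £{net_if_absorbed(ebook):.2f}")  # £4.17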

If so, will this lead to a halt in off-shore sales, a flight off-shore, or guidance from the suppliers of accounting and payment services on how to legally and cheaply split the business?
 
In other words, is this attempt to reduce VAT avoidance a serious obstacle to the UK’s future as a location for innovative on-line start-ups, or will it simply lead to more work for tax advisors and a rash of competing “VAT apps” for teenagers (and sub-teenagers) aspiring to sell their own games instead of pirating those of others?

I have no idea – hence the question.

February 14, 2015  7:42 PM

The Future of Technology

Philip Virgo
Agile, amstrad, DSDM, Filetab, Gartner, IPR, RAD, RPG, vapourware, Visicalc, Word

[Image: The future of technology]

Acknowledgements to SWardley – although I changed item 18 to read “XYZ, it’s so passé and never really worked anyway. ABC is where it’s at – and ours is great”.

I would, however, also comment on the timescale.

When my team at the NCC Microsystems Centre  invented and defined the term “vapourware” (back in 1983), I had in mind a timescale of barely five years from birth to death of most “buzzword technologies”.

Despite the efforts of US copyright and patent trolls to slow the pace of innovation, I think that 35 years is a bit long for the life cycle of a terminology.

P.S. I have just done some “research”.

The pace of change in the early 1980s with regard to micro-computer products was unusually rapid and the life cycles of even market leaders were much shorter (e.g. the rise and fall of Visicalc, Concurrent CP/M and the Amstrad PCW) than today.

Taking a longer view, it took about a decade for the term “Agile” to replace the acronym DSDM which, in turn, had taken about a decade to replace RAD (Rapid Application Development), which was a reinvention of the approach behind Filetab (alias RPG, alias FPL etc.).

Filetab did indeed last 35 years before the name finally vanished from the market – with the Java version, to handle applications inter-operability within mobile phones, hidden from view.

Meanwhile Windows and Word are both approaching their 35th birthdays.

I therefore thank Mr Wardley for his insight and leave readers to ponder for themselves what determines the life cycle of the products, services, technologies (and terminologies) of today.

Is it

– entrepreneurs finding new ways to meet user needs?
– investment in research and development? public or private?
– government supported technology (and transfer) programmes?
– corporate spend on IPR lawyers? 
– ???



February 10, 2015  10:30 AM

Who do you trust to retrain existing staff to handle that which you cannot risk to contractors?

Philip Virgo
AIM, Alex, Apprenticeship, Big Data, Compliance, crisis, cybersecurity, e-skills, GREXIT, NED, risk, SIGINT, Skills, Turnover

January usually sees a sharp rise in recruitment effort across the financial services industry, to replace those leaving at year end or who hand in their notice after the Christmas break. This year recruitment effort is down because of the uncertainties caused by the crash in oil prices and the expected cost to the EU of preventing Grexit. Except for risk and compliance staff – where staff turnover continues to spiral upwards as supply falls ever further behind demand. According to Alex on 9th February (that most authoritative of sources on City developments) there are now 17,000 compliance officers getting in the way of doing business.
Those who have not yet taken action to secure their staff must therefore do something different – now. GCHQ has shown the way by announcing 50 cybersecurity apprenticeships for school leavers applying by 15th March. Meanwhile the Tech Partnership cybersecurity internship programme has had an impressive take-up. E-mail Howard Skidmore if you wish to bid for some of those not yet matched (believed to be fewer than 20) or to offer placements for the next intake.

The rest of you also have to consider who you will trust to retrain your existing staff, including users, to handle those roles which you cannot afford to contract to those you do not know.
Before Christmas I blogged on the expectation that 2015 will be the year of the compliance-created collapse in cyberconfidence.
 
Over 60% of significant security incidents (data breaches, fraud, network collapse etc.) involve insiders, albeit digititis (e.g. mistakes with maintaining legacy systems overlaid with fashionable vapourware) and ignorance (linked to equally vulnerable identity and access control processes) remain more common causes than malice or criminal behaviour.

Debate on how to improve the security of businesses or their customers is almost entirely driven by those selling technology or outsourced services and processes to help tick compliance boxes. But the travelling compliance “expert”, who stays long enough to help you tick the latest regulatory boxes and collect the understanding and credentials to open the trapdoors in your security firewalls, is now by far the biggest single risk. He, and it is usually a “he”, is an even greater (and more unnecessary) risk than short-stay security “consultants”, help desk staff or cleaners. Albeit the “over-ambitious chief executive” who ditches due diligence in his (it is nearly always a he) dash for growth remains a greater absolute danger.

I recollect conversations with those then in charge of “risk” at BP when they came to try to audit safety and security systems along the supply chains of the organisations they had acquired in the US as the basis for their entry into the Gulf of Mexico.  Their worst fears came true with the incident which came close to destroying the entire business while enriching a whole generation of Southern lawyers. I recollect similar conversations after the Chief Executive of RBS cut short due diligence with regard to his US acquisitions, before embarking on the take-over too far which did destroy the business.

Due diligence along the security (including risk and resilience) supply chains of organisations being considered for take-over is now big business for the law and audit practices of the City of London, and their demand for the necessary skills is helping fuel the current salary spiral and staff merry-go-round which threaten to destroy the security of those who cannot ensure the loyalty of those who manage risk on their behalf.

A couple of weeks ago I thoroughly enjoyed an evening with the Management Consultants Livery Company when I helped open a discussion of the impact of “Big Data” (which I view as a subset of the current state of “Management Science“) on the Management Consultancy profession. I was interested to learn that the market leaders all have a very strong focus on training their own staff, rather than outside recruitment, even though they expect to lose more than half within 2 – 3 years. The following morning I attended an excellent NED Forum on the current state of the Dark Market and the analysis and intelligence services now available. I was interested to learn that, once again, the market leaders train their own analysts because the necessary Information Science disciplines are missing among the many recruits available from law enforcement or the military.

It is perhaps as well to remember that the cryptography operations of Bletchley Park were quite small compared to the Sigint (alias data analytics, or “Information Science”) operations which also maintained the symbiotic German Order of Battle (even down to the level of working out that two radio operators shared a girlfriend called Rosa). The Sigint operation was entirely female and some of the techniques used have not yet been declassified – because they underlie that which even Snowden did not discover and leak.

Hence the importance of ensuring that update training in Management Science, alias the disciplines behind “Big Data”, is available, when and where needed, to give existing security staff the skills they need to help organise intelligence-led security. It also makes good sense to trawl existing user staff, particularly female staff, for the necessary aptitudes before going outside for new recruits. When I ran the original Women into IT Campaign (1988 – 92) one of the surprises (at least to me) was the discovery that, on average, women stayed significantly longer than men, especially if offered flexible working conditions and other support to cope with family responsibilities (including elderly relatives, not just children).

Most compliance roles do not need cryptographic aptitudes or big data training but, if the exercise is to be more than just ticking the regulatory boxes, they do need an understanding of the business so as to ensure the compliance routines reinforce good customer service and do not get in the way of profitable business. The current demand for compliance staff, and the rate of turnover among those who have no good reason for loyalty, mean that it is often both cheaper and quicker to retrain long-stay user staff, particularly those who might otherwise become expensively redundant, than to recruit externally. The exercise also gives an opportunity to screen for those who might be brought into the main security team to help supervise the contractors handling those technical and support operations which do not need to be in-house.

But who do you trust to deliver that training? This is not a trivial question of “competence”.

Trainers, like compliance officers, can make trusted contacts across your Chinese walls. I have therefore agreed to help the Tech Partnership identify those who are trusted to deliver training in other sensitive areas so that they can be asked if they are interested in helping specify and deliver modular update training in some of the areas identified as being in critical shortage, such as Identity and Access management (from customer mobiles and bring your own device to tiered access to complex systems and multiple locations, such as a global financial institution or an international airport) or the use of big data (alias management science) techniques to identify risk. Then there are the skills needed by compliance staff, the selection and training of whom should also be used to identify your next generation of security staff. I gave a longer list last year of the skills gaps based on my work for e-Skills, but we have prioritised since.

Once again, email Howard Skidmore, or me, if you are willing to suggest who you would trust. Comments on who you would not trust are also most helpful.


February 7, 2015  10:29 AM

Turning the Super Tanker: Francis Maude’s achievements to date – but it’s not over yet.

Philip Virgo
Cabinet office, cloud, G-Cloud, Software as a Service

Francis Maude has announced that he is not standing again at the next election but, commenting on his work with the Chancellor to drive forward the reform of public service delivery, said “there is much to do – before the election and after – to ensure the reforms are irreversible”. When he gave thanks to his team at Sprint 15 he concluded with some thoughts on the scale of the challenges ahead, particularly that of changing the Civil Service culture to one of “fail small, fail fast” so as to do a much better job of using evolving technologies:

– To put people first, with services that are simpler, clearer, faster.

– To do more, and better, for less. And we’ve shown you can do it.

– To build a truly 21st century digital government, capable of leading a world-beating digital economy.


We can see how difficult that change will be with the strictures of the Public Accounts Committee on a Secretary of State who over-ruled his officials’ plans to follow the traditional approach of “fail big and expensive, but alibi’ed by massive consultancy and outsource spend so that we are not to blame”. At least the DWP officials and suppliers wasted a couple of £billion less than their counterparts in the NHS (Tony Blair’s National Plan for IT in the Health Service), before the Minister finally forced them to take seriously his “request” (Ministers cannot easily over-rule their officials on matters of implementation) to pilot the new processes with real claimants before scaling them up for mass roll-out.

After reading Francis Maude’s blog to Digital Leaders, “Leading Transition in a Digital Age“, I thought back to his comments at the Conservative Party Conference in 2009 on his plans for their first 100 days. The largest room in the conference hotel was standing room only. Afterwards the responses ranged from the enthusiastic to the sceptical. Those from the policy wonks and industry lobbyists were rather different to those of IT professionals and of current and former public servants with decades of experience trying to bring about change. Most of us have since been shown to be both right and wrong.

We will never know what would have happened had the Conservatives had a working majority, but what Francis Maude has achieved under a coalition government is remarkable. The end of his first sentence in the Digital Leaders blog is, however, absolutely accurate: “we are just getting going”.    

There are two views about what “leaders” (corporate or political) should do in their first “hundred days”. One (usually corporate) view is that you spend it quietly on tours of inspection, discovering just what it is that you have inherited, who is competent and willing to help you change it and who should be removed before they can stop you and your allies. The next hundred days is when you get rid of the third of those reporting to you who will never be part of any solution – and set about motivating the remainder to do the same to the tier below them. The opposite (usually political) view is that you have to prepare in advance and hit the ground running, because you will never have the same opportunity again.

Unfortunately politicians rarely have a choice. “History shows” that those who try the first option nearly always get bogged down with day-to-day pressures and achieve little if anything. The situation that the coalition government inherited in 2010 gave them no choice. 25 years of “over-enthusiastic outsourcing”, to put it mildly, meant that central government was bleeding to death and had lost the skills to stop the haemorrhaging. It had ceased to be competent either to plan and deliver change itself or to get results, let alone value for money, from those it contracted to do so. 

The task was not just to turn the Exxon Valdez (undermanned, with the captain drunk in his bunk and the radar broken) before it hit Bligh Reef. It was not just to refit the bridge with controls that were fit for purpose. It was to simultaneously support the conversion of a 20th century oil tanker into a 21st century cruise liner to serve an ageing population, with the owners mortgaged to the hilt, while the potential passengers had not saved enough to pay their fares, let alone in advance.

Looking back, we can see that in his first hundred days Francis Maude set in motion processes that have helped slow the rate of bleeding (the burden of inherited, inflexible outsourcing contracts and PFI deals means that it has not yet stopped) and enabled monitoring and control systems that may soon be fit for purpose. The savings to date are, however, still modest compared with the £billions now in sight: for example from pooling public sector telecommunications spend to enable better, shared, services at lower cost. Hence the importance of the mapping exercise to which Francis Maude referred in his Sprint15 speech: the first visible product from the Digital Task Force created last year.

The achievements of the GDS to date are impressive, particularly given the start point, but in his comments to Sprint15, Francis Maude was quite clear on the scale and nature of the task still ahead – including to help achieve the £10 billion of savings targeted in December.

Provided the next government continues the process, I believe history will show that, even if he does not himself continue the task from the House of Lords, Francis Maude has not only managed to slow the tanker in time to enable refloating without too much spillage, but has also succeeded in beginning the long slow process of rebuilding the skills base of the Civil Service – so that it can indeed implement the changes necessary, provided the political driving force remains in place.

We should not, however, under-estimate the level and nature of ongoing opposition to the introduction of adequate (let alone good) practice and governance to a world of revolving doors between retiring civil servants and suppliers who have grown fat from selling consultant-planned “big” change programmes to their departments. The latter “know” how things should be done. Their “vision” does not, for example, involve the mandatory use of procurement frameworks which require adherence to open inter-operability standards. These are, however, essential for a world in which innovative small firms and co-operatives of users are expected to produce the pieces for incremental “jigsaw solutions” that can and will evolve, using agile methodologies and changing components from changing players, as needs and technologies change.

Many, both officials and suppliers, would still prefer to delay change in the hope that the post election government will revert to the traditional spendthrift approach, mortgaging the future even further, if they cannot raise the net tax yield. Their careers, including post-retirement life styles, depend on preventing a world of incremental and evolutionary change. Their interests are mirrored by all those consultants, lawyers and lobbyists who are used to receiving big fees for big projects, whether they go right or wrong. The future for the rest of us and our children and grandchildren will be bleak if they succeed.

The good news is that at least some of the big beasts of the IT world, including some surprising names, are beginning to really get their heads round how to make serious money from supporting customers’ in-house teams and collectives of innovative SMEs and third-sector players in the delivery of solutions that use open, inter-operable, cloud-based software-as-a-service to deliver better service, at lower cost, more reliably, on positive cash flow. The bad news is that they are still a minority.

I hope that, freed of the need to fight a seat in the next election, Francis Maude will help ensure progress with what he has set in train right up until election day and then return, perhaps via the House of Lords, to continue the fight – having found a suitable successor to look after the voters of Horsham and answer parliamentary questions.


February 5, 2015  11:18 AM

Westminster debates Rural Broadband, meanwhile it’s quicker by bike less than a mile away

Philip Virgo
Broadband, City, Crapband, CTF, Defra, Hansard, Labour, Libdem, London, Notspots, rural, Soho, UKIP

The DEFRA Select Committee report on “Rural Broadband and Digital Only Services” was published on Tuesday. Yesterday 18 backbench MPs, almost all Conservatives from rural constituencies, contributed to a debate on “Rural Phone and Broadband Connectivity“. More would have participated, had it not run out of time. Those unable to speak included a London Labour MP who released her comments to the Evening Standard pointing out it was quicker to send a short advertising video from Soho to Covent Garden by bicycle. Meanwhile the defensive alliance of incumbents (BT, Deutsche Telekom and France Telecom) announced today is designed to enable infrastructure sharing, rather than additional spend.

I will stop there because I am just about to leave for a meeting hosted by Westminster Council bringing together local property owners (both public and private sector) and network operators to discuss practical co-operation in organising shared wayleaves and access agreements to cut the time and cost of addressing Inner City notspots and better serve small businesses and social housing complexes as well as some surprisingly affluent areas that are currently served by crapband (CRAP = copper, rust, aluminium and other pollutants, including radio noise if it is a wireless connection, between you and the nearest fibre connection).

I am getting bored with those who “admire” problems and find excuses for delay while they milk past investments. I am more interested in helping those looking for ways to work together to meet unsatisfied demand at affordable cost.

The participation in the debate yesterday and the points made (do read Hansard), as well as those made recently by the National Audit Office and Public Accounts Committee, to which I referred in my previous blog, indicate that this topic will feature in quite a number of local campaigns in the General Election. Those interested in helping refine and review the Conservative Technology Forum “F-Plan” can use the e-mail address given on the CTF Web Site page covering the Digital Infrastructure Study to contact me. I am also happy to share material with those interested in helping Labour, LibDems or UKIP to produce policy in this area because views on the actions needed do not appear to split along party lines.


February 2, 2015  1:49 PM

Access to superfast broadband has a return on investment 10 times that of HS2

Philip Virgo
Avonline, BDUK, HS2, NFU, Ofcom, PAC, satellite, superfast, transport, Westminster

In December 2014 the House of Commons Library produced a briefing for MPs which showed the current state of broadband availability by constituency and in January 2015 the National Audit Office produced an update report to aid the recent Public Accounts Committee review. Meanwhile the draft Ofcom Annual Plan indicates that the new regime (with Dame Patricia Hodgson as Chairman and Sharon White as CEO) is likely to take a rather more robust approach to measuring broadband performance (in every sense of the word) and competition, particularly with regard to services to business (large and small, urban as well as rural).

According to the analyses at the back of the House of Commons Briefing, “superfast”, i.e. over 30 Mbps, is available to only 31% of properties in the Cities of London and Westminster (putting the constituency 591st) while 32% have it in West Dorset (587th). BT has told Westminster Council and the Corporation of London that this is because Central London is “uneconomic” because of the high proportion of business properties. Hence the widespread welcome for the news that Ofcom is finally going to investigate the way that BT is protecting its shrinking revenues from business, including public sector users.
 
I am grateful to Patrick Cosgrave for drawing my attention to the comments of Richard Bacon (Con. Norfolk South and Vice Chairman of the Public Accounts Committee) during their recent hearing on Rural Broadband: “You’ve [i.e. BDUK] got £1.7bn to spend and your cost benefit ratio is 20. HS2 has £42.6bn (in fact more as this figure excludes the rolling stock) but its cost benefit ratio is only somewhere between 1.75 and 2. Your cost benefit ratio is 10 times more yet HS2 is getting 25 times more money. When I asked Philip Rutnam, Permanent Secretary for the Department for Transport, if they’d considered putting broadband everywhere (admittedly I had to ask him eight times) he finally said No, they hadn’t. The fact is they’ve got 25 times more money but your cbr is 10 times more than theirs. This is a policy issue and shouldn’t you [BDUK] be doing a better job negotiating more money from the Treasury?”
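For what it is worth, the arithmetic in that exchange is easy to check. The sketch below is purely illustrative and uses only the figures quoted above:

# Rough check of the figures quoted in the PAC exchange (budgets in £bn).
bduk_budget, bduk_cbr = 1.7, 20
hs2_budget = 42.6                      # excludes rolling stock, as Bacon noted
hs2_cbr_low, hs2_cbr_high = 1.75, 2.0

print(f"HS2 budget is {hs2_budget / bduk_budget:.0f} times BDUK's")  # ~25
print(f"BDUK's cost benefit ratio is {bduk_cbr / hs2_cbr_high:.0f} to "
      f"{bduk_cbr / hs2_cbr_low:.0f} times HS2's")                   # 10 to 11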
 
In his newsletter on Shropshire Broadband, Patrick summarised the following discussion as “Sue Owen of BDUK then agreed that this would be a topic of discussion in advance of the March budget. BDUK’s CEO, Chris Townsend, sitting beside her, smiled wryly at this but did not comment. As he had described with some enthusiasm earlier in the meeting that the solution for excluded rural areas will be satellite broadband, perhaps it should be a focus of broadband campaigns across the country to make it an election issue to press for a better deal now that this outcome is likely, and the comparison of pound for pound benefits between two major infrastructure projects is known.” I am also grateful to Patrick for his summary of the other points made during the discussion and for drawing my attention to a review by Mark Jackson which put the discussions into context – making this blog much easier.

Patrick was disappointed at the news that satellite provision is likely to be the method of meeting the 2 Mb guarantee by the end of 2015. Given the availability of deals like that between the NFU and Avonline for 15 – 22 mbps to farmers, I happen to believe that was a sensible and long overdue decision. I am told, however, that there are now issues with regard to service in areas where demand for satellite services threatens to exceed current capacity. The implication appears to be that this should be reserved for areas where there is no realistic alternative. I hope that the meeting being planned for later this month by the Digital Policy Alliance and All-Party Space Committee will provide an opportunity to brief those MPs and candidates for whom this topic is important – including on the need to support investment in increased satellite capacity.
 
Patrick’s summary of the other issues addressed during the PAC meeting included:    
*    large numbers of rural pockets completely left out
*    rural small businesses largely by-passed
*    all farmers having to be online by April this year
*    alleged cherry-picking by BT of areas that are most profitable for them
*    upload speeds rather neglected
*    allegations that BT is by-passing rural industrial estates because leased lines are more profitable.
*    still no competition in Phase 2 contracting despite assurances from BDUK last January that this would not be the case
*    BT not prepared to disclose all costs
*    local authorities not permitted by BT’s gagging clauses to compare costs
*    BT not overcharging so much for cost of green cabinets in Phase 2 (at least some good news) and £92m of savings have been identified to plough back into projects
*    no real prospect of BT working in partnership with alternative providers to create tailored solutions
*    BDUK can provide PAC details of exactly who will not be upgraded (we will pursue this via an FOI if necessary)
*    how/why did BDUK let BT get away with all this in the first place, leaving local authorities at the mercy of BT’s immensely superior commercial acumen?
*    is BT abusing its monopoly situation?

Patrick was particularly interested in the question of how many so-called “upgraded” premises will actually benefit from superfast (24 Mb+) broadband. The BT representative couldn’t provide that figure, but said he would take the question away with him and get back to the chair.

I am personally interested in how the estimates of the cost of national broadband rollout keep rising while the cost of building and upgrading networks to serve specific communities falls month on month – as competition grows from players using internet age technologies and architectures. Before criticising the original BDUK/BT cost estimates we should remember they were based on extending a network that was planned in the 1980s to bring broadcast quality video to the home by 2002 – before the changes in communications technology brought about with the rise of the mass market internet. Those changes are now accelerating as we move into a world where everything is inter-connected.

Today the largest component in network upgrade costs appears to be wayleave and access charges, followed by management overheads. Even civil engineering costs come a poor third. Hence my immediate focus on bringing property owners and network operators together to test the practicality of agreeing win-win shared solutions to take 80% (and sometimes even more) out of the cost of addressing notspots. This week we have the first meeting of what I hope will develop into a neutral umbrella for co-operation wherever players are more concerned with getting a share of the revenues from economic growth and job creation than defending the business models of the past. We know the problems we will face. Hence the strategy of beginning with those who stand to gain most from success, whether or not others choose to join or copy them. I have asked the Digital Policy Alliance to provide a neutral umbrella. Those wishing to participate who have not already received an invitation should join the DPA and state that this is one of their priority areas. My own contribution via the DPA advisory board is uncharged but I cannot ask them to do work which is not supported by paying members who want to achieve practical results.
 


January 28, 2015  11:12 AM

Will Companies House’s liability for damages help trigger a better approach to identity governance?

Philip Virgo
Amazon, BCS, Cabinet office, Data governance, Data protection, Google, privacy, Surveillance, Yahoo, Zoopla

We have much debate about information security and data protection but far more harm is caused because accurate information is not available when and where it is needed. The Cabinet Office “liability avoidance” approach to identity and authorisation policy has long been driven by the need to address long-standing quality problems: the proportion of department or agency records containing serious errors with regard to individuals or businesses is often in the range 10 – 25%. Now, at long last, the reason has been made public.

Public debate regarding legal, including corporate, identities has been complicated by a long-running (four years to the recent High Court judgement) case in which Companies House was being sued for having destroyed a healthy, century-old family business by promulgating a simple data error. The game is not yet over. There may yet be an appeal. The issue of “Crown Immunity”, inherent in the various EU Roman-Law-based Directives and Regulations concerning identity and data protection and sharing, will hopefully come into play – in the year of the 800th Anniversary of Magna Carta. The “immunity” of the “Data Barons” should also, however, come into play if we wish to avoid a “Peasants’ Revolt” as our current intellectual property regime loses public support and becomes unenforceable.

Hence the importance of the issues raised in my last blog when I suggested a need to bring the UK information governance regime (both public and private) under proper judicial oversight. That means accuracy and availability, not “just” surveillance and security. Most of the debate over the governance of medical records (for example) is to do with privacy but we should not forget far more practical harm is caused by coding errors that can destroy lives and careers, whether or not the patient survives the failure to provide accurate information to clinicians when and where it is needed.

The debate over “big” and “open” data, including liability for mis-use and error is, hopefully, about to take a new and more realistic turn, away from mythology, technology and the business models of those refining our digital footprints and associated misinformation and cybercrud into “the new oil”.

In the 800th anniversary year of Magna Carta we need to address the duties that the Data Barons, as well as the State, owe to the rest of us with regard to their use of our personal data, especially when they broadcast errors that can destroy lives and businesses, as Companies House did when they confused Taylor and Son Ltd with Taylor and Sons Ltd when supplying insolvency data to the credit reference agencies. In looking at the significance of this case we need to remember that the services of the credit reference companies are not only used to underpin and approve on-line business-to-business and business-to-consumer transactions. They are also used to help “authenticate” the on-line identities used whenever more than about $100 is at risk. Given that the use of such services is central to the approach behind the Government Verify programme, the issues also go to the heart of the Modernising Government Agenda – with all the implications for social inclusion (and the 20 – 30% of the population without a credible digital footprint or credit profile).

On the 9th February Computer Weekly is partnering the BCS and Tech UK in a “Big Digital Debate”. Given the growing public distrust of all things digital, I do hope that someone, perhaps the chairman (over to you, Bryan), will ask the speakers for their views on whether the Data Barons (from Amazon through Google to Yahoo and Zoopla) should take a greater legal responsibility to us all, if they wish to be able to enforce their related digital intellectual property rights. Similarly, should not “The Crown” (alias Whitehall) take explicit responsibility for the use of our information by its agencies and contractors if it expects us to deal with it on-line?

I deliberately pick content-related IPR enforcement as being the only part of a 21st Century Peasants’ Revolt that would scare the Data Barons of today. All other sanctions are broken or worthless in an inter-connected world. “The Crown” is, of course, more vulnerable to public opinion – at least for the next hundred days.


January 15, 2015  10:00 AM

How do we ensure that responses to Hebdo murders do not dishonour 800th Anniversary of Magna Carta?

Philip Virgo
Arndale, Bombing, CESG, EURIM-IPPR, freedom, GCHQ, Internet, IRA, Manchester

The would-be killers of Charlie Hebdo (click here for some of the cartoons from the resurrection) are in danger of achieving their objectives:

•    an unprecedented  propaganda coup (for such a tiny, ill-funded group), 
•    glorious martyrdom, (gunned down in a hail of bullets) and
•    a display of global hypocrisy leading to an exploitation of groupthink to justify a massive diversion of resources into expensive and ineffective displacement activities that could cripple the “real” enemy of the violent extremists – freedom of thought and expression.

I was troubled by calls for global “solidarity” under the banner “Je Suis Charlie”. I was rather more impressed by the thinking behind “Je ne suis pas Charlie: Je suis Ahmed”.
 
How do we “exile”, or at least expose and isolate, the heretics (of all religious strands) who genuinely believe they have a right (or even duty) to kill in the name of their selective interpretation of the word of God? Hence the reason for my Christmas Blog on The Amman Message. The hollowness of UK claims to be serious about “the war against terror” is exposed by the failure to insist that such a uniquely authoritative collective message is used by Ofsted as one of the yardsticks for reviewing the content of Muslim religious education in UK schools.

I have also long been concerned by the shallowness of debate about the meaning of the “Freedom of the Internet”. At school and university I was taught that “Freedom” is not an abstract. It has to be put into context: freedom from hunger, freedom from fear etc. In the on-line world we need to balance the “right” to free speech against freedom from the fear of trolling and abuse. We also need to give users genuine choice, not just the Hobson’s choice offered by the cartel of global players who run most of the on-line world: tracking our footprints to refine (big data engines) our habits into the new oil, to sell to advertisers but not to provide to law enforcement to help identify and “remove” those who abuse us.

I am not convinced by the calls for blanket powers to address the use of the internet by potential terrorists while we fail to make serious progress with reporting and intelligence sharing at almost any level with regard to almost any type of criminal behaviour on-line. Some years ago I was told that during the run-up to the bombing of the Arndale Centre in Manchester, Special Branch believed there was only one, inactive, IRA cell in Manchester. My informant told me that within 48 hours of the bombing, the local CID had been given details, including membership and meeting locations, of six active cells. The local criminals had no problem with the IRA bombing London: bombing locally was, however, very different.

Last March I chaired a small round table on intelligence sharing. A key point was that law enforcement, including the security services, were drowning in data that they did not have the resources, people or technology to analyse. The suggested way forward was to make it less impossible for “industry” to use its own “big data” processes to filter data to detect possible criminal behaviour (and other abuse that was in flagrant breach of their supposed terms and conditions) and pass over that which was relevant in formats that could be easily collated and used. That would, however, require addressing issues of governance and co-operation, including responsibilities, liabilities and priority setting, that people prefer to keep behind closed doors.

We also discussed the need to revisit the current convention whereby the defence can trawl through all of the prosecution’s evidence base. It might have been an important safeguard if we still had capital punishment but we do not, and such trawls are clearly being used to identify and silence potential witnesses and thus reinforce reigns of terror in inner-city sink estates dominated by organised crime and street gangs (whether religious or secular or the unholy alliances across the “social” boundaries between the two).

I was among those infuriated by the many falsehoods and inaccuracies in “The Imitation Game” but it did succinctly summarise the dilemma faced by those wishing to keep sources of intelligence secret and those wishing to take action. An illustration of the consequences of getting the balance wrong can be seen with regard to the failure to respond to the original calls for help when the extremists began their take-over of the Finsbury Park Mosque.

The final report of the EURIM-IPPR study into “Partnership Policing for the Information Society” looked at governance issues but failed to come up with recommendations that caught the imaginations of the participants. I revisited the subject in the context of the governance of “big data” (public or private) in the surveillance society when I reviewed the Labour Party Modernising Government study and its call for an ethical dimension.

I was, however, taken aback by how soon the issues would become topical. 

I would like therefore to repeat my call for their recommendation to “Create an ethical framework and governance for ethical issues around the interaction of the state, its citizens and corporations” to be responded to by the resurrection of the pre-1958 duties of Britain’s third most senior Judge, the Master of the Rolls, and his deputy, The Keeper of Public Records. It is an “obvious” way to provide authoritative, credible and (where necessary) public judicial oversight for the surveillance and information security activities of the state and also of the industry players with whom they should be co-operating to address abuse of all types, not just terrorism, over the Internet.

The current mish-mash of under-resourced Information, Interception, Surveillance Cameras and other “Commissioners” and regulators should then “report” to the “Master”. Meanwhile the “Keeper” should be able to draw on the resources of the National Audit Office, CESG and National Archive to enforce good practice in private, while providing the evidence for “public” enforcement by the Commissioners and/or Master should this prove necessary. Of course there is a lot of small print to address. But, in the face of threats from all those who think that our private information is their new oil (alias “big data”), what better way could there be to help counter the similar hypocrisy around the celebrations of the 800th anniversary of Magna Carta than to put the data and surveillance activities of the state and of its contractors back under clear judicial oversight and Common (not Roman) law?

We should also remember the proud boast of the last “Keeper of Public Records” that her forebears had provided the evidence to have three of their sponsoring ministers (Lord Chancellors who had fallen out of favour) executed. We no longer have the death penalty for anything, but a similarly rigorous approach to the enforcement of good information practice could also do more than any amount of blether to help rebuild trust in the UK as a globally trusted hub for information processing.

There are some interesting side effects.

If it were possible for the Master to also have effective oversight of the intra-UK surveillance operations of GCHQ and law enforcement, it should also be possible to streamline current controls which get in the way of the timely and effective use of intelligence. Many of those controls currently give the worst of all possible outcomes, getting in the way of rapid response (as during the 2011 riots when law enforcement lacked the processes to use the real-time intelligence streaming in on a voluntary basis from communications services) while failing to give public confidence because they are unknown.

The spirit behind Magna Carta gained force in the 17th Century, an age of mass immigration (Huguenots and others fleeing from persecution), mistrust, plots and riots. Sound familiar?  Hence the need to take its values seriously. “Je ne suis pas Charlie. Je suis Ahmed.” 


January 9, 2015  11:31 AM

Will 2015 be the year that politicians unite to deliver ethical & socially inclusive public services?

Philip Virgo
I have just read that the digitisation of government needs a major transformation effort

Which comes first?

The digitisation or the transformation?

It is over twenty years since Nobel Prize winner Arno Penzias (then running Bell Labs) told PITCOM that computerisation never made anyone redundant. It was the organisational changes that were enabled by computerisation that made whole functions, and their departments, redundant. So what have we actually learned since LEO 1 automated the production control of Lyons Bakery and LEO 2 automated the Army payroll? If this article is anything to go by … not a lot.

Last year I promised (while blogging on the problems with rural social inclusion as exemplified by the problems with using digital by default to identify farmers) to comment on the report of the exercise commissioned to help the Labour Party Digital Government review.
I have been taken to task before by Chi Onwurah MP for not declaring my political allegiances. I therefore remind readers that while I am Vice Chairman (Policy Studies) for the Conservative Technology Forum, I am also a levy-paying, albeit now retired, member of Unite and of the Co-op. The views in this blog are my own. They reflect neither left, right nor centre but that area where old (non-Marxist) Labour meets tribal (open, but not necessarily “free”, market) Tory, round the bike sheds at the back.

I particularly welcome the call for those looking at “Digital Government” to focus on social inclusion and ethical standards rather than simple cost saving, although I would have welcomed rather more on how to measure performance and to hold government to account with regard to both. I would also have preferred more focus on the objectives than the technology, although I was personally interested in much of the latter. I fear, however, that too many of those responsible for public sector IT systems, particularly those over-zealously outsourced by the last Labour government (such as under the National Plan for NHS IT and the many botched hospital PFIs) have mindsets akin to those who run the US Bureau of Indian Affairs.

I was therefore delighted to see the recent admission from Andy Burnham that Labour had taken the privatisation of the NHS too far. Perhaps he already knew that Circle was planning to withdraw from the running of Hinchingbrooke Hospital. I look forward to seeing further evidence that all main parties recognise that the outsourcing and offshoring of critical public sector functions, including the security of our personal information, has passed its zenith and that the time has come to rebuild the in-house systems skills of the public sector. That rebuilding needs to include the skills for end-users, not just IT “experts”, to use open source, interoperable and agile methodologies to support the integration and transformation of service delivery, under democratic control and open accountability.

Recommendation 31 of the Digital Government Review, five days’ training for all civil service staff during the next parliament to become digital champions, is far too modest and has far too low a priority. All civil servants should have the equivalent (including both off-the-job workshops and on-the-job distance learning) of at least ten days a year to help them do their jobs better. The goal should be to involve end-users and their managers in driving incremental change within inter-operability frameworks using IT “professionals”, “systems experts” and outside “consultants” in support roles only.

That has been the ostensible aim of those developing what we now call “agile” methodologies for over forty years. The time has come to take them at their word and adopt the necessary disciplines – while recognising why this is so hard in practice.

I strongly agree with the authors of the review that the goal should be better service for those in most need (my personal rephrasing of their social inclusion goals). Given the state of public finances (including the overhang of bloated PFIs and other rigidly wasteful outsourcing contracts) that will almost certainly have to be achieved by incremental change, on positive cash flow, using software as a service over shared network and cloud services to cut new system costs by 30% (and more) above the savings on those they replace. We can then argue whether the additional savings should be used to improve services to the growing number of elderly (including me!) or to cut taxes.

I found it difficult to work out which of the other 34 recommendations were there to help achieve objectives and which were there to address assumed constraints. Many appear to be very technical and capable of interpretation in a variety of ways, not all of them good professional practice. I did the programme management module on MSc06 (1971 – 3) at London Business School and subsequently ran the only one of Tony Benn‘s DTI tripartite industry strategy programmes to achieve its objectives (The Water Industry Computing Development Plan). I learned that if a programme has more than six priorities, it has none. Moreover only the top three really matter. Most supposed objectives, such as health and safety, equal opportunities and even timescales and budgets, are constraints, not objectives. My own experience has been that the biggest constraints are the skills and time available, not the funding. I have seen too many expensive fiascos resulting from politicians throwing consultants and contractors with the wrong experience, motivation and management at a problem because they think that mortgaging the future will provide a short cut to success. I would love to see a bipartisan agreement to follow good professional programme management practice. But pigs might fly.

Is the objective of Digital Government to deliver better and more socially inclusive automated services? Or is it to deliver better services, digital or otherwise, making use of technology to help human beings overcome organisational problems and resource constraints in addressing the needs of those in most need of help, support and/or treatment?

I suspect that some of the authors of the report did not recognise the tension between the two approaches. Others probably did, but could not agree how to reconcile the differences. I sympathise. Politicians, advisors and officials are subject to massive lobbying from armies of consultants and suppliers telling them that technology and outsourcing are the “answer” and the implementation should be contracted to them. Those currently at the top of most major suppliers to the public sector got there by winning such contracts, rather than working on their subsequent delivery. They now have grave difficulty in adjusting to the reality of a world where the public sector is not only broke but mortgaged to the hilt (PFIs and outsourcing deals).

The only realistic way forward is incremental change, using “agile” methodologies supported by low-cost mobile technologies accessing cloud-based services. But this has to be funded by cannibalising existing contracts to save 30% and more on current outsourcing costs: hence the desire of major suppliers, and their lobbyists, to delay change while they shrink their UK sales and support teams and adjust to a new world. Hence also the enthusiasm for complex studies to buy time.

I therefore applaud the focus on social inclusion, but would simplify it down to a requirement that public service delivery systems (whether digital or not) should be designed for access by those in most need, using carers they trust.

No large-scale roll-out should be committed unless and until the specification has been successfully tested on the target audience. The DWP Secretary of State’s insistence on this basic principle is what lies behind the delays with Universal Credit. This approach was alien to officials and suppliers, let alone the big management consultancies whose experts always know best. They insisted on cutting code and installing equipment under their existing, extended, contracts before they documented and tested the “pathfinders”. In other words, they ignored the reasons for Australia’s successful use of the Oracle methodology they were supposedly copying. Hence hundreds of £millions of waste and, more importantly, several years of unnecessary human suffering.

In the small print of the 2012 budget the Chancellor mandated that no new system should go live after 2014 unless “the responsible minister can demonstrate that they can themselves use the system successfully”. Thus George Eustice is personally involved with the three-week “agile” cycle to belatedly sort out the systems of the Rural Payments Agency. Similarly the Universal Credit systems cannot go live unless ministers can use them.

Hopefully the launch of the new Digital Accessibility Alliance will be followed by an extension of the policy, preferably in the pre-election budget with all-party support, to mandate the testing of all new systems with members of the target audience, not just the minister, before roll out is contracted.

I note the plans to budget large sums for teaching the “digitally excluded” how to use current technologies but regard this as a less effective use of limited government funds than training civil servants to take public service delivery back in-house and to work with local authorities and the voluntary sector to run “joined up people systems” that meet the needs of those most dependent on them. Almost the first of PITCOM’s activities was an exhibition of computer-based aids for the disabled, in the Upper Waiting Room. It was opened by Sir George Young when he was a junior Health Minister.

Over 30 years on and we are still failing to make effective use of the technology to help those who could and should benefit most. Barnardo’s and the Salvation Army are well ahead of the Government Digital Service in the sophistication of their use of IT to help them serve and protect (their levels of delivered security also put Government to shame) those in most need. Perhaps government should pay leading charities some of what it pays to the big consultancies for advice on how to better use IT to meet the needs of those in most need.
That leads me on to the “ethical” dimension, which Labour would entrust to Cabinet Office.

The charities are able to do so much more at lower cost and more securely by enlisting the hearts and minds of those who work with and for them. Meanwhile no-one, other than a handful of big data enthusiasts, trusts most Whitehall departments further than they can be thrown. That is not because of the lack of probity of individual civil servants but because of the constraints within which they operate, including rotation to a new role as soon as they begin to gain genuine experience and build trust.

So how should we handle the issues of trust with the delivery of public services?

And here I come to why I agree so strongly with the ethical objectives behind the review …

Back in 2008 I chaired the workshop on the ethics of public service delivery at a conference in Bled to do the groundwork for planning the EU Etica Project. I have blogged before on how and why the exercise petered out and revisited the arguments a year ago. We agreed a number of principles regarding the transition to e-Government and digital services.

These included:

•    The transition should never erode the quality of citizenship: it should enhance it or be neutral. It should be based on incentives rather than force …

•    The digital divide is largely socio/economic. People should not be penalised for not using on-line services if they do not wish to: many have access problems including the elderly and illiterate. The state has to create the opportunities and incentives to acquire e-literacy and should not set an impossibly high standard (e.g. use of browsers, security tools etc.)

•  E-Data should not be subsequently mis-used against the citizen, lost, mislaid, sold etc.

•  Citizens are accustomed to exercising digital choice and older people prefer to use mobiles (including alarm systems).

We then homed in on six recommendations:

1)    Governments have to be strongly encouraged to offer citizens online services via their choice of channel and of intermediary and these means have to be multilingual and secure.

2)    Research programmes should be encouraged to ensure that the technologies used for e-Government services are fit to be used by the majority of the citizens. Given that the majority of those dependent on such services are disabled, this requires a focus on mixing audio, text and particularly video-streaming technologies.

3)    Governments should use the e-participation technologies in order to gather views on the channels people would like to use, as well as on the concerns and priorities for services and to collect feedback on the quality and relevance of the services they receive.

4)    It is unethical for Governments to demand information from citizens that they cannot keep secure and confidential.

5)    There is a need for programmes to identify and demonstrate good practice for the secure sharing of data across organisational boundaries, including across national borders.

6)    There is a need for greatly improved gradations of choice under the control of the individual: with allowance for changes of time and circumstance as well as with whom the information is to be shared under what conditions – rather than simplistic one-off choices or defaults. This approach raises many questions as to who authorises or authenticates the choice, as well as of cost and practicality (a rough sketch of what such a record might look like follows below).
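A minimal sketch of what such a “gradated” consent record might look like is given below. It is purely illustrative: the field names, and the pairing of each recipient and purpose with conditions and an expiry date, are my own assumptions rather than anything proposed in the review or the 2008 workshop.

# Purely illustrative sketch of a "gradated" consent record: per-recipient,
# per-purpose permissions with an expiry date, instead of a one-off yes/no default.
# All field names are hypothetical, not taken from any published scheme.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ConsentGrant:
    recipient: str          # who may see the data (e.g. "GP practice")
    purpose: str            # what they may use it for
    expires: date           # consent lapses and must be renewed
    conditions: str = ""    # free-text caveats ("emergency access only")

@dataclass
class ConsentRecord:
    citizen_id: str
    grants: list[ConsentGrant] = field(default_factory=list)

    def allows(self, recipient: str, purpose: str, on: date) -> bool:
        """True only if an unexpired grant matches both recipient and purpose."""
        return any(g.recipient == recipient and g.purpose == purpose
                   and on <= g.expires for g in self.grants)

The point is simply that consent becomes a list of revocable, time-limited grants that the citizen can vary, rather than a single box ticked once.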

These might have appeared challenging to some but some of the eastern European countries, lacking the technology baggage of the West, had even then made serious progress. Today our smart phones offer all the necessary facilities at a fraction of their 2008 cost, so the research programmes are no longer needed. Neither do we really need mass training programmes to enable the final third to use technologies that are fast becoming obsolete. There is a good case for digital access centres (recommendation 5 in the Labour Digital Government Review) but it is more to do with people contact and support for complex distance learning packages.

The idea of using structured e-participation via mobiles to consult target audiences (making a reality of recommendation 15) via their choice of channels, and to organise “acceptability testing” before committing to the mass roll-out of on-line services, appears, however, to remain “as alien to IT suppliers and consultants as it is to government departments and policy advisors, who know that they know what is best – but for who is it best.”

There is a lot in the review on big data and open data – but who should police the governance of data collected by government and its agents under statutory powers? Confidence is crumbling in the wake of the revelations from Bradley Manning, Snowden and a growing flood of leaks from both public and private sectors. The current dialogue of the deaf with regard to surveillance powers is not helped by the inability of civil servants to do other than follow the departmental line once their minister has spoken, even if only to launch a consultation.

I rather like recommendation 34: the provision of a channel for civil servants to comment anonymously. But a channel to whom? The Select Committee on Public Administration attempted to find ways of using information on egregious behaviour provided by anonymous whistle-blowers in its ground-breaking report on public sector IT practice. But getting material to the Minister, other than via a trusted intermediary to his or her parliamentary office, remains harder than leaking it to the press. Passing it to the National Audit Office is similarly fraught. I nonetheless think this recommendation is well worthy of further, bi-partisan, thought.

The recommendation (14) to “Create an ethical framework and governance for ethical issues around the interaction of the state, its citizens and corporations” is excellent and does not just apply to digital technology. But once again, who should have this role? Until the Public Records Act 1958 it could be argued that this was the responsibility of Britain’s third most senior judge, the Master of the Rolls, and his deputy, the Keeper of Public Records. It now sits with a mish-mash of under-resourced Information, Surveillance and other “Commissioners”, plus some residual functions within the National Archive.

In the face of threats from all those who think that our private information is their new oil (“big data”), what better way of celebrating the 800th anniversary of Magna Carta than by putting its use by the state and its contractors back under clear judicial oversight and common (not Roman) law? That could also do more than any amount of blether to help rebuild trust in the UK as a globally trusted hub for information processing.

This review is already overlong for a blog and I have yet to start on the sections on the role of the Government Digital Service. Suffice it to say that I regard the GDS as a brave and overdue attempt to start rebuilding the “delivery” skills of central government after two decades of over-zealous outsourcing. But it has yet to demonstrate the skills to plan, design, procure or implement major change, as opposed to rationalising web sites and creating procurement frameworks through which little business flows. Most big departments and their suppliers are prolonging legacy contracts in the hope that it will go away after the next election.

Meanwhile the GDS is currently facing its first real baptism of fire: turning round the Rural Payments Agency using an “agile” approach to ensure that the on-line interfaces are usable by the target audience. If it succeeds it will have demonstrated the skills to be a worthy successor of the CCTA in supporting those departments and local authorities not large enough to be able to develop serious in-house skills. It would be good if it were to then develop into an executive agency capable of helping smaller departments take back in house that which should never have been outsourced. But whether it can ever realistically help meet the needs of big departments like the Home Office, HMRC or DWP is another matter.

The suggestion of devolving autonomy to consortia of smart cities and local authorities is rather more realistic than the idea that these should be supported from within the Cabinet Office. The best local authorities already outperform central government in terms of quality of service and price performance. It would be better to reduce the level of interference from Whitehall and allow them to grow their own co-operative consortia, building on the success of past exercises, such as those organised by SOCITM and the best of the former REIPs. We need to give them freedom to integrate the local delivery of services which are currently fragmented across the central government silos created by Lloyd George in 1917 and reinforced in the late 1940s in the second attempt to create a Whitehall-planned world fit for heroes.

We need to recognise that GDS is still a fledgling organisation with little more experience of delivering reliable services to the most vulnerable in society than the outsource suppliers who were going to transform the health service – and brought it to its knees.

It is more important that it is rebuilt to help support that which only government can do. One example is to support and promote the development of inter-operability standards at all levels: from data, product and service definitions, through communications protocols and security standards, to performance measures. We need a central repository for standards, maintaining files on which products and services adhere to which (and who certificated them), for use (inter alia) by a revitalised GDS to police their use across public sector procurement, including frameworks, to help ensure flexibility, inter-operability and compatibility in a world where both technology and demand are evolving in unpredictable ways.
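
To make the idea of such a repository more concrete, here is a minimal, purely illustrative Python sketch. The names are my own assumptions, not any existing GDS, NPL or standards-body system; it simply records which products claim conformance to which standards and who certificated the claim.

from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Standard:
    identifier: str      # e.g. "ISO/IEC 27001" (illustrative)
    scope: str           # data definition, protocol, security standard, performance measure ...

@dataclass(frozen=True)
class ConformanceClaim:
    product: str         # product or service claiming conformance
    standard_id: str
    certified_by: str    # who certificated the claim
    certified_on: date

class StandardsRegister:
    # Central repository of standards and the conformance claims made against them
    def __init__(self) -> None:
        self.standards: dict[str, Standard] = {}
        self.claims: list[ConformanceClaim] = []

    def add_standard(self, s: Standard) -> None:
        self.standards[s.identifier] = s

    def record_claim(self, c: ConformanceClaim) -> None:
        if c.standard_id not in self.standards:
            raise ValueError(f"Unknown standard: {c.standard_id}")
        self.claims.append(c)

    def products_conforming_to(self, standard_id: str) -> list[ConformanceClaim]:
        # Which products adhere to a given standard, and who certificated them
        return [c for c in self.claims if c.standard_id == standard_id]

Whoever ends up hosting such a register, the essential design choice is that every conformance claim carries the identity of its certifier, so that trust in the claim can be traced rather than assumed.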

Perhaps this role should be given to the National Physical Laboratory, which should be funded accordingly, including to provide and support low cost participation and access routes to international (not just UK and EU) standards bodies for the innovative small firms who are expected to provide 25% of future public procurements.

To conclude, I would like to see the Labour Digital review used as the starting point for a constructive bi-partisan dialogue on how to get the best from using technology to support ethical, socially inclusive public services. But I have no illusions over the challenge this poses to those who gained their current positions by climbing professional and technical (or political) drainpipes. For once I am rather pleased at being accused of being a journalist at heart!


December 25, 2014  10:14 AM

Christmas is a time for reconciliation. Please read the Amman Message

Philip Virgo Profile: Philip Virgo
Al-Qaeda, Caliphate, Internet, Koran, pope, recruitment, Secular, Sharia, Shia, Sunni, terrorists

Recently we have heard much blether about the use of the web by terrorists, including for publicity and recruitment. We hear much less about its use to promote reconciliation and peace. In this context I do suggest readers visit the website of the Amman Message and then think about what it means to them.

It contains the nearest the Muslim world has to a definition of contemporary Islam, as opposed to the odd collections of 19th and 20th Century make-believe peddled by Islamic State, Al Qaeda and others who have “lost their way” and compete in their use of social media to recruit disaffected teenagers and malcontents to join them.

The process that led to the Amman Message was as though the Pope, the Archbishop of Canterbury, the Moderator of the General Assembly of the Church of Scotland and the other leaders of the main Protestant communities had come together to define Christianity as a similarly tolerant and compassionate faith, in the face of threats from those calling for a violent crusade against not only heretics and non-believers but also those who disagreed with their selective and idiosyncratic interpretation of the Gospels, including the supposed Revelation of St John.

If you think about the way the Catholic Church and Church of England operate, or Article viii of the constitution of the Church of Scotland, let alone the way that other Protestant communities are organised, you will realise how difficult such a coming together would be for Christians. It was equally hard and unusual for the Muslim World. Which is why the "Message" should be much better known and publicised – so that laymen, not just the scholars for whom it was produced, can understand its implications.

Those who came together to produce the Amman Message should be viewed as the thought leaders of the Muslim World: leading teachers and scholars from all the mainstream Sunni and Shia religious and law schools. It is not a "simple" collection of prelates and judges setting the "rules". Even the Sharia is better seen as a set of collections of interpretations of the will of God rather than a western-style set of laws. This has the side effect of making it relatively easy for governments to introduce legal codes that are nominally "Sharia compliant" – provided they do not obviously deviate from the "path".

By contrast the leaders and spokesmen of Al Qaeda or Islamic State are commonly self-taught, with little or no formal scholastic education. Osama bin Laden studied economics and business administration. Abu Hamza studied engineering. And so on. Hence also the bizarre legal codes of the Islamic State Caliphate.

The authority of the Amman Message is, therefore, all the more impressive.    

So why is it not better known?

At that point it may be helpful to look at the politics of the Internet and the motives of those whose interests are served by stoking conflict between the Muslim communities, let alone between them and both Christianity (ancient, as in Syria and Iraq, as well as modern) and Judaism.

I am, however, a great fan of John Donne, not just his erotic poetry but his later sermons and "religious" writings – written at a time when Christians were busy persecuting each other in the name of the God of Love. One of my favourites begins "Kind pity chokes my spleen" and contains:

"To stand inquiring right, is not to stray;
To sleep, or run wrong, is. On a huge hill,
Cragged and steep, Truth stands, and he that will
Reach her, about must and about must go,
And what the hill’s suddenness resists, win so."

A very similar set of messages can be found in the Koran, saying, in effect, that those who claim to know with absolute certainty the will (not just the word) of God and its meaning for you are false prophets. You have to study for yourself. I therefore leave you to ponder how and why the voices of calm and reason are drowned out in the cacophony of the on-line world, with its lawyer-enforced, short-term, secular business models.

When it comes to calling on God in support of a "justified war", I personally find little difference between the teachings of St Augustine of Hippo and the messages about self-defence found in the Koran. Neither justifies what is happening in the Middle East today, other than acts of secular self-defence against men of violence.

My own summary of the careful and measured tones of the Amman Message is simple: “anyone who presumes to know the will of God and encourages others to persecute and kill in his name – is a heretic”.

However, I too have no religious training. I therefore urge you to read the three points of the message for yourself and then think, very hard, even if you do not feel able to pray to your version of God for guidance.

