In all the hubbub around mobile users increasingly making their own choices of operating systems and hardware, something has been lost from sight: it doesn’t really matter whether you bring your own device (BYOD). The more pressing question for businesses should be ‘where is our data accessed?’ (WODA).
This issue extends beyond the choice of mobile endpoint, as ‘mobile’ increasingly no longer means a single touchscreen tablet alternative to a fixed desktop PC, but multiple points or modes of access, with users flitting between them to use whichever is most appropriate (or to hand) at any given moment. What has become mobile is the point of access to the business process, not just the hardware.
This multiplicity of points of mobile access – some corporate owned, some not – means that when IT services are required on the move they are often best delivered ‘as a service’ from the network, so it is no wonder that the growth in acceptance of cloud seems to have symbiotically mirrored the growth of mobile.
Both pose a similar challenge to the embattled IT manager. A significant element of control has been taken away – essentially the steady operating platform ‘rug’ has been pulled from under their feet.
So how do they retain some balance and control?
The first thing is to accept that things have changed. BYOD is more than a short-lived fad; most people have embraced their inner nerd and now have an opinion about what technology they like to use, and what they don’t like. They buy it and use it as a fundamental part of their personal life from making social connections to paying utility bills. Most people are more productive if comfortable with familiar technology, so why force them to use something else?
However, enterprise data needs to be under enterprise control. Concerns about data are generally much higher than those surrounding applications and the devices themselves. This is a sensible, if accidental, prioritisation of how to deal with BYOD – first focus on corporate data. Unfortunately, few organisations have either a full document classification system or an approach to store mobile data in encrypted containers separated from the rest of the data and apps that will reside on BYO devices.
These are both worthy, if rarely reached at present, goals, but at least the first steps have been taken in recognising the problem. Organisations now need to understand their data a little better, and apply measured control of valuable data in the BYOD world – which shows no sign of diminishing any time soon.
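The ‘measured control’ described above can be made concrete as a simple policy lookup: classify the data first, then let the classification dictate the controls a device must meet. The sketch below is purely illustrative – the classification labels and control names are assumptions, not any organisation’s actual scheme:

```python
# Illustrative sketch: map document classification levels to the
# minimum controls required before the file may sync to a BYO device.
# Labels and control names are hypothetical examples.
CONTROLS = {
    "public":       set(),
    "internal":     {"device_pin"},
    "confidential": {"device_pin", "encrypted_container"},
    "restricted":   {"device_pin", "encrypted_container", "no_byod"},
}

def allowed_on_byod(classification: str, device_controls: set) -> bool:
    """Return True if a device meets the controls for this data class."""
    required = CONTROLS[classification]
    if "no_byod" in required:
        return False  # restricted data never leaves managed devices
    return required <= device_controls

# A personal tablet with a PIN but no encrypted container:
print(allowed_on_byod("internal", {"device_pin"}))      # True
print(allowed_on_byod("confidential", {"device_pin"}))  # False
```

The point of the sketch is the prioritisation the article describes: control follows the value of the data, rather than a blanket rule applied to every file.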
In the core infrastructure, things have changed significantly too. Service provision has evolved from the convergence (or, one could say, collision) of the IT industry with telecoms to deliver services on demand. IT may have been weak on interoperability and resilience standards, but some of the positive side of telecoms has spilled over, and telecoms companies are finally starting to understand the power of supporting a portfolio of applications – that there is more to communications than voice. Cloud, or the delivery of elements of IT as a service, is the active offspring of this coupling of IT and telecoms.
For businesses struggling to do more IT with smaller budgets and fewer resources, the incremental outsourcing of some IT demands into the cloud makes sense.
However, cloud is still exhibiting some traits of the rebellious teenager. While there are some regions in Europe that appear more resistant to cloud (notably, Italy, Spain and to a lesser extent France), overall acceptance is positive, although this is across a mix of hybrid, private and public cloud approaches. There are also significant concerns about the location of data centres and the location of registration or ownership of cloud storage companies.
These are understandable in the light of recent revelations, but to enforce heavy security on all data ‘just in case’ would be excessive and counterproductive. Thankfully, most companies seem to realise this, and there is a pragmatic mix of opinions as to how to best store and secure data held in the cloud.
This needs to be an informed decision, however, and just as with mobile, all organisations need to be taking a more forensic approach to their digital assets. IT needs to work hand in hand with the business to identify those assets and data that are most precious, assess the vulnerability and apply appropriate controls, differentiated from other things that are neither valuable nor private as far as the organisation is concerned. The days of blanket approaches to data security are over.
For more information and recent research into cloud and mobile security, download this free Quocirca report, “Neither here nor there”.
A little over a year ago, BMC was not looking good. It had a portfolio of good but tired technology and was failing to move with the times, and internal problems at various levels of the company were leading to high levels of employee churn.
Led by CEO Bob Beauchamp, BMC was taken off the stock market and into private ownership. Investors were chosen based on their long-term vision: what Beauchamp did not want was an approach of driving revenues hard and then cashing in rapidly.
This has freed up BMC to take a new marketing approach. New hires have been brought in. The portfolio is being rationalised. The focus is now on the user experience, with an understanding that mobility, hybrid private/public cloud systems and the business user are all important links in the new sales process. Substantially more money has been freed up to be invested in sales & marketing and research & development than was the case in its last year as a public company.
BMC’s first new offering aimed at demonstrating an understanding of these issues was MyIT – an end-user self-service system that provides a consumer-style front end with enterprise-grade back-end capabilities. MyIT has proved popular – and has galvanised BMC to take a similar approach across the rest of its product portfolio.
Help desk (or service desk, as BMC prefers to call it) has been a mainstay of BMC over the years. Its enterprise Remedy offering is the tool of choice in the Global 2000, but it was looking increasingly old-style in its over-dependence on screens of text; it was far too process-bound; and help desk agents and end users alike were beginning to question its overall efficacy in the light of new SaaS-based competition such as ServiceNow. At its recent BMC Engage event in Orlando, BMC launched Remedy with Smart IT, a far more modern approach to service desk operation. Enabling better reach at the front end through mobile devices and better integration at the back end through to hybrid cloud services, Remedy with Smart IT offers a far more intuitive and usable experience than was previously available from BMC, and is available as both an on-premise and a cloud-based offering.
BMC believes it already has a strong counter-offer to ServiceNow in the mid-maturity market with its Remedyforce product (a service desk offering that runs on Salesforce’s Salesforce1 cloud platform). The cloud-based version of Remedy with Smart IT, combined with MyIT, will provide a much more complete offering with a better experience for users, service desk staff and IT alike across the total service desk market.
Workload automation is another major area for BMC. Its Control-M suite of products has enabled automation of batch and other workloads from the mainframe through to distributed systems. However, this has been a set of highly technical products requiring IT staff with technical and scripting skills. Now, the aim is to enable greater usage by end users themselves, enabling business value to be more easily created.
All this is a journey for BMC – identifying and dealing with the needs of end users, and how automation can help, is something that changes with the underlying platform. For example, a hybrid platform requires more intelligence to identify where a workload should reside at any time (on private or public cloud, for example), and the promise of cloud in breaking down monolithic applications – creating composite applications built dynamically from required functions – needs contextual knowledge of how the various functions work together.
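The placement intelligence described above can be reduced, at its simplest, to a rule set over workload attributes. The sketch below is an illustration under invented assumptions – the inputs and rules are not BMC’s actual logic:

```python
# Illustrative placement decision for a hybrid platform: choose where a
# workload runs based on data sensitivity and current spare capacity.
# Inputs and rules are simplified assumptions for clarity.
def place_workload(sensitive_data: bool, private_free_cpus: int,
                   needed_cpus: int) -> str:
    if sensitive_data:
        return "private"      # keep regulated data on the private cloud
    if private_free_cpus >= needed_cpus:
        return "private"      # use owned capacity first
    return "public"           # otherwise burst to public cloud

print(place_workload(True, 0, 4))    # private
print(place_workload(False, 2, 4))   # public
```

A real engine would weigh many more dimensions (cost, latency, licensing, data residency), but the shape of the decision – policy first, capacity second – is the same.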
This needs deep integration with BMC’s products in its performance and availability group. Being able to identify where problems are and dig down rapidly to root cause and remediate issues requires systems that can work with the service desk systems and with workload automation to ensure that business continuity is well managed. Here BMC’s TrueSight Operations Management provides probable cause analysis based on advanced pattern matching and analytics, enabling far more proactive approaches to be taken to running an IT environment.
TrueSight also offers further value in that it is moving from being an IT tool to a business one. Through tying in the analytics capabilities of TrueSight into business processes and issues, dashboards can be created that show the direct business impact in cash terms for any existing or future problems, enabling the business to prioritise which issues should be focused on.
BMC has to work to deal with managing IT platforms both vertically at the stack level and horizontally at the hybrid cloud level. It has taken a little time for BMC to move effectively from being a physical IT management systems vendor to a hybrid physical/virtual one; now, via its Cloud and Data Centre Automation team, BMC is positioning itself to provide systems to both end-user and service provider organisations that are independent of any tie-in to hardware vendors, differentiating itself from the likes of IBM, HP and Dell (Dell is a long-term BMC partner anyway, although its acquisition of Quest and other management vendors has provided Dell with enough capability to go its own way should it so choose). At the same time, BMC still works closely with its data centre automation customers; it has recently published what it calls the Automation Passport, a best-practices methodology for using automation to transform the business value of IT.
BMC still has a strong mainframe capability, which differentiates it from many of the new SaaS-based players. Sure, not all organisations have a mainframe, but the capability to manage the mainframe as a peer system within the overall IT platform means that those with one have only BMC, CA and IBM to look to for such an all-embracing management system. IBM’s strength is its high-touch approach of putting together a system once it is on the customer’s site. BMC and CA have both been moving towards simpler messaging and portfolios, along with providing on-premise and cloud-based systems to give customers greater flexibility in how they deal with their IT platforms.
Overall, BMC seems to be turning itself around. The lack of financially driven quarterly targets has freed up Beauchamp and his team to take a far more strategic view of where the company needs to go. Product sales volumes are up, and customer satisfaction is solid. However, BMC has to continue at a suitable speed along this new journey – and also has to ensure that it gets its message out there far more forcefully than it is doing at the moment.
Branches are where the rubber still hits the road for many organisations; where retailers still do most of their selling, where much banking is still carried out and where health care is often dispensed. However, for IT managers, branches are outliers, where rogue activity is hard to curb; this means branches can become security and compliance black spots.
Branch employees may see fit to make their lives easier by informally adding to the local IT infrastructure, for example installing wireless access points purchased from the computer store next door. Whilst such activity could also happen at HQ, controls are likely to be more rigorous. What is needed is an ability to extend such controls to branches, monitoring network activity, scanning for security issues and detecting non-compliant activity before it has an impact.
A proposition from Boston, USA-based vendor Pwnie Express should improve branch network and security visibility. Founded in 2010, Pwnie Express has so far received $5.1 million in Series-A venture capital financing from Fairhaven Capital and the Vermont Seed Capital Fund. The name is a play on both the Pony Express, the 19th-century US mail system, and the Pwnie Awards, a competition run each year at the Black Hat conference to recognise the best discoverers of exploitable software bugs.
Pwnie Express’s core offering is to monitor IT activity in branches through the installation of plug-and-play in-branch network sensor hardware. These enable branch-level vulnerability management, asset discovery and penetration testing. As such, the sensors can also scan for wireless access points, which may have been installed by branch employees for convenience or even by a malicious outsider, and monitor the use of employee/visitor-owned personal devices.
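The rogue access point detection such sensors perform boils down to comparing what is observed over the air with an authorised inventory. The sketch below is a minimal illustration – the inventory format and scan data are assumptions, not Pwnie Express’s implementation:

```python
# Minimal rogue-AP check: flag any observed access point whose BSSID
# (MAC address) is not in the authorised inventory. In a real sensor
# the observations would come from a live wireless scan, not a literal.
AUTHORISED = {"aa:bb:cc:00:00:01", "aa:bb:cc:00:00:02"}

def find_rogue_aps(observed):
    """observed: iterable of (ssid, bssid) tuples from a scan."""
    return [(ssid, bssid) for ssid, bssid in observed
            if bssid.lower() not in AUTHORISED]

scan = [("CorpWiFi", "AA:BB:CC:00:00:01"),   # authorised
        ("FreeWiFi", "de:ad:be:ef:00:99")]   # unknown -> flagged
print(find_rogue_aps(scan))  # [('FreeWiFi', 'de:ad:be:ef:00:99')]
```

The value of a SaaS console such as Pwn Pulse is precisely that this comparison can be run continuously across hundreds of branches from one place, rather than device by device.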
To date, Pwnie monitoring has been on a one-to-one basis and so hard to scale. That has changed with the release of a new software-as-a-service (SaaS) based management platform called Pwn Pulse. This increases the number of locations that can be covered from a single console, allowing HQ-based IT management teams to extend full security testing to branches. Pwn Pulse also improves back-end integration with other security management tools and security information and event management (SIEM) systems, improving an organisation’s overall understanding of its IT security and compliance issues.
Currently 25 percent of Pwnie Express’s sales are via an expanding European reseller network, mainly in the UK. With data protection laws only likely to tighten in Europe in the coming years, Pwnie Express should provide visibility into the remote locations other security tools simply cannot reach.
SAP’s recent $8.3bn deal to acquire online travel and expense management vendor Concur can be read a few ways. The first, and most positive, is that it shows SAP continuing to broaden its appeal, diversifying from being “the ERP company”.
Another view is that SAP has had a few bites at the cloud cherry and mostly failed. Concur brings a massive cloud infrastructure with it, and SAP can make use of this in other ways.
A third, less charitable view is just that SAP has a large amount of money that it needs to be seen to do something with – and Concur was around at the right time and place.
Which one is most likely? I would plump for diversification, with a bit of cloud thrown in. SAP acquired SaaS-based human capital management vendor SuccessFactors in 2012, and it can be argued that Concur fits quite nicely into the same vein – both are SaaS, and both deal with managing employees. This takes SAP from being the ERP solution for a few to a provider of functions for everyone, becoming a far stickier and more embedded supplier that is even harder for an organisation to extricate itself from.
However, such a simplistic view hides many problems that could now face SAP as it integrates Concur. Travel and expense management is a complex area that only a few software vendors have managed to master. It is not a simple replacement for employees logging their expenses in Excel spreadsheets – it requires deep domain expertise in areas such as multinational tax laws, per-diem rules, how travel management companies (TMCs) operate, and how to interact with financial institutions at scale to manage company and personal credit cards in a secure and effective manner. Concur understands this in spades – but what impact will SAP have?
Sure, SAP understands the first part of this: ERP has had to deal with multinational currencies and tax laws for some time. The rest, though, is new territory for SAP.
Not only are the basics of expense management a difficult area, but Concur has been pushing the boundaries of what it does. In the US, for example, it has deals for integrated taxi cab expense management, where the employee uses their mobile phone to identify a nearby cab and hail it electronically, then pays the driver via the phone, with the expense directly integrated into expense claims. Other ongoing work has looked at automating a traveller’s whole trip from booking through travel and stay, with capabilities such as near-field communication (NFC) for checking into hotels without visiting the front desk, and mobile phones acting as electronic keys to unlock the hotel room door. Such work requires a certain mindset and understanding of the travel and entertainment expense world – and the investment of large amounts of money.
Also, with Concur’s 2011 acquisition of travel details management vendor TripIt, SAP finds itself with a more consumer-oriented product: taking it well out of its comfort zone.
It leaves SAP with a couple of choices. The first is to leave Concur pretty much as a separate entity, trying to keep all its existing staff and domain expertise to continue focusing on what Concur has been calling “the perfect trip” experience. SAP can provide Concur with the deeper pockets to continue work towards the perfect trip – but is SAP up to understanding this and achieving any payback on such investment?
Customers now find themselves with the unfortunate impact of moving from dealing with a small but fleet-of-foot and interesting supplier to a rather staid and enterprise-focused behemoth. I believe this will raise flags for many: those who have been dealing with Concur (travel and expense management professionals) are unlikely to be the same people in a company who have been dealing with SAP, and many companies will have ruled out SAP for other functions such as ERP and CRM, opting instead for the likes of Oracle or Microsoft. Dealing with SAP may then be seen as the thin end of the wedge, with rapacious SAP salespeople trying to usurp the incumbent ERP and CRM vendors.
As with most acquisitions, the SAP/Concur deal will raise worries in many existing customers’ minds, and will open up opportunities for Concur’s competitors. As stated earlier, the market is not exactly flush with companies that understand travel and expense management well and have software that addresses all requirements. For companies such as KDS and Infor, the SAP/Concur deal must be seen as opening up opportunities.
For Concur’s existing customers, I would advise caution. The two companies’ views of the world are not the same – watch how SAP manages the acquisition, and watch how many staff move on from Concur to join its competitors. If it becomes apparent that SAP is trying to force Concur into the SAP mould, it may be time to look elsewhere.
Ricoh recently held its first industry analyst summit in Tokyo. The event focused on communicating Ricoh’s focus on its services-led business transformation through its 18th Mid-Term Plan.
Ricoh is in the midst of a transformation, actively streamlining its company structure to accelerate growth across a number of markets. Like many traditional print hardware companies, it is shifting its focus to services, primarily around what it calls “workstyle innovation”. Over the past few years, Ricoh has repositioned itself as a services-led organisation – and has greatly enhanced its marketing communications and web presence to shift perception towards a company that can support a business’s transformation in today’s evolving and mobile workplace. Ricoh’s target is to grow services revenue globally by 30% in three years, by enhancing its core business as well as expanding its presence in new markets.
Core business enhancement
Ricoh’s core business revolves around office printing, where it has carved out a strong strategy around managed document services (MDS). This established approach has enabled enterprises to tackle the escalating costs associated with an unmanaged print infrastructure. Ricoh has extended this model to encompass all document-centric processes and is effectively increasing its presence in the market on a global basis. In Quocirca’s recent review of the MPS landscape, it is positioned as a global market leader – testament to its global scale, unified service and delivery infrastructure and effective approach to business process automation.
Ricoh’s 18th mid-term plan relates to five key business areas. Its primary business, the office business market, encompasses both hardware technology and services such as MDS, business process services (BPS), IT services and Visual Communication. Ricoh also operates in the consumer market (as seen in its new THETA 360 camera, a range of projectors and an electronic white board product); the industrial business market (optic devices, thermal media and inkjet heads), commercial printing (production printers) and new business, which includes additive manufacturing. Ricoh plans a full-scale entry into commercial printing and intends to expand its growth in the industrial market by 50% in the next three years.
Ricoh announced eight new service lines:
- Managed Document Services – leveraging Ricoh’s five-step adaptive model to help organisations optimise document-centric processes.
- Production Printing Services – a portfolio of integrated services to complement Ricoh’s hardware and solution portfolio for in-house corporate printing, graphic arts and commercial printing.
- Business Process Services – streamlining business processes such as human resources, finance and accounting, and front-office outsourcing services such as contact centre services.
- Application Services – integration of applications, such as insurance claims processing services.
- Sustainability Management Services – services to reduce environmental impact, such as electricity and paper use, for Ricoh and non-Ricoh devices.
- Communication Services – development, deployment and integration of unified communication solutions, including communication/collaboration offerings such as video conferencing, interactive whiteboards, digital signage and virtual help desk.
- Workplace Services – services to maximise the efficiency of the workplace and the effectiveness of the workforce, including optimised use of space, smart use of technology and automation of certain office functions.
- IT Infrastructure Services – consulting, design, supply and implementation of IT infrastructure, as well as support and management of the full IT infrastructure through remote and on-site support.
The most attention was given to Ricoh’s IT services portfolio, which varies by region. Ricoh has made a number of IT services acquisitions across several regions and is seeing strong success in Asia Pacific, Europe and the US. In the US, the acquisition of mindSHIFT is enabling Ricoh to target small and medium-sized businesses. If Ricoh can articulate a strong proposition around IT services, this could be a key differentiator from its traditional competitors over the coming year. However, Ricoh is now operating in a wider IT services market, and its penetration may be limited to its existing customer base looking to extend existing MDS engagements to the IT infrastructure.
Ricoh is working on a range of technologies around what it calls the infinite network (TIN) where all people and things will be connected all the time. This is Ricoh’s view of the internet of things (IoT) and also embraces Ricoh’s vision of the need to connect to a rapidly increasing set of sensors in the environment.
Ricoh R&D discussed a range of differentiated technology platforms which aim to address multiple markets, enabling the business units and operating companies to go to market with highly differentiated solutions for the office and for specific large verticals. This includes communication and collaboration, visual search and recognition, digital signage and hetero-integration photonics (optics and image processing).
Perhaps the most relevant to the print industry is its mobile visual search technology, which provides an interactive dimension to the printed page. A simple snap of an image can provide access to digital content such as text, video, purchase options and social networks. Ricoh has commercialised this through its Clickable Paper product. Based on digital layers, it enables consumers to hover their mobile phone over a magazine advert, for example, to bring up a video or a link to a website. Ricoh demonstrated an example from Mazda, which is using the technology in its brochures.
This technology promises to breathe new life into print by connecting it to the digital world. The market is rapidly evolving, and Ricoh is competing with a range of interactive print/augmented reality vendors in this space. The only other printer vendor to offer something similar is HP, with its Aurasma technology, which has been available for a number of years.
Ricoh, like its traditional print competitors, needs to drive a dramatic shift to a services business model – its long-term relevance depends on it. Alongside its cohesive set of new service offerings, it already has a relatively mature set of business process services across areas such as e-invoicing, healthcare and loan applications. Quocirca believes these should be a priority as Ricoh takes its services strategy forward.
Indeed, Ricoh has already made strong inroads with its MDS strategy. To drive deeper engagements with larger enterprises, it needs to further articulate a strong vision around business process automation, where it faces strong competition from Lexmark and Xerox.
Ricoh illustrated that it is innovating across a number of markets, showing commitment to expanding its presence in non-core areas. Overall, Ricoh is taking the right direction to change perceptions of its brand and develop broader services capabilities. It certainly has a broad array of services, but it is now competing in many new markets and should focus on building its credibility in a few core areas while partnering with best-of-breed providers in others.
Some of the less conventional products, such as Clickable Paper, need to be positioned carefully. Ricoh will need both to keep pace with improvements in the technology and the increasing use of wearable technology, and to recognise when such ephemeral approaches have run their course – and pull out of providing offerings in the space accordingly.
Content sync and share systems are available from many players – Dropbox, Box and Microsoft OneDrive are just a few of the many options for those who want to be able to access their files from anywhere via the cloud.
However, the ubiquity of systems and the lack of adequate monetisation at the consumer level is making this a difficult market in which to make a profit. Each of the vendors now has to make a better commercial play – and this may mean establishing product differentiators.
The first step has been to offer enterprise content sync and share (ECSS), where central administrators can control who has access while individuals work cohesively in teams and groups. Though the likes of Huddle and Accellion were early leaders (and still differentiate by offering on-premise and hybrid systems), Box, Dropbox and Microsoft are all busy moving into the same space, and all that is happening is that the baseline of functionality keeps rising – differentiation remains difficult.
The trick is in making any ECSS tool completely central to how people work – making it a platform rather than a tool. At a basic level, this means making any file activity operate via the ECSS system, rather than through the access device’s file system. File open and save actions must go directly via the cloud – this basic approach makes the tool far more central to the user.
However, this too is likely to commoditise rapidly. Vendors need to do more – which requires far more from the system. For example, increasing intelligence around the content of files can enable greater functionality. Indexing documents allows full search – but this in itself is no better than many of the desktop search tools currently available. Applying metadata to files based on parsing their content starts to add real value – and makes each ECSS system different.
At the recent BoxWorks 2014 event in San Francisco, some pointers to a possible direction were given. The basic workflow engine built into Box is being improved: actions will be taken based not only on workflow rules but also on content. For example, data loss prevention (DLP) can be implemented via Box by checking a document’s content against rules and preventing it from being moved from one area to another, or from being shared with users who do not meet certain criteria.
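Content-based DLP of this kind can be illustrated as a simple rule check before a share is permitted. The patterns below are deliberately simplified examples for illustration only, not Box’s actual detectors:

```python
import re

# Illustrative content-based DLP rule: block sharing when a document
# matches patterns for sensitive data. Patterns are toy examples,
# not production-grade detectors.
SENSITIVE = [
    re.compile(r"\b\d{16}\b"),            # naive card-number pattern
    re.compile(r"(?i)\bconfidential\b"),  # classification marker
]

def sharing_allowed(text: str) -> bool:
    """Return True only if no sensitive pattern matches the content."""
    return not any(p.search(text) for p in SENSITIVE)

print(sharing_allowed("Meeting notes for Tuesday"))    # True
print(sharing_allowed("CONFIDENTIAL: draft results"))  # False
```

Real detectors add checksum validation, context windows and user/destination criteria on top, but the principle – inspect content, then gate the action – is the same.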
Alerts can be set up – maybe a product invoice needs paying on a certain date, or a review needs to be carried out by a specific person by a certain date. Based on the workflow engine, these events can be identified and processes triggered to enable further actions to be taken.
By managing metadata correctly, ECSS tools can start to become enterprise document (or information) management systems, and even move towards full intellectual property management systems. Maintaining versioning with unchangeable creation and change-date fields provides the capability to create full audit chains that are legally presentable should the need arise. The metadata can also be used to demonstrate compliance with the numerous information and information security laws and industry standards out there, such as HIPAA and ISO 27001 or 17799. Through such means, the ECSS system becomes an enterprise investment, not just a file management tool.
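One way such a tamper-evident audit chain can be built is by hash-chaining version records, so that any retrospective change breaks verification. This is a generic sketch under assumed field names, not any vendor’s schema:

```python
import hashlib

# Sketch of a tamper-evident version chain: each version's record hash
# incorporates the previous hash, so editing an old version breaks
# every hash after it. Field names are illustrative assumptions.
def record_hash(prev_hash: str, author: str, timestamp: str,
                content: bytes) -> str:
    h = hashlib.sha256()
    h.update(prev_hash.encode())
    h.update(author.encode())
    h.update(timestamp.encode())
    h.update(content)
    return h.hexdigest()

def verify_chain(versions) -> bool:
    """versions: list of dicts with author, timestamp, content, hash."""
    prev = ""
    for v in versions:
        if record_hash(prev, v["author"], v["timestamp"],
                       v["content"]) != v["hash"]:
            return False
        prev = v["hash"]
    return True
```

In practice the dates and authors would come from the ECSS system’s own immutable fields; the chaining simply makes after-the-fact edits detectable, which is what gives the audit trail its legal weight.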
To make the most of this, though, requires an active and intelligent partner channel. The channel can bring in the required domain expertise, whether this be horizontally across areas such as data protection, or vertically in areas such as pharmaceutical, finance or oil and gas. Box has pulled in partners such as Accenture and Tech Mahindra to help in these areas.
Partners need to be able to access the platform to add their own capabilities, though. This requires a highly functional application programming interface (API) to be in place. This is an area where Box has put in a great deal of work, and the API is the central means for enabling existing systems to interface and integrate into the Box environment and vice versa.
Box has a strong roadmap for adding extra capabilities. It needs to get this out into the user domain as soon as possible in order to show prospective users how it will be differentiating itself from its immediate competitors of Dropbox, Microsoft and Google, whose marketing dollars far outstrip Box’s.
Box is in the middle of that dangerous place for companies – up until now, it has been small enough to be very responsive to its customers’ needs. In the future, it needs to have a more controllable code base with fewer changes – which could make it appear to be slower in adapting to the market’s needs. By building out its platform and creating an abstracted layer in how its platform works through using its own APIs, Box can create an environment where the basic engine can be stabilised and extra functionality layered on top without impacting the core. Through this, Box is setting up a disciplined approach for its own engineering to use its platform for innovation as it wants its partners and customers to do so. Provided it sticks to this path, it should be able to maintain a good level of responsiveness to market needs.
Box and its partners need to build more applications on top of the ECSS engine, creating a solid and active ecosystem of information-centric applications with distinct enterprise value. Box also needs to do a better job of showing how well it integrates with existing content management systems, such as Documentum and OpenText, to create even more value – democratising information management away from the few to the whole of an organisation and its consultants, contractors, suppliers and customers.
It is likely that over the next year, some of the existing file sync and share vendors will cease to exist as they fail to adapt to the movement of others in their markets. Box has the capacity and capabilities to continue to be a major player in the market – it just needs to make sure that it focuses hard on the enterprise needs and plays the platform card well.
High rainfall over the last week has led to flooding. Last night, there was a large number of burglaries. An escape of toxic gases this morning has led to emergency services requesting everyone to evacuate their premises. There are billions of barrels of crude oil that can be recovered over the next decade.
Notice something missing with all of this? They may all be factually correct; they all discuss time – and yet they are all pretty useless to the reader due to one small thing missing – the “where?” aspect.
For example, I live in Reading in the UK – if the flooding is taking place in Australia, it may be sad, but I do not need to take any steps myself to avoid the floods. If the burglaries are close to my house, I may want to review my security measures. Likewise, if there is a cloud of toxic gas coming my way, I may want to head for the hills. An oil and gas company is not going to spend billions of dollars in digging holes in the hope of finding oil – they need to have a good idea of where to drill in the first place.
And the examples go on – retail stores figuring out where and when to build their next outlet; utility companies making sure that they do not dig through another company’s services; organisations with complex supply chains needing to ensure that the right goods get to the right place at the right time; public sector bodies needing to pull together trends in needs across broad swathes of citizens across different areas of the country.
The need for accurate and granular geo-specific data that can add distinct value to existing data sets has never been higher. As the internet of everything (IoE) becomes more of a reality, this will only become a more pressing issue. The next major battleground will be the capability to overlay geo-specific data from fixed and mobile monitors and devices onto other data services, creating the views required by different groups of people in the organisation in order to add extra value.
I was discussing all of this with one of the major players in the geographic information systems (GIS) market, Esri. Esri has spent many years and a lot of money in building up its skills in understanding how geographic data and other data sets need to work contextually together. Through using a concept of layers, specific data can be applied as required to other data, whether this be internal data from the organisation’s own applications, data sets supplied by Esri and its partners, or external data sets from other sources.
The problem for vendors such as Esri, though, is the market’s simplistic perception of location awareness. Vendors such as Esri and MapInfo, along with content providers including the Ordnance Survey and Experian, are perceived as purely mapping players – maybe as a Google Maps on steroids. This underplays the actual value these vendors can deliver – and stops many organisations from digging deeper into what can be provided.
For example, the end result of a geolocation analysis may not be a visual map at all. Take the insurance industry. You provide an insurer with your postcode; it pulls in a load of other data sets covering crime in your area, likelihood of flooding, the number of claims already made by neighbours and the possibility of fraud, and out pops a number – say, £100 for a low-risk insurance prospect, £5,000 for a high-risk one. Neither the insurance agent nor the customer has seen any map, yet everything was dependent on a full understanding of the geographical “fix” of the data point, and each layer of data only had that point in common. Sure, time would also need to be taken into account – this makes it two fixed points, which can be analysed to reach an informed and more accurate decision.
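The insurance example can be sketched in a few lines. The layer values, postcodes and weighting below are invented purely for illustration – real actuarial models are far richer – but they show how several data layers share nothing except the geographic fix:

```python
# Hypothetical risk layers, each keyed by postcode (the shared geographic fix).
# Scores run from 0.0 (low risk) to 1.0 (high risk); all values are made up.
CRIME  = {"RG1 1AA": 0.8, "RG10 9ZZ": 0.10}
FLOOD  = {"RG1 1AA": 0.6, "RG10 9ZZ": 0.05}
CLAIMS = {"RG1 1AA": 0.7, "RG10 9ZZ": 0.20}

def premium(postcode: str, base: float = 100.0, ceiling: float = 5000.0) -> float:
    """Combine the layers at one geographic fix into a single number.
    Unknown postcodes default to a middling 0.5 on each layer."""
    risk = (CRIME.get(postcode, 0.5)
            + FLOOD.get(postcode, 0.5)
            + CLAIMS.get(postcode, 0.5)) / 3
    return round(base + risk * (ceiling - base), 2)
```

No map is ever drawn; the postcode simply acts as the join key across otherwise unrelated data sets.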
The key for the GIS players now is to position themselves far more as a big data play, so that prospects see them as a more central part of their needs. Esri seems to understand this – it has connectors and integrations into most other systems, such that other data sources can be easily used, and so that other business intelligence front ends can be used if a customer so wishes.
So, what’s the future for GIS? In itself, probably just “more of the same”. As part of a big data/internet of everything approach, geographic data will be one of the major underpinnings for the majority of systems. When combined with time, place helps to provide a fixed point of context around which variable data can be better understood. It is likely that the GIS vendors will morph into being bundled with the big data analytic players – but as ones with requisite domain expertise in specific verticals.
As the saying goes, there is a time and place for everything: when it comes to a full big data analytics approach, for everything, there is a time and place. Or, as a colleague said, maybe the time for a better understanding of the importance of place is now.
There has been plenty of talk about the threat of cyber-attacks on critical national infrastructure (CNI). So what’s the risk, what’s involved in protecting CNI and why, to date, do attacks seem to have been limited?
CNI is the utility infrastructure that we all rely on day-to-day; national networks such as electricity grids, water supply systems and rail tracks. Others have an international aspect too; for example, gas pipelines are often fed by cross-border suppliers. In the past such infrastructure has often been owned by governments, but much has now been privatised.
Some CNI has never been in government hands: mobile phone and broadband networks have largely emerged since the telco monopolies were scrapped in the 1980s. The supply chains of major supermarkets have always been a private matter, but they are very reliant on road networks, an area of CNI still largely in government hands.
The working fabric of a CNI is always a network of some sort – pipes, copper wires, supply chains, rails, roads – and keeping it all running requires network communications. Before the widespread use of the internet this was achieved through proprietary, dedicated and largely isolated networks. Many of these are still in place. However, the problem is that they have increasingly become linked to and/or enriched by internet communications. This makes CNIs part of the nebulous thing we call cyberspace, which is predicted to grow further and faster with the rise of the internet of things (IoT).
Who would want to attack CNI? Perhaps terrorists, although some point out that it is not really their modus operandi – regional power cuts are less spectacular than flying planes into buildings. CNI could also become a target in nation-state conflicts, perhaps through a surreptitious attack where there is no kinetic engagement (a euphemism for direct military conflict). Some say this is already happening; for example, the Stuxnet malware that targeted Iranian nuclear facilities.
Then there is cybercrime. Poorly protected CNI devices may be used to gain entry to computer networks of more value to criminals. In some cases, devices could be recruited to botnets – again, this is already thought to have happened with IoT devices. Others may be direct targets, for example tampering with electricity meters or stealing data from the point-of-sale (PoS) devices that are the ultimate front end of many retail supply chains.
Who is ultimately responsible for CNI security? Should it be governments? After all, many of us own the homes we live in, but we expect government to run defence forces to protect our property from foreign invaders. Government also passes down security legislation, for example at airports and other mandates are emerging with regards to CNI. However, at the end of the day it is in the interests of CNI providers to protect their own networks, for commercial reasons as well as in the interests of security. So, what can be done?
One answer is, of course, CNI network isolation. However, this is simply not practical: laying private communications networks is expensive, and innovations like smart metering are only viable because existing communications technology standards and networks can be used. Better security can also be built into CNIs in the first place, but this will take time – many have essential components that were installed decades ago.
A starting point would be better visibility of the overall network, along with the ability to collect inputs from devices and record events occurring across CNI networks. If this sounds like a kind of SIEM (security information and event management) system, along the lines of those provided for IT networks by LogRhythm, HP, McAfee, IBM and others, that is because it is – a mega-SIEM for the huge scale of CNI networks. This is the vision behind ViaSat’s Critical Infrastructure Protection service, which ViaSat is now extending from the USA to Europe.
The service involves installing monitors and sensors across CNI networks, setting baselines for known normal operations and looking for the absence of the usual and the presence of the unusual. ViaSat can manage the service for its customers out of its own security operations centre (SOC) or provide customers with their own management tools. Sensors are interconnected across an encrypted IP fabric, which allows for secure transmission of results and commands to and from the SOC. Where possible the CNI’s own fabric is used for communications, but if necessary this can be supplemented with internet communications; in other words, the internet can be recruited to help protect CNI as well as attack it.
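That “absence of the usual, presence of the unusual” idea boils down to baselining. Here is a toy sketch of the principle – the three-sigma band and the sample readings are assumptions for illustration, not ViaSat’s actual method:

```python
from statistics import mean, stdev

def baseline(readings: list[float]) -> tuple[float, float]:
    """Learn a simple normal-operation band from historical sensor readings,
    here mean plus/minus three standard deviations."""
    m, s = mean(readings), stdev(readings)
    return m - 3 * s, m + 3 * s

def unusual(value: float, band: tuple[float, float]) -> bool:
    """Flag the presence of the unusual: a reading outside the learned band."""
    lo, hi = band
    return not (lo <= value <= hi)

# Illustrative historical readings from one sensor during normal operation
history = [50.0, 50.2, 49.9, 50.1, 49.8, 50.0, 50.3, 49.7]
band = baseline(history)
```

A real deployment would also watch for the absence of expected events (a sensor going quiet), but the core pattern – learn normal, alert on deviation – is the same.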
Having better visibility of any network not only helps improve security, but also enables other improvements to be made through better operational intelligence. ViaSat says it is already doing this for its customers. The story sounds similar to one told in a recent Quocirca research report, Masters of Machines, which was sponsored by Splunk. Splunk’s background is SIEM and IT operational intelligence, which, as the report shows, is increasingly being used to provide better commercial insight into IT-driven business processes.
As it happens, ViaSat already uses Splunk as a component of its SOC architecture. However, Splunk has ambitions in the CNI space too – some of its customers are already using its products to monitor and report on industrial systems. Some co-opetition will surely be a good thing as the owners of CNIs seek to run and secure them better, for the benefit of their customers and in the interests of national security.