Quocirca Insights


August 11, 2017  9:51 AM

The emergence of a new data-centric management vendor

Clive Longbottom Profile: Clive Longbottom

It doesn't seem that long ago that there were three main focuses in data security:

– Hardware/application security

– Database security

– Document management security

Each had its own focus; each had its own problems.  Layering the three approaches together often left gaping holes through which those with malicious intent could drive a coach and horses.

However, we are now seeing a new type of vendor coming through: vendors more in line with what Quocirca has for some time termed a 'compliance oriented architecture' (COA).

The focus here is to pay less attention to the systems that create or store the data and information, and instead to focus on the data itself and how it flows directly across a set of constituent parties.

For example, if a company depends on application security and that security is compromised, the malicious individual is now within the ‘walled garden’.  Unless identified and locked out, those breaking in with sufficient privilege are free to roam amongst the data held within that application.  The same with database security: break through the onion-skin of that security and there is all the data for the malicious individual to play with.

Document management systems that are solely dependent on the use of underlying databases for storing the documents as binary large objects (BLObs) often combine both approaches: they have user/role policies combined with database security – still not a very strong approach.

Instead, if the data is captured at the point of creation and actions are taken from that point on, security can be applied at a far more granular and successful level – the attack vectors for malicious intent are minimised.  Combine this with total control of how data is accessed – via any means, such as a piece of client software, an application-to-application call or a direct SQL/API call – and a different approach to security across a distributed platform with different end users over an extended value chain becomes possible.

One vendor in this space that Quocirca has been talking with is Edgewise Networks. Still operating in beta mode with a select group of customers, Edgewise takes an approach of applying direct security over the connections between different aspects of the overall system. For example, it identifies connections from devices to applications, and from application to application or service to service. In the case of a database, it can see that the dependent application needs to access it, and so can allow the connection. However, should there be an attempt to access the data via any other means – another application, a direct SQL call or whatever – it can block this. It logs all of the information, enabling forensic investigation of what has been going on as well.
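
To make this concrete, here is a minimal sketch in Python of connection-level enforcement: an allowlist of known service-to-service pairs, with every other connection blocked and logged for forensic review. The service names and policy structure are hypothetical illustrations, not Edgewise's actual (and unpublished) implementation.

```python
# Illustrative sketch of connection-centric security: only pre-approved
# service-to-service connections are allowed; everything else is blocked
# and logged. Names and structures are hypothetical.
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("connection-policy")

# Allowlist of (source, destination) pairs declared or learned in advance.
ALLOWED_CONNECTIONS = {
    ("web-frontend", "order-app"),
    ("order-app", "order-db"),  # the dependent application may reach its database
}

def authorise(source: str, destination: str) -> bool:
    """Allow the connection only if this exact pair is on the allowlist."""
    allowed = (source, destination) in ALLOWED_CONNECTIONS
    log.info("%s %s -> %s at %s", "ALLOW" if allowed else "BLOCK",
             source, destination, datetime.now(timezone.utc).isoformat())
    return allowed

# The approved application path succeeds...
assert authorise("order-app", "order-db")
# ...but a direct SQL connection from any other source is blocked and logged.
assert not authorise("analyst-laptop", "order-db")
```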

EnterpriseWeb is another company with a compelling approach. It offers an application platform that supports the modeling of complex distributed domains and the composition of dataflow processes. It is fully dynamic, processing functional and non-functional concerns in real-time based on live metadata and real-time state. This means that EnterpriseWeb enforces security, identity and access policies per interaction, ensuring continuous and targeted enforcement. EnterpriseWeb makes it possible to have consistent policy-control over heterogeneous endpoints, systems, databases and devices. Moreover, it can extend security across domains for highly-integrated and transparent operations. It can do this both at the human and machine-level, where it can coordinate the deployment of probes and monitoring applications on to nodes for closed-loop control.

Systems such as those provided by Edgewise Networks and EnterpriseWeb could change the way that organisations operate information security across a diverse, hybrid private/public cloud platform.  By taking control of the interactions between different functional components, data can be secured as it traverses between the functions, and man-in-the-middle attacks can be prevented.  When combined with other approaches such as object-based storage and encryption of data on the move and at rest, an organisation can move away from firewall-style approaches which are failing as the edge of the network disappears to a far wider-reaching security approach that is easier to implement and manage.

Sure, Edgewise Networks and EnterpriseWeb are young companies that still must prove themselves, not only as technically viable and innovative in the long term, but also as able to market themselves successfully in a world where the technology comes second to the marketing message.

August 8, 2017  9:30 PM

Did Europe miss the SD WAN bus?

Bernt Ostergaard Profile: Bernt Ostergaard

2017 may well become the year that SD WAN (software defined wide area networking) routing and the SD WAN edge-to-cloud infrastructure paradigm are adopted by SMEs globally. It may also be the year when European telco manufacturing loses a big chunk of the global routing market to nimbler North American and Asian rivals. IDC's most recent market figures put the global market for SD WAN products and services at $225 million in 2015, rising to $1.9bn this year and growing at a 69% CAGR through to 2021, when it would hit $8bn.

The SD WAN industry now counts over 40 manufacturers with global distribution potential. They include all the incumbents (Nokia, Cisco, HPE, Huawei), entrants from neighbouring technologies such as WAN optimisation (Silver Peak), network security (Barracuda), network virtualisation (Citrix) and MPLS services (Aryaka), and pure plays like Viptela, Talari and Peplink. This gives customers a wide range of choice – dictated by configurations, prices, availability, existing infrastructure and demonstrated capabilities in similar vertical industry configurations.

Most importantly, the shift to SD WAN can come at a low CapEx level and with lower OpEx costs than hitherto – all with improved network performance. This is achieved by combining the company's existing WAN access channels (fixed, cellular, point-to-point, satellite, etc.), and thus achieving a higher utilisation level. SD WAN creates a single virtual access channel – and provides the software to give quality of service (QoS) over best-effort links – thus doing away with costly multi-protocol label switching (MPLS) support for latency-sensitive, mission-critical applications.
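
A toy illustration of that QoS-over-best-effort idea: steer each application class onto the cheapest available link that currently meets its latency budget, rather than reserving an MPLS circuit for it. The link metrics and thresholds below are invented purely for illustration.

```python
# Toy SD WAN path selection: pick the cheapest link whose measured
# latency meets the application's budget. All figures are invented.

LINKS = [
    # (name, measured round-trip latency in ms, relative cost per GB)
    ("broadband", 35, 1.0),
    ("4g",        60, 4.0),
    ("satellite", 550, 6.0),
]

APP_LATENCY_BUDGET_MS = {
    "voip":   50,    # latency-sensitive, mission-critical traffic
    "backup": 1000,  # bulk traffic, best effort is fine
}

def select_link(app: str):
    budget = APP_LATENCY_BUDGET_MS[app]
    candidates = [link for link in LINKS if link[1] <= budget]
    if not candidates:
        return None  # no link currently meets the QoS target
    return min(candidates, key=lambda link: link[2])  # cheapest acceptable

print(select_link("voip"))    # ('broadband', 35, 1.0)
print(select_link("backup"))  # ('broadband', 35, 1.0)
```

In a real deployment the latency measurements would be refreshed continuously, so traffic shifts between links as conditions change.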

An SD WAN infrastructure allows a company to centrally configure and manage their branch office access to cloud resources.
Obviously, with that many companies emerging on the market, the coming years will see consolidation down to the 5-10 global players that this market will support once it reaches maturity in 2020. This may deter large enterprises from going down the SD WAN route just yet – they see this as a vetting period. But for SMEs, now is a good time to engage with the SD WAN vendors, who are eager to develop industry-specific configurations.

European SD WAN players?

Nokia, with its Alcatel-Lucent take-over, also acquired the US-based SD WAN company Nuage Networks. This company helps service providers including BT, China Telecom, Telefonica and Telia to deliver fully automated and self-service SD-WAN systems. These allow enterprise customers to connect their users quickly and securely to applications in private and public clouds. Nuage Networks is the only major 'European' foothold in this exploding market – the rest is, to all intents and purposes, niche. In fact, I have only found two European vendors in this space:

  • Swedish vendor Icomera develops hardware/software solutions for passenger Internet access on trains and planes, as well as fleet management and telematics for remote monitoring.
  • In Germany, Viprinet’s hardware/software concatenates different types of access media (such as ADSL, SDSL, UMTS / HSPA+ / 3G, and LTE / 4G) for mobile, ad-hoc and remote location connectivity.

Bypassing the stumbling blocks

The SD WAN market is price sensitive, very competitive and capital intensive. So to enter this market, the VC community needs to be more active, as do public venture funds. Hitherto, we have seen little VC interest in this field, and what interest there is does not seem to be in for the long haul: investors prefer the usual 3-year get-in, get-out strategy. Public funding, including the huge EU funds in the Horizon 2020 programme, also seems to have bypassed this market opportunity.

The traditionally strong European telco industry has never played particularly well in the consumer and small business space. So manufacturers like Ericsson, Nokia and Siemens may not feel it is in their sweet spot. However, SD WAN is very much software based. Companies like Talari in the US generate as much revenue from software and services as they do from hardware sales. So, European software companies in the logistics and automotive business could build a new line of business in SD WAN using standardised hardware.

The European auto industry should also be very interested in this technology where mobile connections play a key role. Developing 5G-enabled SD WAN could align interests between telco vendors and auto manufacturers.

Now is the time for European software vendors to step up to this challenge. Not only are there the relatively straightforward examples as outlined above, but the emerging world of the internet of things (IoT) also offers a whole raft of new and lucrative opportunities.

It would be a pity to see such a green field site of new opportunities default to the incumbent US companies or the highly dynamic and hungry Asian companies. Europe can make a strong play, looking back to its heartlands of strong software innovation.

 


July 26, 2017  3:43 PM

Quocirca UK ICO Watch: GDPR fines may not be as scary as the vendors are telling you

Bob Tarzey Profile: Bob Tarzey

Are you fed up with vendor scare-mongering about the challenge of complying with the General Data Protection Regulation (GDPR) and the huge fines heading your way? UK-based organisations may be better off looking at the precedents set by the Information Commissioner’s Office (ICO), the body with responsibility for enforcing data protection in the UK. How the ICO has enforced the existing Data Protection Act (DPA) may provide guidance for the future.

First, let's get Brexit out of the way: the UK government stated its commitment to data protection in the Queen's Speech following the June 2017 General Election and confirmed that the GDPR will be implemented in the UK. The ICO has also confirmed this directly to Quocirca.

Under the DPA, the ICO has had the power to instruct organisations to undertake certain actions to better protect personally identifiable information (PII). In serious cases, it can issue enforcement notices and, in extreme cases, monetary penalties, up to a current maximum of £500K. It also brings prosecutions against individuals that have abused PII. Not every serious case ends in a fine: the July 2017 case against the Royal Free London NHS Foundation Trust for mis-sharing data with Google DeepMind resulted in an undertaking, not a monetary penalty.

The ICO is open about its activities; since June 2015 it has published each case where it has taken action. At the time of writing, it has in that time issued 87 monetary penalties, 52 undertakings and 35 enforcement notices. It has also brought 31 prosecutions. The DPA is not the only legislation considered by the ICO in taking these actions. It also enforces the 2003 Privacy and Electronic Communications Regulations (PECR), perhaps best known for the so-called Cookie Law, but also for limiting the use of spam SMS/email and nuisance phone calls.

The ICO’s monetary penalties

The average fine issued in the last two years has been £84K; 17% of the maximum. The two largest fines to date have been £400K: one under the DPA to TalkTalk Telecom for its widely publicised 2015 leak of 156,959 customer records, and one under PECR to Keurboom Communications for 99.5M nuisance calls.

Of the 87 fines, 48 were PECR related (average £95K). A further 13 were to charities for mis-use of data (average £14K). 8 were for some sort of data processing issue (average £68K) and 18 for data leaks (average £114K). A future blog post will look at the nature of these 18 data leaks.

The ICO also maintains and publishes a spreadsheet of data security incident trends, which lists all the UK data leaks it has become aware of; these number 3,902 since June 2015. So, the 18 fines issued for data leaks represent less than 0.5% of all cases the ICO could have considered.
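
Those headline figures reconcile with the category breakdown above; here is a quick check (all numbers as reported in this post):

```python
# Cross-checking the reported ICO figures: category counts and average fines (£K).
categories = {
    "PECR":            (48, 95),
    "charities":       (13, 14),
    "data processing": (8,  68),
    "data leaks":      (18, 114),
}

total = sum(count for count, _ in categories.values())
overall_avg = sum(count * avg for count, avg in categories.values()) / total

print(total)                           # 87 monetary penalties
print(round(overall_avg))              # 84 -> the £84K average fine
print(round(overall_avg / 500 * 100))  # 17 -> 17% of the £500K maximum

# The 18 data-leak fines against the 3,902 leaks recorded since June 2015:
print(round(18 / 3902 * 100, 2))       # 0.46 -> less than 0.5% of cases
```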

The ICO is too resource-stretched to pursue every data leak. As you would expect, it prioritises the worst incidents. Even then, it is reluctant to fine and has rarely come near to imposing the maximum penalty. The ICO's job is to protect UK citizens' data, not to bring down UK businesses. Sure, the ICO will have broader powers, and the possibility to impose higher penalties, under GDPR. However, if the ICO chooses to use these new powers with the same discretion as it has under the DPA, any data manager who has ensured their organisation is paying due diligence to the way it handles PII should not be losing too much sleep.

Quocirca presented these data and some other findings from its ICO research at a recent webinar, sponsored by RiskIQ, which can be viewed HERE.


July 11, 2017  1:45 PM

The importance of Operational Intelligence in emergency situations

Clive Longbottom Profile: Clive Longbottom

At a recent roundtable event organised by Esri UK, representatives of the North Wales police force, Wessex Water and the Environment Agency talked about how they were using geographic information systems (GIS) in their work to provide Operational Intelligence (OI).

The focus of the discussions revolved around how each organisation needed to deal with emergency situations – particularly around flood events.

For the Environment Agency, the focus is shifting from a reactive response to a more proactive position. Using a mix of internet of things (IoT) devices combined with more standard meteorological forecasting and satellite data, the agency aims both to avoid flooding by dealing with issues before they become problems and to respond better to problems that are unavoidable.

One example is using topographic data to predict water runoff in upland areas and then using analytics to better understand what that may mean further downstream. Nick Jones, a senior advisor at the Agency, described how it is also using real-time analytics for OI – for example, holding stocks of items such as mobile flood barriers, but being able to get them to the point of greatest need at the right time via a just-in-time logistics model. This allows for the optimisation of inventory – and avoids issues with the public when they find that items that could have prevented a flood were available, but in the wrong place.
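
The logistics side of this lends itself to a simple illustration: send a limited barrier stock to the sites with the highest predicted flood risk first. The sketch below is purely illustrative; the site names, probabilities and stock levels are invented, not the Agency's actual model.

```python
# Illustrative just-in-time allocation of mobile flood barriers:
# highest predicted flood probability is served first. All data invented.

BARRIERS_IN_STOCK = 40

predicted_need = {  # site -> (flood probability, barriers needed)
    "riverside-a": (0.9, 25),
    "valley-b":    (0.6, 20),
    "estuary-c":   (0.2, 15),
}

def allocate(stock: int, sites: dict) -> dict:
    """Greedy allocation: serve the highest-probability sites first."""
    plan = {}
    for site, (prob, need) in sorted(sites.items(), key=lambda s: -s[1][0]):
        sent = min(need, stock)
        plan[site] = sent
        stock -= sent
        if stock == 0:
            break
    return plan

print(allocate(BARRIERS_IN_STOCK, predicted_need))
# {'riverside-a': 25, 'valley-b': 15} -- estuary-c waits for restock
```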

Wessex Water also has to respond to such events. Floods can force sewage up into streets and gardens, or even into people's homes. Andy Nicholson, Asset Data Manager at the company, described how the company was using GIS-based OI to better prioritise where to apply its resources in emergency situations. For example, partnering with the Environment Agency to gain access to its data means that Wessex Water can track the progress of a flood – and can then both advise its customers of possible problems and allocate people and other resources to mitigate them, working to stop, redirect or slow down floodwater across the areas it has responsibility for.

Likewise, the North Wales police force has a responsibility to citizens. Dave Abernethy-Clark, a PC with the force, described how he came up with Exodus, an OI system using GIS data to provide the force with a better means of dealing with vulnerable people during events such as a flood (though it could equally be a fire, civil disturbance or any other event). Again, the real-time and predictive usage of OI can be invaluable: it is better to evacuate a vulnerable person before an event overtakes them.

However, doing this based on just basic safety assessments can mean that a vulnerable person is removed from their property where there is actually little real need to do so.  OI ensures that only those who are very likely to be impacted are identified and dealt with, minimising such upsets and saving resource costs and time.

Andy Nicholson from Wessex Water also emphasised that OI should never be a single-data-set approach. By pulling together multiple data sets, the end result is far more illuminating and accurate. The use of a flexible front end is also important – he discussed how one event resulted in a complex situation appraisal being shown on a screen. From this view, it looked like the event could be a major one that would impact a large number of customers. However, with a few extra filters in place, he managed to narrow this down to just a few core points, which allowed causality to be rapidly ascertained and the problem dealt with, with minimal impact on just a few customers.

The takeaway from the event was that OI is an increasingly valuable tool for organisations. The growing capabilities of the tools and the underlying power of the platforms they run on mean that real-time OI is now possible. The opening up of different organisations' data sets also means that other organisations can directly plug in to existing useful – and often free – data.

The way that these three seemingly disparate organisations worked together to deal with an emergency event such as a flood was apparent. With all of them (plus other groups) able to work against the same underlying data and using collaborative systems built over the OI platform, the sum total of the capability to deal with an event is greatly enhanced. How much this helps all of us should not be underestimated.


July 6, 2017  7:56 AM

The ever-growing IoT attack surface

Bob Tarzey Profile: Bob Tarzey

The theme of a recent Infosecurity Europe 2017 Strategy Talk, facilitated by Quocirca, was how to limit the ability of hackers to exploit the expanding IT attack surface created by the deployment of increasing numbers of IoT (Internet of Things) devices. The two panellists, representing the security vendors FireEye and ForeScout (the session sponsor), contended that a better integrated approach to network security was required.

Some estimates regarding future numbers of IoT devices run into the tens of billions (for example, Gartner, Juniper and McKinsey). Quocirca's own 2016 business-focused research, European Perceptions, Preparedness and Strategies for IoT Security, based on the short-term estimates of German and UK IT managers, was more conservative. However, many may be in denial about the scale of the IoT opportunity they will be expected to enable. The need for IoT device discovery and security is pressing.

The roll-out of IoT devices may be carefully planned and application-specific, or ad hoc, as the rising tides of shadow IT and consumerisation lead lines-of-business and end users to deploy their own devices. Pragmatic IoT security must be generic and able to deal with all types of devices, whether they have been endorsed by IT management or not. Both security products and network design have a part to play.

IoT security is a pressing issue for four reasons. First, there are data protection issues; devices may transmit sensitive information (even IP addresses may be considered personal data under the EU GDPR). Second, IoT devices can be used as ingress points to broader IT infrastructure. Third, IoT devices are being recruited to botnets for the ongoing launch of denial of service attacks. Fourth, attempts to disrupt business processes may be targeted at poorly defended IoT deployments.

This last point is perhaps the most worrying: many IoT deployments are all about the better monitoring and management of critical infrastructure. Such attacks have the potential for kinetic impact beyond cyber-space, for example causing power outages or disrupting transport systems, as well as enabling industrial espionage.

Quocirca's research shows that most organisations recognise the need to be able to discover and classify IoT devices. Furthermore, there is growing recognition that this must be achieved without on-device agents: the variety of devices and operating systems is too great, the device processing power often limited and, of course, ad hoc devices will be unknown when they first connect to a network.

Gateway to the IoT

IoT gateways are turning out to be fundamental from a design perspective (see the Quocirca Computer Weekly buyer's guide, No silver bullet for business IoT security). For planned IoT roll-outs, all devices associated with a given application can be deployed behind a single gateway where security functions can be aggregated. Gateways can also help with ad hoc device attachments, for example through isolating network segments for supply chain interactions and guest access.

IoT security needs to recognise new threats and ensure all relevant devices are protected from them. FireEye and ForeScout advocate that their integrated approaches can achieve this at the scale required for current and future IoT deployments.

ForeScout’s CounterACT technology acts as a gateway, discovering, classifying and continuously monitoring network attached devices and ensuring a given level of security. It does not require pre-installed agents to do this (although agents are available for advanced management of known devices). CounterACT can recognise hundreds of different IoT device types, ranging from sensors and probes to printers and security cameras as well as more traditional user end-points.

FireEye’s Network Threat Prevention Platform (NX Series) identifies potential threats on devices via its virtual execution engine (sandbox). It can also identify anomalous behaviours, for example, recognising and blocking the outbound call-backs made by malware to command and control (C&C) systems.

FireEye can inform CounterACT about the threats it discovers via ForeScout's ControlFabric, a means for sharing security information. CounterACT can then scan other endpoints for the presence of the newly identified threats and enact policy-based controls depending on the severity of the threat and the priority of the device. End-points can be quarantined, network access limited, remedial actions taken and notifications issued.
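
A simplified sketch of that flow follows; the device and indicator structures are hypothetical (ControlFabric's real interface is not shown). A newly identified threat triggers a fleet scan, and the response depends on threat severity and device priority.

```python
# Simplified sketch of the FireEye -> ForeScout flow described above.
# A new threat indicator triggers a scan; policy picks the response.
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    priority: str    # "critical" or "standard"
    indicators: set  # threat indicators found on the device

def respond(device: Device, indicator: str, severity: str) -> str:
    """Policy-based control: quarantine, restrict or take no action."""
    if indicator not in device.indicators:
        return "no action"
    if severity == "high" or device.priority == "critical":
        return "quarantine"
    return "limit network access and notify"

fleet = [
    Device("hvac-controller", "critical", {"malware-cnc-callback"}),
    Device("lobby-camera", "standard", {"malware-cnc-callback"}),
    Device("print-server", "standard", set()),
]

# The sandbox identifies a new C&C callback; scan the fleet and enact controls.
for device in fleet:
    print(device.name, "->", respond(device, "malware-cnc-callback", "medium"))
# hvac-controller -> quarantine
# lobby-camera -> limit network access and notify
# print-server -> no action
```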

ForeScout CounterACT and FireEye NX are complementary: the former is not equipped to recognise previously unknown threats, whilst the latter does not monitor new devices attaching to networks. The partnership is a good example of how integrated security can achieve a greater level of protection than either product achieves alone. Other network security and advanced threat protection products are available.


July 5, 2017  9:18 AM

GDPR and the UK: sorting fact from fiction

Bob Tarzey Profile: Bob Tarzey

How will data protection enforcement change in the UK once the EU General Data Protection Regulation (GDPR) comes into force? Will the GDPR come into force at all, with the UK planning to leave the EU?

Insight can be gained from looking at how the UK Information Commissioner’s Office (ICO) has enforced the existing 1998 Data Protection Act in recent years. Quocirca will be presenting analysis of recent ICO actions at a webinar on July 11th 2017.

Join Quocirca and RiskIQ to find out how many data leaks the ICO has recorded in the last two years and how many have led to actions being taken in the form of fines as well as prosecutions, enforcement notices and undertakings.

The biggest challenge with GDPR for many will be ensuring the new, more stringent administrative requirements are met. These will enable the ICO to act in many new areas. The webinar will discuss these, and how you can avoid the ICO's attention by better understanding how and where personally identifiable information (PII) is being processed in the name of your organisation.

More details and free registration are available HERE.

What you don’t know CAN hurt you. Are you GDPR PII compliant?

https://www.brighttalk.com/webcast/14683/267289?utm_source=RiskIQ&utm_medium=brighttalk&utm_campaign=267289

 


July 4, 2017  11:08 AM

Out of the shadows – IT as a service broker

Rob Bamforth Profile: Rob Bamforth

Organisations need to balance innovation and business improvement with good governance and efficient use of IT budgets and resources. But this is not easy. Plenty of people right across the organisation think they know what IT they need and can easily go out and buy it.

The term 'shadow IT' has become a popular way to describe this in recent years. But the practice of parts of the business buying IT systems, software or services without the knowledge of the IT department is not new. It has happened since the days of the first mini servers, dot matrix printers and personal computers.

However, a couple of things have changed.

Bypassing the IT function

IT has become more affordable. From consumer-priced mobile devices to subscription based services in the cloud, technology is widely available and relatively inexpensive. There is also an increase in personal choice and awareness of IT potential. Individuals know what they want and will often make their own choices of personal technology, hence the rise of bring your own device (BYOD).

Lines of business too have greater understanding of what is available. Despite not quite understanding the full implications, they want to get on with digitising assets and making processes more efficient. The IT function can often come to be seen as a blockage, or lacking awareness of the business drivers. So, a line of business makes its own decisions and spends its own budget on IT. This is then outside of the control and visibility of the IT function.

Anecdotal evidence suggests that in some organisations, this ‘unofficial’ IT spending can be much larger than the central IT budget. It is something that many IT departments have to live with, but potentially it will cause problems of governance. It may also be much more inefficient and costly overall than having IT co-ordinated through a central IT function.

Encouraging innovation

However, the appetite for lines of business to procure technology to bring about a 'digital transformation' should be encouraged. There is clearly an unfulfilled business need, and shadow IT is a symptom of trying to address that need. Rather than being overly defensive and resistant, the IT function should see this as an opportunity to take a different approach. It can then improve relationships by fostering the following process:

  • Innovate – those closest to the business are likely to know what needs to be done, but not necessarily the best way to accomplish it. IT could support and encourage innovation in the business, rather than trying, and often failing, to do it itself.
  • Accelerate – scale up through IT and the line of business working more closely together, developing the implementation required to both meet the business need and fit within a supportable IT strategy.
  • Operate – once this approach is delivering results, shift to a production model. This can be orchestrated by IT, with the business offloading the task to be run by IT, as a service.

In this way, IT and the business move together through the ‘Cycle of Innovation’. This is detailed in Geoffrey Moore’s book, Dealing with Darwin (Invent, Deploy, Context, Offload). It describes where something non-mission critical but offering differentiation is invented, then fully deployed at scale to become mission critical and core to the business. As the differentiation diminishes, it moves from core to context. It still has to be managed at scale, but can at some point be offloaded to free-up resources for the next core innovation.

IT as a service broker

The IT function cannot deliver everything, but it is well placed to understand where value can be added and where there are others with the right capabilities. The process of managing how and what services have to be delivered needs to be co-ordinated. It involves the technical, commercial and legal integration of internal capabilities and those from external service providers. In doing this, IT can be a 'service broker' to the organisation, not an obstacle or limiting factor to change. Rather than a focus on hardware, software and services, bought and deployed in some combination to try to meet often misunderstood business needs, there is an opportunity to think differently about what IT involves and how it is consumed by the business.

Ultimately, IT can then be measured on the value obtained by the business, not the cost (or value) of the assets being employed. Diverse innovation is encouraged and supported within a well-managed, centrally coordinated strategy. For more details about the 'IT as a service broker' approach, download Quocirca's free report here.


June 21, 2017  8:24 AM

Bad-bots in financial services

Bob Tarzey Profile: Bob Tarzey

These days, some automated financial services are taken for granted; automated teller machines (ATMs) or cash points have been around since the 1960s, internet banking since the 1990s. Few these days write cheques made out to cash or visit banks to make transfers.

Now there is a new wave of automation underway, driven by software robots (or bots), which are changing not just how financial organisations interact with customers but how they deal with each other. Among other things, bots can provide financial advice and carry out trades. Bots are especially good at complex repetitive activities such as producing wealth assessments and gathering the data to populate price comparison sites, for example to show comparative quotes for insurance.

All this automated activity can benefit both banks and their customers. However, there is a downside: criminals are using bots too. In 2017, robbing a bank is much more likely to be perpetrated by a cybercriminal with an army of bots than a mobster with a sawn-off shotgun. So-called bad-bots are hard at work cracking access credentials, seeking out vulnerabilities in web sites and online applications, obfuscating targeted criminal activity with volume denial-of-service attacks and so on.

The trouble is distinguishing the good-bots from the bad, and working out what some of the ones in-between are up to. Credential-cracking bots testing bank accounts for the use of commonly used passwords need to be blocked; only an organisation's own vulnerability scanners should be allowed to probe infrastructure; screen-scraping bots from price comparison sites may be OK, those from unscrupulous competitors not so.
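
A toy policy table for those distinctions might look like the sketch below. The detection signals (source IP, agent name, login-failure rate) are invented for illustration; real bot-mitigation products draw on far richer behavioural signals.

```python
# Toy bot-sorting policy for the cases described above. Signals invented.

OWN_SCANNERS = {"203.0.113.10"}            # the bank's own vulnerability scanner
APPROVED_SCRAPERS = {"price-compare-bot"}  # screen-scrapers we tolerate

def classify(ip: str, agent: str, failed_logins_per_min: int, probing: bool) -> str:
    if failed_logins_per_min > 20:
        return "block"       # credential-cracking pattern
    if probing and ip not in OWN_SCANNERS:
        return "block"       # only our own scanners may probe infrastructure
    if agent in APPROVED_SCRAPERS:
        return "allow"       # good bot, e.g. a price comparison site
    if agent.endswith("-bot"):
        return "rate-limit"  # unknown bot: slow it down and watch it
    return "allow"

print(classify("203.0.113.10", "internal-scanner", 0, probing=True))    # allow
print(classify("198.51.100.8", "price-compare-bot", 0, probing=False))  # allow
print(classify("198.51.100.9", "shady-bot", 2, probing=False))          # rate-limit
print(classify("198.51.100.7", "Mozilla/5.0", 55, probing=False))       # block
```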

Fortunately, there are now technologies to help sort bots out based on policy. Quocirca's Ultimate Guide to how bad bots affect financial services is free to download and provides more detail about how bots are operating in financial services and how to let the good ones through and stop the bad ones in their tracks. The full list of Quocirca's Cyber-Security Threat Series on Mitigating Bad Bots can be viewed HERE. These e-books were sponsored by Distil Networks.


June 20, 2017  11:15 AM

Passing the buck – who pays for IT?

Rob Bamforth Profile: Rob Bamforth

The word 'investment' is often put next to IT spending and budgets in order to make the costs sound palatable. Organisations want a 'return on investment'; many try to measure it and some even succeed. But not many. Perhaps IT could be thought of as something to consume, not own?

The problem is that, like most industrial progress, IT is still rapidly evolving and expanding its capabilities. This means more spending on IT. According to recent Quocirca research, IT spending is flat for two in five UK companies but growing for the rest. For one in five it is growing fast. Costs are rising because IT has the potential to be used by more people for more purposes. IT departments struggle to cope with this, as they often lack staff or the right skills and end up having to spend much of their time and effort simply supporting and managing what they already have. Innovation is hamstrung by legacy.

Simply increasing budgets (even where this is possible) is not the solution. From the research, IT capital expenditure constraints are a significant reason why it is difficult to secure IT funds, but changes in business requirements tops the list and growth in user demand is also high up. Financing needs to go hand in hand with flexibility. Clearly a different approach is required.


Looking to the cloud

For many, public cloud services provide an opportunity to shift expenses to the operational expenditure (OpEx) side of the ledger, which many chief financial officers (CFOs) appreciate. Quocirca's research shows cloud adoption is widespread in around a fifth of companies – both public and private cloud – and around half of UK companies expect overall cloud usage to grow. The primary drivers for public cloud, in particular, revolve around reducing upfront outlay and adding flexibility in both cost and headcount. Even security, once a significant concern, is now seen as being addressed.

There are, however, significant challenges with cloud. A pay-as-you-go cost model seems appealing, but significant architectural changes and costs may be needed up front. Once in place, costs can grow unpredictably. Most cloud services deliver flexible technical capacity, with investment costs spread as OpEx, but this does not necessarily match the business requirement.

Hybrid cloud, hybrid financing

While public cloud has the flexibility to provide certain resources – storage, compute power and application platforms on demand – this does not account for all IT requirements. The end user needs other IT systems in order to access cloud services. A hybrid architecture, with some elements deployed on premise, often fits well from a technical, management and governance perspective. But it may not offer complete commercial flexibility, as it will require capital expenditure (CapEx) and upfront investment.

Increasingly there is an appetite for delivering more IT capabilities as a service. This includes the well-established managed print services (MPS). The research also shows increasing interest in desktop as a service, mobile device management and video and collaboration services.

This means there needs to be a more all-encompassing hybrid model for financing, based on consumption of IT services in a manner that makes sense to the business user, who ultimately pays for it and can measure the value. This is not something that should be tackled in the 'shadows', hidden away from IT. Shadow IT can be embraced and managed by the IT function, which can then act as a service broker by integrating a mix of internal and external capacity and capabilities to deliver the services required by the business.

Consumption based IT

Organisations prefer to spread costs over time and make them dependent on usage. Spreading costs over time means that upfront CapEx can be avoided, with predictable recurring costs spread like a subscription. Rather than leaving capacity variable, an initial service level should be agreed upfront. Pre-agreed and priced 'burst through' capacity could be made available and then measured and billed for only if used. Instead of just applying this approach to individual services from the public cloud, with the right financing it could be applied to all IT systems; moving the model from 'owning' IT assets to 'consuming' them.
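
A worked sketch of that billing model, with invented prices: a fixed subscription covers the agreed service level, and pre-priced burst capacity is billed only when actually consumed.

```python
# Consumption-based billing sketch: fixed fee for the agreed level,
# plus pre-priced burst units billed only if used. Prices are invented.

BASE_LEVEL_UNITS = 100     # agreed upfront service level (e.g. desktop seats)
BASE_MONTHLY_FEE = 5000.0  # predictable recurring cost (£)
BURST_UNIT_PRICE = 65.0    # pre-agreed price per extra unit actually used

def monthly_bill(units_used: int) -> float:
    burst = max(0, units_used - BASE_LEVEL_UNITS)
    return BASE_MONTHLY_FEE + burst * BURST_UNIT_PRICE

print(monthly_bill(90))   # 5000.0 -- under the agreed level, no extra charge
print(monthly_bill(120))  # 6300.0 -- 20 burst units billed because they were used
```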

With this type of financial model, organisations have the technical flexibility akin to public cloud, shifting CapEx into OpEx. IT can be consumed as a service, with the value delivered much more closely aligned to costs. The details from Quocirca’s research (commissioned by Rigby Capital Ltd) and information for organisations considering how consumption based IT might work for their business, is available in this free to download report, “The next step in digital business transformation”.


June 16, 2017  5:12 PM

Connect, collect, analyse and activate NOW!

Bernt Ostergaard Profile: Bernt Ostergaard

There is a chill stirring in the corridors of corporate power. It’s the data driven company paradigm that is subverting the very foundations of corporate hierarchies and established divisions of labour. Well-established positions within the corporate power structure risk being undermined by new internal alliances, new business processes and shifting revenue streams.

Historically, it's not always the strongest businesses that survive in times of fast-paced technology change – it's the companies that are best able to incorporate change to their advantage. Today, very small companies can challenge even the market leaders using pay-as-you-go IT resources and disruptive business propositions. It's cold at the top, as an analysis of S&P 500 companies (http://www.cnbc.com/2014/06/04/15-years-to-extinction-sp-500-companies.html) demonstrated: it found that the average lifespan of these top-500 US companies has dropped to 15 years.

Successful data driven companies are forging new alliances across their production, marketing, sales and IT teams to bolster customer satisfaction levels and address changed buying behaviour. At the recent TIBCO European summit in Berlin, many of the talks focused on these levers of change.

[Image: interconnect everything – the message coming out of the TIBCO NOW Summit]

It's an API World that requires analysis and decision-making

Very few companies – and certainly not companies with many years of existence – have IT systems and applications that interoperate effortlessly. The drag of legacy is inevitable. The traditional response has been the application program interface (API). By describing a program interface and working towards standardised gateways between applications, using brokers like Google Apigee and Red Hat 3scale, companies can exchange data between their applications. Of course, there are actually thousands of API definitions; just finding them and keeping them updated is a serious task.
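
A minimal illustration of why such a catalogue matters: every integration resolves against a described interface, and keeping thousands of such descriptions current is the hard part. The API definitions below are invented.

```python
# Tiny illustrative API catalogue: integrations resolve operations
# against maintained interface descriptions. Definitions are invented.

API_CATALOGUE = {
    "orders.v1": {
        "base_url": "https://api.example.com/orders/v1",
        "operations": {"get_order": {"method": "GET", "path": "/orders/{id}"}},
    },
    # ...thousands more entries in a large, heterogeneous estate
}

def build_request(api: str, operation: str, **params) -> str:
    """Resolve an operation against the catalogue into a concrete request."""
    spec = API_CATALOGUE[api]
    op = spec["operations"][operation]
    return op["method"] + " " + spec["base_url"] + op["path"].format(**params)

print(build_request("orders.v1", "get_order", id=42))
# GET https://api.example.com/orders/v1/orders/42
```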

To get into this market, TIBCO acquired Mashery, a software provider of API services, in 2015. This company develops and maintains a large library of APIs and provides them as-a-service, so it has become indispensable for a wide range of companies with large and heterogeneous application estates, as well as companies needing to interface with customers' and partners' applications. There are clear and present challenges to widespread adoption of API solutions. One is security, where API-security companies like Elastic Beam point out that signature and rules-based security tools do not detect API attacks.

The other challenge is user friendliness. If you are going to integrate lines-of-business and enable fishing in your data lake, you need to ensure that the IT gear needed is easy to use for everyone. Users outside the IT community have very little patience with slow or complicated systems, let alone systems that require reading a manual! Horizontal business tools require connectivity – the ability to find and sort the relevant data. Then follows the analytics phase, and finally identifying what actions to undertake.

At the TIBCO event, we met some of the customers:

Small business case study:

The London Theatre Direct (LTD) company (www.londontheatredirect.com) is a ticket aggregator selling tickets on-line to London West-End performances. The company has three people in London and 30 back office developers in Prague. It is completely AWS cloud-based and relies on Mashery to keep its APIs running. Building up this business required interfacing with a host of different ticket systems and then creating a unified and easy-to-use selection and booking platform. The early contributors to the aggregator platform soon experienced faster ticket sales there than on their own sites. So three years into the process, the tide turned and theatres now come to LTD. The company also white-labels its platform to other ticket outlets, and has expanded its operations abroad.

Large business case study:

The Mercedes Formula 1 team (www.mercedesamgf1.com) competes in the heady world of extreme motor sports. The F1 industry epitomises the constant challenge to improve efficiency, sturdiness and adaptability – not least when servers and all peripherals are transported around the world 21 times a year. During a race, decisions must be based on extracting the right data from the 3.5TB of car performance data that is accumulated on board. Too much data sprawl is just more noise. There is a need to economise when your download window is the 60-second pit stop for tyre changes and refuelling. To analyse the data and optimise car performance the Mercedes team relies on TIBCO Spotfire and StreamBase. While this provides analysis of both data at rest and streaming data, Mercedes is constantly looking for better automation and machine learning tools.
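
A back-of-the-envelope check shows why filtering on board matters. The 3.5TB figure is from the text; the 10 Gbit/s garage link speed is purely an assumption for illustration.

```python
# How much of the on-board data fits through a 60-second download window?
# 3.5TB is from the text; the 10 Gbit/s link speed is an assumption.

CAR_DATA_TB = 3.5
WINDOW_S = 60
LINK_GBIT_S = 10  # hypothetical garage link speed

transferable_tb = LINK_GBIT_S / 8 * WINDOW_S / 1000  # Gbit/s -> GB/s -> TB
print(round(transferable_tb, 3))               # 0.075 TB, i.e. 75 GB
print(f"{transferable_tb / CAR_DATA_TB:.1%}")  # ~2.1% of the on-board data
```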

Analysing high volumes of data is also a priority in the Mercedes F1 marketing department, which has 11 million followers on Facebook. Hitherto, they have all been getting the same feeds. TIBCO data scientists are now enrolled in Mercedes' strategy planning to tailor information to the preferences of its followers. The aim is to climb higher on followers' Facebook walls using tools such as software regression analysis. Messages and icons on users' walls can also be seen by others.

 

The data driven challenge

Data driven organisations need to bring together the two symbiotic elements of interconnection, using APIs, and augmentation, using analysis of data at rest and streaming data. The guiding principles are: cloud first, ease of use, and industrialisation. The software underpinning this drive must be pleasant and appealing to a wider group of users in the enterprise. Developers need to address the whole range of 'personas' who need to interact with the data. Ease of use and robust solution packages must be industry specific and maintained continuously.


