The word ‘investment’ is often put next to IT spending and budgets in order to make the costs sound palatable. Organisations want a ‘return on investment’, many try to measure it and some even succeed. But not many. Perhaps IT could be thought of as something to consume, not own?
The problem is that, like most industrial progress, IT is still rapidly evolving and expanding its capabilities. This means more spending on IT. According to recent Quocirca research, IT spending is flat for two in five UK companies but growing for the rest; for one in five it is growing fast. Costs are rising because IT has the potential to be used by more people for more purposes. IT departments struggle to cope, as they often lack the staff or the right skills and end up spending so much time and effort simply supporting and managing what they already have. Innovation is hamstrung by legacy.
Simply increasing budgets (even where this is possible) is not the solution. From the research, IT capital expenditure constraints are a significant reason why it is difficult to secure IT funds, but changes in business requirements tops the list and growth in user demand is also high up. Financing needs to go hand in hand with flexibility. Clearly a different approach is required.
Looking to the cloud
For many, public cloud services provide an opportunity to shift expenses to the operational expenditure (OpEx) side of the ledger, which many chief financial officers (CFOs) appreciate. Quocirca’s research shows cloud adoption is widespread in around a fifth of companies – both public and private cloud – and around half of UK companies are expecting overall cloud usage to grow. The primary drivers for public cloud in particular revolve around reducing upfront outlay and adding flexibility in both cost and headcount. Even security, once a significant concern, is now widely thought to be being addressed.
There are significant challenges with cloud. A pay-as-you-go cost model seems appealing, but there may need to be significant architectural changes and costs up front. Once in place, costs can grow unpredictably. Most cloud services are delivering flexible technical capacity, with investment costs spread as OpEx, but this does not necessarily match the business requirement.
Hybrid cloud, hybrid financing
While public cloud has flexibility for providing certain resources – storage, compute power and application platforms on demand – this does not account for all IT requirements. The end user needs other IT systems in order to access cloud services. A hybrid architecture, with some elements deployed on premise, often fits well from a technical, management and governance perspective. But it may not offer complete commercial flexibility, as it will require capital expenditure (CapEx) and upfront investment.
Increasingly there is an appetite for delivering more IT capabilities as a service. This includes the well-established managed print services (MPS). The research also shows increasing interest in desktop as a service, mobile device management and video and collaboration services.
This means there needs to be a more all-encompassing hybrid model for financing, based on consumption of IT services in a manner that makes sense to the business user, who ultimately pays for it and can measure the value. This is not something that should be tackled in the ‘shadows’, hidden away from IT. Shadow IT can be embraced and managed by the IT function, which can then act as a service broker, integrating a mix of internal and external capacity and capabilities to deliver the services required by the business.
Consumption based IT
Organisations prefer to spread costs over time and according to usage. ‘Over time’ means that upfront CapEx can be avoided, with predictable recurring costs spread like a subscription. Rather than open-ended variability in capacity, an initial service level should be agreed upfront. Pre-agreed and priced ‘burst through’ capacity could then be made available, measured and billed for only if used. Instead of applying this approach just to individual services from the public cloud, with the right financing it could be applied to all IT systems, moving the model from ‘owning’ IT assets to ‘consuming’ them.
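The pricing model described above can be illustrated with a short sketch. The function name, baseline, fee and burst rate below are invented for the example: a fixed subscription covers the agreed service level, and pre-priced ‘burst through’ capacity is billed only when actually used.

```python
def monthly_charge(units_used, baseline_units=100, subscription_fee=5000.0,
                   burst_rate=75.0):
    """Return the month's charge: a flat fee for the agreed service level,
    plus a metered rate for any usage beyond it (illustrative figures)."""
    burst_units = max(0, units_used - baseline_units)
    return subscription_fee + burst_units * burst_rate

# Within the agreed service level the cost is flat and predictable...
print(monthly_charge(80))    # 5000.0
# ...and burst capacity is measured and billed only if used.
print(monthly_charge(120))   # 5000.0 + 20 * 75.0 = 6500.0
```

The point of the sketch is the shape of the model, not the numbers: recurring costs stay predictable, while the variable element appears only when the business actually consumes more.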
With this type of financial model, organisations have technical flexibility akin to public cloud, shifting CapEx into OpEx. IT can be consumed as a service, with the value delivered much more closely aligned to costs. The details from Quocirca’s research (commissioned by Rigby Capital Ltd), along with information for organisations considering how consumption-based IT might work for their business, are available in the free-to-download report, “The next step in digital business transformation”.
There is a chill stirring in the corridors of corporate power. It’s the data driven company paradigm that is subverting the very foundations of corporate hierarchies and established divisions of labour. Well-established positions within the corporate power structure risk being undermined by new internal alliances, new business processes and shifting revenue streams.
Historically, it is not always the strongest businesses that survive in times of fast-paced technology change – it is the companies best able to turn change to their advantage. Today very small companies can challenge even the market leaders using pay-as-you-go IT resources and disruptive business propositions. It’s cold at the top, as an analysis of S&P 500 companies (http://www.cnbc.com/2014/06/04/15-years-to-extinction-sp-500-companies.html) demonstrated; it found that the average lifespan of a company in the index has dropped to 15 years.
Successful data driven companies are forging new alliances across their production, marketing, sales and IT teams to bolster customer satisfaction levels and address changed buying behaviour. At the recent TIBCO European summit in Berlin, many of the talks focused on these levers of change.
Very few companies – and certainly not companies with many years of existence – have IT systems and applications that interoperate effortlessly. The drag of legacy is inevitable. The traditional response has been the application programming interface (API). By describing a program interface and working towards standardised gateways between applications, using brokers such as Google Apigee and Red Hat 3scale, companies can exchange data between their applications. In practice, of course, there are thousands of API definitions; just finding them and keeping them updated is a serious task.
To get into this market, TIBCO acquired Mashery, a software provider of API services, in 2015. The company develops and maintains a large library of APIs and provides them as a service, and has become indispensable for a wide range of companies with large and heterogeneous applications, as well as companies needing to interface with customers’ and partners’ applications. There are clear and present challenges to widespread adoption of API solutions. One is security, where API-security companies like Elastic Beam point out that signature and rules-based security tools do not detect API attacks.
The other challenge is user friendliness. If you are going to integrate lines-of-business and enable fishing in your data lake, you need to ensure that the IT gear needed is easy-to-use for everyone. Users outside the IT community have very little patience with slow or complicated systems, let alone systems that require reading a manual! Horizontal business tools require connectivity, the ability to find and sort the relevant data. Then follows the analytics phase and finally identifying what actions to undertake.
At the TIBCO event, we met some of the customers:
Small business case study:
The London Theatre Direct (LTD) company (www.londontheatredirect.com) is a ticket aggregator selling tickets on-line to London West-End performances. The company has three people in London and 30 back office developers in Prague. It is completely AWS cloud-based and relies on Mashery to keep its APIs running. Building up this business required interfacing with a host of different ticket systems and then creating a unified and easy-to-use selection and booking platform. The early contributors to the aggregator platform soon experienced faster ticket sales here than on their own. So three years into the process, the tide turned and theatres now come to LTD. The company also white labels its platform to other ticket outlets, and has expanded its operations abroad.
Large business case study:
The Mercedes Formula 1 team (www.mercedesamgf1.com) competes in the heady world of extreme motor sports. The F1 industry epitomises the constant challenge to improve efficiencies, sturdiness and adaptability – not least when servers and all peripherals are transported around the world 21 times a year. During a race, decisions must be based on extracting the right data from the 3.5TB of car performance data that is accumulated on board. Too much data sprawl is just more noise. There is a need to economise when your download window is the brief pit stop for a tyre change. To analyse the data and optimise car performance the Mercedes team relies on TIBCO Spotfire and Streambase. While this provides analysis of both data at rest and streaming data, Mercedes is constantly looking for better automation and machine learning tools.
Analysing high volumes of data is also a priority in the Mercedes F1 marketing department, with 11 million followers on Facebook. Hitherto they have all been getting the same feeds. TIBCO data scientists are now involved in Mercedes’ strategy planning to tailor information to the preferences of its followers. The aim is to climb higher on followers’ Facebook walls using tools such as regression analysis. Messages and icons on users’ walls can also be seen by others.
The data driven challenge
Data driven organisations need to bring together the two symbiotic elements of interconnection using APIs and augmentation using analysis of data at rest and streaming. The guiding principles are: cloud first, ease of use, and industrialisation. The software underpinning this drive must be pleasant and appealing to a wider group of users in the enterprise. Developers need to address the whole range of ‘personas’ who need to interact with the data. Ease of use and robust solution packages must be industry specific and maintained continuously.
May 16th and 17th 2017 saw Eskenzi PR stage its 11th two-day IT Security Analyst Forum, as usual in London. The morning of the second day (a Wednesday) was the customary CISO (chief information security officer) roundtable: a coming together of 20 or so IT security leaders from blue-chip UK enterprises and public sector organisations to share their views with the analysts and the IT security vendors that sponsor the event.
The timing was interesting, as although no one knew it in the run-up, the event took place just after the WannaCry weekend (the global release of a worm carrying ransomware that infected certain unpatched and unprotected devices running Microsoft Windows). Speculation was rife on the Tuesday night that there would be gaps in the CISO line-up as some absented themselves to deal with the aftermath. Not so, they all turned up.
It was already becoming apparent that the WannaCry attack was not as bad as many had speculated. True, many of the CISOs’ weekends had been interrupted to assess any impact, but by Monday most felt their update and protection regimes had done the job. There was little sympathy for the more lackadaisical organisations that had been hit. It may be true that some organisations still have to run now-unsupported Windows XP devices, but this does not mean they should be unprotected from intrusions.
The CISOs had plenty of other things to talk about. The impending GDPR (the EU General Data Protection Regulation) inevitably came up a few times. However, as Quocirca speculated in its recent Computer Weekly GDPR buyer’s guide, most CISOs felt their organisations were well on their way with their GDPR compliance plans. Generally it was considered that personal data has already been secured to deal with existing requirements, so the main new challenges were around data processing rules. There was a generally positive view of the regulation’s enhanced privacy aims and, for the record, the CISOs did not think Brexit would make much difference to UK data privacy laws.
Other issues arising included the need for CISOs to engage with their boards, to demonstrate the value of investing in IT security, especially measured against the value of data. To this end, some saw WannaCry as a wake-up call, others as a distraction (an attack from yesteryear, which should have been repelled unnoticed). There was a strong message to the vendors to focus more on integration within and between their products. DDoS (distributed denial of service) attacks are still a problem, but one that can be overcome.
One gripe was that more could be done to keep crime at bay on the internet (if criminals were running down the high street on a smash-and-grab raid, law enforcers would not stand by). That said, there was general acceptance that public cloud providers were building platforms that could meet enterprise security needs, and that most issues arose from users of such platforms poorly managing access rights.
The event would not happen without the security vendors who sponsor it. The first day is a kind of speed-dating where the analysts get to meet each vendor and hear its latest news and value proposition.
Corero’s SmartWall blocks DDoS attacks at scale, especially for service providers, cleaning up internet traffic for us all. Barracuda firewalls could have protected networks from WannaCry, if well managed. To that end, FireMon has broadened its multi-vendor firewall management capability to other network devices and cloud-based deployments. If you are going to store sensitive data in the cloud, then perhaps you should consider bringing your own encryption keys, as enabled by Eperi.
Even if WannaCry had managed to get past network defences, then Darktrace’s threat detection with its machine learning should recognise the abnormal behaviour on your network and block it; Cylance would do the same on individual end-points. One of your security products may have safely tested WannaCry in a Lastline sandbox (although the WannaCry kill switch aimed to prevent this).
When it comes to controlling access, MIRACL’s new Trust ZFA offering uses software tokens to authenticate users and devices, including Internet of Things (IoT) devices with no direct end user. For the heavyweight stuff, it could be worth turning to Belden-owned specialist Tripwire, which has a number of existing and new offerings for securing the industrial Internet of Things (IIoT).
And if your organisation does not want to do any of this itself, then NTT Security has pulled together the resources of its sister companies Dimension Data, NTT Communications and NTT Data to provide robust security across its portfolio and on to its customers.
Thanks to Eskenzi PR for another great event, the CISOs for their time and the vendors for their sponsorship. The WannaCry criminals should also be acknowledged for providing a talking point, which despite the ill-intent, did not have the impact that at first seemed possible; perhaps in-part due to some of the innovation on show last week.
The EU General Data Protection Regulation (GDPR), which takes effect on 25th May 2018, could prove to be a catalyst to change the existing haphazard approach to print security.
Networked printers and multifunction printers (MFPs) are often overlooked when it comes to wider information security measures. Yet these devices store and process data, and as intelligent devices have the same security vulnerabilities as any other networked endpoint. With Quocirca’s recent research revealing that almost two thirds of large organisations have experienced a print-related data breach, organisations cannot afford to be complacent. The biggest incentive to rethink print security is the substantial potential fines imposed by the GDPR: infringement can attract a fine of up to 4% of total global annual turnover or €20m (whichever is the higher).
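The fine ceiling described above is a simple ‘greater of’ rule, which can be made concrete with a short illustrative calculation (the function name and example turnovers are invented for the sketch):

```python
def max_gdpr_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound of a GDPR fine: the greater of 4% of total global
    annual turnover or a flat EUR 20 million."""
    return max(0.04 * global_annual_turnover_eur, 20_000_000.0)

# For a EUR 100m-turnover firm, 4% is only EUR 4m, so the EUR 20m floor applies.
print(max_gdpr_fine(100_000_000))    # 20000000.0
# For a EUR 1bn-turnover firm, 4% is EUR 40m, which exceeds the floor.
print(max_gdpr_fine(1_000_000_000))  # 40000000.0
```

The practical point is that the €20m floor means even mid-sized organisations face a material maximum exposure, while for the largest firms the 4% element dominates.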
Securing the print environment
Today’s smart MFPs have evolved into sophisticated document processing hubs that, in addition to print and copy, enable the capture, routing and storage of information. However, as intelligent, networked devices, they have several points of vulnerability. A printer or MFP is effectively an Internet of Things (IoT) device and as such, left unsecured, is an open door into the entire corporate network. Without the appropriate controls, information in transit to, or stored on, the device can be accessed by unauthorised users. The risks are real – recent Quocirca research indicates that almost two thirds of large organisations have suffered a print-related data breach.
There are two key issues – the printer/MFP as an access point to the network, and the printer/MFP as a storage device for “personally identifiable information” (PII).
Mitigating the print security risk and addressing GDPR compliance
As critical endpoints, printers and MFPs must be part of an overall information security strategy. This should ensure that all networked printers and MFPs are protected at a device, document and user level. This means, for instance, that data is encrypted in transmission, hard drives are encrypted and overwritten, print jobs are only released to authorised users and devices are protected from malware.
Many organisations may believe that they are covered by existing technology, but in many cases this does not protect against the latest threats. Operating a large, mixed fleet of old and new devices can leave gaping security holes.
Given the complexity of print security in large organisations, particularly those with a diverse fleet, Quocirca recommends seeking guidance from vendors that understand the internal and external risks and the risk of unprotected data on printer/MFP devices. Organisations should select vendors that can address both legacy and new devices and offer solutions for encryption, fleet visibility and intelligent tracking of all device usage. This should ensure the ability to track what information is being printed or scanned, for instance, where and on what device, therefore enabling faster breach remediation.
Managed print service (MPS) providers should be the first port of call, as they are best positioned to advise on print security technology. Advanced managed print security services – with offerings available from vendors including HP, Lexmark, Ricoh and Xerox – aim to improve resilience against hacking attempts on devices, rapidly detect malicious threats, continually monitor the print infrastructure and enhance security policies and employee awareness.
Look for comprehensive print security services that offer:
- Assessment: A full security assessment of the printer infrastructure to identify any security gaps in the existing device fleet. This should be part of the broader Data Protection Impact Assessment (DPIA) that an organisation may conduct internally or using external providers. Recommendations can be made for ensuring all devices use data encryption, user access control and features such as hardware disk overwrite (the erasure of information stored on the MFP hard disk). Also look to use endpoint data loss prevention (DLP) tools at this stage to gain insight as to what likely PII could be transferring via an MFP (for instance scanning personal information via the MFP to email or cloud storage).
- Monitoring: In order to monitor and detect breaches, ongoing and proactive monitoring ensures devices are being used appropriately in accordance with organisational policies. More advanced print security controls use run-time intrusion detection. Integration with Security Information and Event Management (SIEM) systems can help accelerate the time to identify and respond to a data breach, which is key to GDPR compliance. Consider third-party managed services support in order to streamline data logging and security intelligence gathering.
- Reporting: GDPR’s demanding reporting requirements can be addressed through reporting usage by device and user. This will highlight any non-compliant behaviour or ‘gaps’ in controls so that they can be identified and addressed, and allow audit trails to be created to support the demonstration of compliance.
GDPR is a reminder that organisations should proactively assess their security position. Organisations must move quickly to understand the legislation and put appropriate measures in place. Ultimately print security is part of a broader GDPR compliance exercise, and it is vital that organisations act now to evaluate the security of their print infrastructure.
For more information on the steps that should be taken to protect the print environment in light of GDPR, please contact Louella.Fernandes@quocirca.com
Keeping anything safe and secure involves multiple considerations. Avoid putting yourself in danger, put up a protective ‘shield’, detect when that is compromised, take mitigating action.
When it comes to communication and the modern hyperconnected world based on open protocols, it’s much harder to avoid danger. Hence shields and detection are the significant part of the online security proposition.
Mitigation is another matter. While some of it can be accomplished financially or through after the event actions such as insurance claims, smarter or a more holistic use of technology can also offer useful and timely support reducing the negative impact of incidents.
Consider email. From a business perspective, many will feel it is a double-edged sword – can’t live with it, can’t live without it. The general expectation with email is that it should work, be used sensibly to deliver important messages which arrive promptly and be addressed to the correct people! Emails should be uncluttered by lots of surrounding dross, especially malware.
The reality is of course all too often the exact opposite. While the internet may have resilience built into its core packet-switching protocols, some of the applications layered over it can be flimsy at times. Email systems need enhancing, at the very least with anti-spam and anti-malware shields and data leak prevention. The major security players have built up considerable expertise in this sector, including Trend Micro, Sophos, McAfee and Symantec.
However, there is no such thing as 100% protection. Given email’s significance and business dependence on it, many could benefit from going further with additional layers of defence. This does not mean a protectionist approach of ‘closing the door’, but better monitoring of communication flow, and critically, an ability to learn, adapt and follow up in real time.
Recognised malicious software should be detected by antivirus scanners and stopped on entry to an organisation, well before delivery to its final destination inbox. But this is a moving feast: new malware appears before scanners are updated, meaning some malicious traffic will already have passed through the secure perimeter.
This is where solutions that take a retrospective view, such as Retarus’ Patient Zero Detection, can add value to the internal flow of email within an organisation. In addition to perimeter protection, it uses an innovative follow-up approach based on post-delivery identification of attacks. This takes advantage of the time lag between an email entering an organisation’s email system and then being accessed. It applies a trace to all incoming messages so that this can be done rapidly, without resorting to searching all inboxes.
Imagine the scenario: an email carrying a new virus passes the barrier protection and reaches inboxes unnoticed. When it is accessed, by one recipient or another, the problem manifests itself…or perhaps not, but the damage is done.
With the Patient Zero Detection approach, each incoming email is given a unique digital fingerprint/hash code so that it can be traced. If an email containing a new virus gets through, but the virus is subsequently identified, recipients can be identified retrospectively. Administrators (and if required, recipients) can be immediately warned and appropriate action taken to prevent further impact.
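The general idea can be sketched in a few lines. This is not Retarus’ actual implementation – the names, the use of SHA-256 and the in-memory log are all assumptions for illustration – but it shows how fingerprinting at delivery time allows recipients to be identified retrospectively without searching inboxes:

```python
import hashlib

delivery_log = {}  # fingerprint -> list of recipients who received the message

def fingerprint(raw_message: bytes) -> str:
    """Give each incoming message a unique digital fingerprint (hash)."""
    return hashlib.sha256(raw_message).hexdigest()

def record_delivery(raw_message: bytes, recipients):
    """Trace every message at delivery time, keyed by its fingerprint."""
    delivery_log.setdefault(fingerprint(raw_message), []).extend(recipients)

def patients_zero(raw_malicious_message: bytes):
    """When a message is later identified as malware, return everyone
    who already received it, without searching any inboxes."""
    return delivery_log.get(fingerprint(raw_malicious_message), [])

# Messages pass the perimeter and are logged on delivery...
record_delivery(b"invoice.doc payload", ["alice@example.com", "bob@example.com"])
record_delivery(b"harmless newsletter", ["carol@example.com"])

# ...and when a scanner update later flags the first message as malware,
# the affected recipients can be warned immediately.
print(patients_zero(b"invoice.doc payload"))  # ['alice@example.com', 'bob@example.com']
```

A real system would of course persist the log, normalise message content before hashing and integrate with the scanners themselves; the sketch only illustrates why the post-delivery lookup is fast and targeted.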
This approach not only supports rapid and targeted responses to potential problems, it also allows for more forensic exploration of the flow of messages – both safe ones and those containing malware. It is not foolproof, and needs to be part of a wider email security strategy. But it does provide an additional layer of support in avoiding the worst possible impacts of malware and user behaviour.
Security strategy – a portfolio of tools and educated users
It is often said that security is everybody’s responsibility. This is true, but everybody also needs as much automated support as possible. This means not only looking at protection and detection, but also remedial action. Tackling all of email’s security challenges requires a range of tools, addressing every stage of the communications flow. That should help ensure that one side of the email ‘sword’ is sharper than the other!
Computer Weekly has now published Quocirca’s buyer’s guide to the General Data Protection Regulation (GDPR), Dealing with data under GDPR. The guide outlines how mid-market organisations can reduce the risk of potentially big fines for mishandling personal data; either by taking themselves out of scope for the regulation or outsourcing the administration and data security requirements.
For those still trying to make sense of it all, here is Quocirca’s whistle-stop tour of the GDPR:
- The GDPR applies to data controllers (organisations that collect and store personal data to support business processes) and data processors (third parties that process data on behalf of data controllers). It applies to any organisation that deals with data regarding EU citizens, whether based in the EU or beyond its borders.
- Personal data is anything that can be used to directly or indirectly identify a person; e.g. names, photos, email addresses, social media posts, medical records and IP addresses (so simply gathering information on devices via an IoT application may bring an organisation into scope).
- The maximum fines are big: the greater of 4% of annual global turnover or €20m. Fines can be levied both for data breaches and for failing to meet administrative requirements.
- Privacy by design and by default must be built in to relevant processes and applications (in other words, the systems that process data must be secure and timely data breach detection capabilities must be in place).
- Data Protection Impact Assessments (DPIA) of the risk to data subjects (the likes of you and me) may be required before personal data is processed, along with bi-annual Data Protection Compliance Reviews.
- Consent for processing must be obtained from data subjects to process their data.
- When data is leaked, there must be timely breach notification to both data subjects and the relevant authorities (in the UK the Information Commissioner’s Office or ICO).
- Data subjects have a right to access their data and for it to be supplied to them in a form that enables data portability.
- Data subjects can request data erasure (the so-called right to be forgotten). This is not an absolute right: there are statutory obligations to keep certain data, and retention is allowed for legitimate research purposes.
- Only organisations that conduct regular and systematic monitoring of data subjects on a large scale need to appoint a Data Protection Officer (DPO).
Quocirca’s buyer’s guide to the GDPR, Dealing with data under GDPR, can be viewed on Computer Weekly at this link: https://www.computerweekly.com/feature/Dealing-with-data-under-GDPR
Has video conferencing peaked? Not really, but video is evolving. This earlier blog outlines why the traditional view of video conferencing might make it appear to have peaked, particularly in the market for expensive and specialised endpoints.
The business use of video, addressing specific needs and applications is however, definitely growing. The sector is evolving into a virtual model, centred around software and services. Video technology has become more affordable, with cameras either cheap accessories or a standard element of most desktop or mobile devices. It is also now benefiting from a cloud or subscription model for service delivery. This makes video connections simpler to access, with scalable adoption so its use can be better aligned to business needs.
Video is shifting from ‘souped up’ phone calls using specialised endpoints to something that can add value to any business process, anywhere. No longer video ‘conferencing’, but video enhanced applications using open standard technology and networks.
Some of this has been catalysed by consumer adoption of video, but enterprise video does not always follow consumer user experiences. Specialised, vertical applications are also increasingly benefiting from the incorporation and integration of video. This is not simply about the endpoints, but the end-to-end integration required to fully exploit visual media in a broader context of available data. The cloud is now fundamental to ensuring that the reach of visual media extends to match the business need.
Video in the wild
Benefits from in-office use cases are one thing, but connected video really comes into its own out in the field. A good example is the body-worn cameras used by emergency services – in particular those used by police forces, such as the Si500 video camera/microphone/radio speaker from Motorola Solutions. This endpoint is similar in concept to action cameras and dash cams, but it is clearly more rugged and is designed to enable officers to collect and stream live rich media.
The endpoint device is smart in itself. However, its real value comes from being used in combination with Motorola Solutions’ cloud-based digital evidence management service, CommandCentral Vault. Visual content is easily gathered and stored in a secure and compliant environment. It can then be immediately accessed and used by all parts of the operational group, wherever they are. It combines a reassuringly simple and robust user experience with the potential for smart use of big data by the organisation. Reassuring too for all who depend on such services for public safety.
While the cost of endpoints and connectivity has fallen there still needs to be a purpose to justify the effort. Holding meetings (conferences) remotely to save travel time and money will only go so far. The real benefits from visual information come from making business processes more effective or efficient and making participants more comfortable, safe or involved. Organisations need to look at video as a strategic consideration, not simply a communications tool.
Video conferencing may not strictly have peaked, but the use of video is entering a new phase.
In some respects, it may have. But before there are panicked responses (or flaming torches and pitchforks) from many across the video vendor sector, let’s look at why this might not be an entirely bad thing.
Around 2011, many industry analysts were predicting strong growth in enterprise video conferencing market revenues, with a consensus of over $5B by 2016. The reality has probably been a bit below that. There has been a dip in total market sales in a couple of the intervening years, and certain video endpoint technologies (telepresence and desktop systems) have not done as well as some hoped or expected.
This does not mean video overall is unpopular. True, it seems to be taking some people a bit longer than anticipated to ‘get over’ being on camera. There also remains a persistent uncertainty that everything will always work smoothly, especially when external video connections are involved. However, as consumers, the younger generation has casually accepted video, the older generation uses it to stay in touch with distant relatives, and those in between are getting accustomed to less formality at work. Together, this should mean that video becomes more widely acceptable as a tool in the workplace.
So why not exponential growth in revenues?
Cost reduction is potentially one reason, but adoption is also being held back – not by lack of bandwidth or usability, but by lack of access and application. However, the market is undergoing a series of changes that will improve matters. The proprietary and sometimes clumsy nature of engaging with video conferencing systems is evaporating fast: interoperability and ease of use have been tackled on the technical front, and subscription or operational pricing models have eased commercial adoption.
Better, smarter, high resolution cameras have become a low-cost desktop accessory and high quality audio is more commonplace. While fully immersive telepresence systems have sold well in certain quarters, investment costs make them exclusive to high end markets and a narrow group of relatively infrequent users. Elsewhere much video technology is moving into everyday, rather than exceptional, use.
However, it is not hardware advances that are driving video's progress, but software, services and end-to-end applications. Increasingly, video capability is being delivered as a cloud-enabled service. This allows it to be accessed and used anywhere, often ‘embedded’ to enhance applications. This is far better than video being seen as a separate medium to be unified with other forms of communication.
This encourages behaviour change
Video is shifting from being focused on specialised endpoints delivering ‘souped up’ phone calls over dedicated network capacity, to being something that can seamlessly add value to any business process, anywhere. This is no longer video ‘conferencing’, but video enhanced applications using open standard technology and networks.
These applications are not simply about establishing a connection or ‘communication’, but reaching a ‘conclusion’. The idea is to get something of value accomplished. This might be a broad or horizontal application, such as ‘better collaboration’, or a vertically aligned one, such as tackling crime.
Better collaboration is not just people simply communicating on a pre-arranged schedule – conferencing – but all participants enjoying ad hoc rich media involvement and being able to focus on and work towards a goal. This implies access to video by anyone, anytime, anywhere on any device. Plus it must be seamless. This can be best accomplished by shifting all the complexity into the cloud and smartly integrating to provide as close to a simple, reliable and zero touch experience as possible.
Many of the traditional business video conferencing players have been moving towards cloud in the last couple of years; other companies have focused there from the start. The recent integration of Blue Jeans Network's onVideo and Primetime products with Workplace by Facebook indicates a strong commitment to video on its enterprise platform, and increases the pervasiveness of video by embedding it in other applications and delivering it as a service.
There are others in the enterprise video software and cloud camp too, such as Vidyo and StarLeaf. And then there is Skype: Microsoft has clearly moved this proposition from a consumer play into enterprise video contention. It is also working more closely with industry stalwarts like Polycom, and prompting integration and interoperability programmes with the likes of Cisco and Lifesize.
Video conferencing may have reached an inflexion point, but video innovation and adoption is showing no signs of slowing. Enterprises should no longer view video as a tool for a set group to do ‘conferencing’ with dedicated equipment. It is something to be delivered anywhere and everywhere, integrated as a service to provide organisational value and individual ease of use.
To deliver this, enterprises should look to the cloud for video, but also look to those who deliver it as a solution to a business problem and not simply a way to add head and shoulders onto a phone call. Video conferencing may not strictly have peaked, but the use of video is entering a new phase.
Over the last quarter century, the Internet has become a fundamental utility that businesses, governments and consumers rely on; being offline is less and less acceptable. And yet a 2017 Quocirca research report, Winning the Domain Game (sponsored by Neustar), shows that 72% of UK businesses face internet downtime regularly or occasionally, and 61% suffer performance problems.
The problems blamed for this range from server downtime to DDoS attacks, with around one third citing the domain name system (DNS) for at least some of their internet access woes: DNS itself suffers from downtime, attacks and other inefficiencies. DNS is the Internet’s own fundamental utility, linking users with online resources by translating meaningful names (e.g. bbc.co.uk) into hard-to-remember internet protocol addresses (e.g. 22.214.171.124).
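To make the name-to-address mapping concrete, here is a minimal illustrative sketch in Python (not from the report): a resolver that consults a local cache first – standing in for a hosts file or resolver cache – and only falls back to a live query via the system resolver on a miss. The cache entry reuses the example name and address from above; the cache itself is an assumption for illustration.

```python
import socket

# A plain dict standing in for a local resolver cache or hosts file.
# (Illustrative only; real caches also track TTLs and record types.)
LOCAL_CACHE = {"bbc.co.uk": "22.214.171.124"}

def resolve(name: str) -> str:
    """Return an IPv4 address for `name`, preferring the local cache."""
    if name in LOCAL_CACHE:
        # Cache hit: no network round trip needed.
        return LOCAL_CACHE[name]
    # Cache miss: ask the operating system's resolver, which issues
    # a real DNS query over the network.
    return socket.gethostbyname(name)

print(resolve("bbc.co.uk"))  # served from the cache, no network used
```

The cache-first structure mirrors the point made below about multiple resolution paths: extra layers can improve availability, but each one is another place where visibility can be lost.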
DNS problems are probably worse than these figures suggest. By its very nature, DNS is invisible to users, so the role it plays in impeding internet access may go unreported. That users do not recognise DNS issues is unsurprising, but IT managers are also likely to overlook them; 55% report poor visibility into at least one aspect of DNS management.
This is partly due to a lack of tools (the majority lack many DNS management capabilities), but also due to the complex way in which DNS services are provisioned. More than three quarters of organisations have five or more different ways of accessing DNS, ranging from in-house servers to internet registrars and service providers. Correlations within Quocirca’s research show that they tolerate this for a reason: having multiple paths to DNS improves availability. However, it also degrades overall internet performance.
Due to the varied needs of users and profusion of online services, few organisations expect to end up with a single management point for all their DNS needs. However, those that have committed to a specialist DNS service provider reduce DNS complexity. This has a big impact, improving visibility in all areas and providing access to a wide range of value-added DNS features, ranging from the ability to route internet traffic to blocking unwanted content.
No organisation can manage long without reliable internet access, so it follows that reliable DNS services are needed too. Poor management of the latter is likely to be responsible for problems with the former more often than is currently understood.
Quocirca’s report, Winning the Domain Game, is free to download.
It used to be the Hadoop Summit, but the strategic focus at Hortonworks, the enterprise-ready open source Apache Hadoop provider, has evolved, so this year the event was renamed the DataWorks Summit. The company now encompasses data at rest (the Hortonworks Data Platform, now in version 2.6), data in motion (Hortonworks DataFlow) and data in the cloud (Hortonworks Data Cloud). Hortonworks aims to become a multi-platform and multi-cloud company, with a focus on the data in data-driven organisations. Just a few years ago Hortonworks connected with IT architects; today it is launching conversations with lines of business and chief marketing officers.
Since its launch in 2011, backed by Yahoo, Hortonworks has grown to over a thousand employees in 15 countries, with customers in sixty countries. Its European presence is run out of the UK, with sales staff across Northern and Central Europe. It is a young organisation with many newly graduated employees, strong on technology but lacking business domain insights. Many have maintained their links with universities to address big data and IoT issues, and Hortonworks is involved in several joint R&D projects, in what Hortonworks co-founder Owen O’Malley terms the ‘community over code’ approach. One such project is the Digitisation of Energy, which aims to connect 1 million electric car batteries to the grid to act as a sustainable energy reservoir.
Where’s the money coming from?
Sustained strong growth still evades Hortonworks. In response, it is shifting its product focus from selling converged Hadoop systems to IT departments to selling data platforms to lines of business. Of its two main competitors, MapR remains a VC-backed private company, while Cloudera is in the IPO funnel, touting its hybrid open source software (HOSS) model, which ties open source elements to proprietary software in its enterprise-grade platform. Hortonworks may therefore be tempted to add more proprietary elements to its open source Hadoop platforms to increase profitability.
Critical to maintaining an open source focus are the fast-expanding fields of artificial intelligence and machine learning. Hortonworks is investing heavily in developing open source code, and sees significant revenue opportunities across all business verticals. This is exemplified by its Hadoop data lake developments, which encompass data analytics, mobility and IoT using the Hadoop Distributed File System (HDFS) and persistent memory data structures. With increasing legal requirements for data to reside in specific geo-locations, computing must come to the data. This requires data tiering across ‘hot’, ‘warm’ and ‘cold’ storage to optimise local computing power requirements.
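The hot/warm/cold tiering idea can be sketched as a simple policy that picks a tier from the age of a dataset's last access. The sketch below is illustrative only: the thresholds and tier mappings are assumptions, not HDFS defaults, though HDFS does expose comparable storage policies (such as HOT, WARM and COLD) for placing blocks on different storage media.

```python
from datetime import datetime, timedelta

# Illustrative thresholds (assumed, not HDFS defaults): data accessed
# within a week is 'hot', within three months 'warm', otherwise 'cold'.
HOT_WINDOW = timedelta(days=7)
WARM_WINDOW = timedelta(days=90)

def storage_tier(last_access: datetime, now: datetime) -> str:
    """Pick a storage tier based on how long ago the data was accessed."""
    age = now - last_access
    if age <= HOT_WINDOW:
        return "hot"    # e.g. SSD, close to local compute
    if age <= WARM_WINDOW:
        return "warm"   # e.g. local spinning disk
    return "cold"       # e.g. archival or object storage

now = datetime(2017, 6, 1)
print(storage_tier(datetime(2017, 5, 30), now))  # recently used -> hot
print(storage_tier(datetime(2017, 1, 1), now))   # months old -> cold
```

Keeping only the hot tier on expensive, compute-adjacent storage is what lets local computing power stay modest even as total data volumes grow.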
Partners on the Hortonworks Data Platform include IBM, HPE, Dell EMC, Pivotal, Teradata and Microsoft. Data Flow partners have not been named yet, but several major carriers are Hortonworks customers and may soon become partners – especially if the Federal Communications Commission under Trump abandons its net neutrality stance and allows carriers to offer different Internet QoS (quality of service) levels, in which case Hortonworks will help them develop differentiated services for their customers. Data Cloud partners are the two majors, AWS and Microsoft Azure. Hortonworks also has domain expertise alliances with Accenture, Capgemini and Deloitte to roll out industry-wide IoT and cyber security offerings.
Where’s the future for Hortonworks?
Hybrid cloud, IoT, hyper-convergence, big data and AI all point to massive data accumulation and the need for mobile and multi-tier data processing. These are all areas where Hortonworks is active. This was exemplified by an automotive case study. Mercedes, a front-runner in the automotive market, operates with five levels of development, from yesterday’s ‘assisted driving’ to today’s ‘partially automated’ and tomorrow’s ‘conditional automation’. Then follows ‘high automation’ in 2021, and finally ‘full automation’ in 2025. Today’s top-of-the-line cars generate around 500GB of data per day. In ‘full automation’ mode, data volumes will go up to 50TB a day. That requires intelligence at the edge and real-time hand-off to cloud computing processes.
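A quick back-of-the-envelope calculation shows why intelligence at the edge matters: moving 50TB a day to the cloud implies a multi-gigabit sustained link per car. The sketch below works this out from the figures quoted above (decimal units assumed: 1 GB = 10^9 bytes, 1 TB = 10^12 bytes).

```python
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

def sustained_gbit_per_s(bytes_per_day: float) -> float:
    """Average bandwidth, in gigabits per second, needed to move
    that many bytes off a vehicle every day."""
    return bytes_per_day * 8 / SECONDS_PER_DAY / 1e9

today = sustained_gbit_per_s(500e9)    # 500 GB/day, today's top-end cars
future = sustained_gbit_per_s(50e12)   # 50 TB/day, 'full automation'

print(f"today:  {today:.3f} Gbit/s")   # roughly 0.046 Gbit/s
print(f"future: {future:.2f} Gbit/s")  # roughly 4.6 Gbit/s
```

A sustained ~4.6 Gbit/s uplink per vehicle is not realistic over mobile networks, which is why most of that data must be filtered and processed at the edge, with only the distilled results handed off to the cloud in real time.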
Hortonworks wants to be on that journey, not just with the automotive industry, but across many other verticals. The company believes that only open source can evolve fast enough and create the standards needed to keep up with the data frenzy.