Quocirca Insights


March 6, 2018  3:54 PM

DevOps – Pragmatic production, not evangelism

Rob Bamforth

Despite the potential of ‘digital transformation’ and IT in general, many organisations find the reality a little disappointing. Development takes longer than expected, quality is lacking and what is delivered often fails to match the business requirements. Even with continuous, rapid innovation and evolution of technology, and with faster, cheaper and more readily available hardware, the software process still seems to struggle to deliver.

There have been many attempts to improve matters, from computer aided software engineering (CASE) tools and methodologies to object orientation, 4th generation languages and tools for ‘citizen development’. So far, there have been no silver bullets in tools, methods or superhero developers. One problem is that the software challenge has expanded and become increasingly monolithic. Roles have become specialised and the process structured, typically around a traditional engineering model with well-defined stages. Fine in principle, but inflexible and slow to adapt.

A break with the model was inevitable, and the Agile ‘manifesto’ shifted the emphasis (typically) towards fragmentation, the blurring of teams and speed. It favours: individuals and interactions over processes and tools; working software over comprehensive documentation; customer collaboration over contract negotiation; and responding to change over following a plan.

Agile often leads in turn to more fluid and continuous development and deployment processes, generically termed DevOps. Given how Agile and DevOps are sometimes presented, with the enthusiasm of an alternative programming lifestyle, it would be easy to believe they require some sort of cult following.

Certainly, the word ‘culture’ or the phrase ‘cultural shift’ occurs very frequently when talking to enthusiasts. So much so that many organisations become nervous of the almost revolutionary zeal and wonder how they will cope with adopting such apparently wholesale change.

While there are significant differences in its approach, DevOps should not be seen as an all-or-nothing dangerously disruptive force. It requires a certain way of thinking and organising teams, but this does not have to change and challenge the entire organisation. It might, over time, have more broad-reaching impacts, but these are based on its results, not its initiation.

Adopting a DevOps strategy

The primary requirement is commitment. Management must be bought into the process, and set out the strategic vision and goals. The starting point is – what are we trying to achieve?

While there are always clear end benefits to aim for – increase revenue, reduce costs, mitigate risk – it is the intermediate drivers that shape and derive value from DevOps. Being more responsive to customers, improving software quality, delivering services that meet business needs. The focus for DevOps adoption has to be decided upfront.

Some tasks are far better suited to the potential benefits of DevOps than others. Well-established software and systems that record and manage well-defined data requirements are unlikely to benefit the most. A better place to start would be aspects of customer or external engagement where changes in market, social, political or personal context open up opportunities for innovation. The questions are, what will be required and does anybody have any clear views yet?

So, next is an acceptance that this will require some experimentation. Some may call this ‘fail fast’, but a much better way to view it is ‘learn fast’. The experimentation is being done for a purpose based on the strategy. Small steps are taken, tested, refined and retried in pursuit of the goal.

Doing this requires a tightly-coupled effort based around different skills: design, programming, quality assurance, build, deployment and customer feedback. Rather than distinct stages or disparate teams, these are provided by close collaborators aiming to simplify and streamline the process.

Pragmatic DevOps

The culture for accomplishing this does not need to pervade the entire organisation, nor does it have to be applied to every challenge. DevOps has to be done with serious intent and commitment – from the top as well as from those directly involved – but for one simple reason: to address a problem hotspot and get the results that the strategy was put in place to achieve.

The reality is that those who put DevOps in place for sound reasons do enjoy the benefits. The frequency of software delivery for assessment by the user is increased and ‘good enough’ initial solutions can be honed to be better. Plus, with direct feedback the whole team feels and takes on responsibility for quality.

DevOps is not a panacea or a cult requiring blind commitment, but nor is it a passing fad. Used pragmatically, it moves software development away from either exacting engineering or casual coding towards poised production at pace. Getting better business value from software should be what ‘digital transformation’ is all about. Find the right project, set suitable goals and give DevOps a go. Who knows – your business culture might change once it has benefited from the results.

February 20, 2018  11:21 AM

The road to hybrid cloud

Clive Longbottom

[Image: front cover of the BCS Evolution of Cloud Computing book]

There is little argument that cloud is having a major impact on how organisations design, provision and operate their IT platforms.  However, there still seem to be major arguments as to what an overall cloud platform actually should be.

For example, should the end result be 100% private cloud, 100% public cloud or something in between? Where do existing systems running on physical one-server or clustered systems fit in to this model? Should any private cloud be housed in an owned facility, in a colocation facility or operated by a third party on their equipment? Who should be the main bet when it comes to public cloud services – and how should these services be integrated together, either with other public cloud services, or with existing functions running on owned platforms? How about the cost models lying behind different approaches – which one will work best not only at an overall organisation level, but at a more granular, individual workload level?

Once a basic platform has been decided upon, then the fun starts. How should data be secured, both from an organisation’s intellectual property point of view, and also from a legal point of view when it comes to areas such as GDPR? How do organisations make sure that users are provided with the best means of accessing functions without having to remember a multitude of different usernames and passwords, through the use of single sign-on (SSO) systems?

How can data leakage/loss prevention (DLP) and digital rights management (DRM) help to ensure that information is secure no matter where it resides – not only when the organisation has direct control of it on its own network, but also as that information passes from its control to other parts of the hybrid cloud and to partners, suppliers and customers on completely separate networks? It is also important to look at how other security approaches, such as encryption of data at rest and on the move, along with hardware, application and database security fit in with this.
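
As a small, hedged illustration of just one of these layers, the sketch below shows symmetric encryption of a file at rest using the widely available Python cryptography library. It is a minimal sketch only: key management, DLP and DRM policy enforcement sit above this layer, and the file names are purely illustrative.

```python
# Minimal sketch: encrypting a file at rest with the Python 'cryptography'
# library (Fernet authenticated encryption). Key management, DLP and DRM
# policy sit above this layer; file names are illustrative only.
from cryptography.fernet import Fernet

def encrypt_file(plain_path: str, cipher_path: str, key: bytes) -> None:
    f = Fernet(key)
    with open(plain_path, "rb") as src:
        token = f.encrypt(src.read())      # authenticated encryption
    with open(cipher_path, "wb") as dst:
        dst.write(token)

def decrypt_file(cipher_path: str, key: bytes) -> bytes:
    f = Fernet(key)
    with open(cipher_path, "rb") as src:
        return f.decrypt(src.read())       # raises InvalidToken if tampered with

if __name__ == "__main__":
    key = Fernet.generate_key()            # in practice, held in a key vault
    encrypt_file("customer_data.csv", "customer_data.csv.enc", key)
    print(decrypt_file("customer_data.csv.enc", key)[:80])
```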

Once a cloud platform is in place, it can lead to a completely different approach in how functions are created, provisioned and managed.  Many organisations have found that a hybrid cloud is an ideal platform to support and make the most of DevOps, with continuous development and continuous delivery being far easier than using cascade/waterfall project approaches against a less flexible physical or virtualised platform.

However, effective DevOps requires effective orchestration – tooling has to be chosen carefully so that functions are packaged, provisioned and monitored in a contextually aware manner, and so that the overall platform maintains the performance and capabilities the organisation demands.

This then brings in a need for better technical approaches in how functions are packaged:  the rise of microservices provided via containers is taking over from large, monolithic packages and virtual machines (VMs).

Systems management becomes a different beast: not only do IT staff need to monitor and maintain functions that they own within their fully managed private cloud environment, but they also must monitor, in a contextually aware manner, those functions that are running in a public cloud environment. Ensuring that root causes are rapidly identified and that remediation can be carried out in real time, with workloads being replatformed and redirected as required, is a key requirement of a hybrid cloud platform.

A key area that many have struggled with is that, although they know at a technical level that cloud is a better way forward, they have found it difficult to sell the idea to the organisation in ways that the business can understand. However, a simple approach known as a Total Value Proposition provides technical people with a better means of getting their point across to the business – and so acquiring the funding required to implement such a major change in technical platform.

Hybrid cloud is the future.  No matter where you are in your journey to this, there are many pitfalls to avoid, and many areas of additional possible value that are too easy to miss out on.

A new BCS book, “Evolution of Cloud Computing: How to plan for change” is available that covers all these areas – and more.

February 13, 2018  9:12 PM

Baited breadth – the rise of phishing

Rob Bamforth

Reviews of organisational security can be approached in many positive ways, but all too often they are met with trepidation or resignation. The rise of phishing, where spoof, but increasingly credible, messages try to obtain sensitive information, is a particularly troublesome challenge. It exploits one of the weakest links in digital security – people – by getting them to click on something they should not.

The latest (4th) annual State of the Phish report from Wombat Security Technologies is an interesting overview of the issue. This article quotes some of its data. It is based on feedback from infosec professionals, technology end users and Wombat’s customers. The report outlines the breadth and impact of phishing and what organisations might try to do to address it.

The scale of the challenge

The scale of the issue is staggering. Around half of organisations are seeing an increase in the rate of attacks, and the approaches are diversifying. Over half are experiencing spear phishing – targeting specific individuals, roles or organisations – and 45% are experiencing phishing by phone calls or text messages (‘vishing’ and ‘smishing’).

To combat this, companies can go beyond basic training courses and awareness campaigns, and simulate actual phishing attacks to see how employees behave and how they might respond to specific phishing styles. Those in the survey typically use four different styles of email templates to assess how end users react (a simple sketch of how the results might be tallied follows the list):

  • Cloud emails, relating to accessing documents from cloud storage or using other cloud services.
  • Commercial emails such as simulated shipping confirmations or wire transfer requests.
  • Corporate emails, which look like the sort of messages normally circulated internally such as ‘memos’ from IT or HR departments.
  • Consumer emails related to social media, retail offers or bonuses and gift card notifications.
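
As an illustration of how such simulation campaigns are typically tracked, the sketch below tallies click rates per template category so that repeat campaigns can be compared over time. The data and field names are entirely hypothetical and are not Wombat's actual schema.

```python
# Hypothetical sketch: tallying simulated-phishing results per template
# category (cloud, commercial, corporate, consumer). All data and field
# names are invented for illustration.
from collections import defaultdict

results = [
    {"category": "corporate",  "template": "building evacuation plan",  "sent": 200, "clicked": 196},
    {"category": "corporate",  "template": "database password reset",   "sent": 180, "clicked": 176},
    {"category": "consumer",   "template": "online shopping security",  "sent": 220, "clicked": 90},
    {"category": "cloud",      "template": "shared document",           "sent": 210, "clicked": 55},
    {"category": "commercial", "template": "shipping confirmation",     "sent": 190, "clicked": 70},
]

totals = defaultdict(lambda: {"sent": 0, "clicked": 0})
for r in results:
    totals[r["category"]]["sent"] += r["sent"]
    totals[r["category"]]["clicked"] += r["clicked"]

for category, t in sorted(totals.items()):
    rate = 100.0 * t["clicked"] / t["sent"]
    print(f"{category:10s} click rate: {rate:5.1f}%")
```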

Although individuals are starting to get wise to the issue and average click rates have fallen from 2016 to 2017, sophisticated attacks can be very effective. Two particular simulated corporate templates had almost 100% click rate – one pretending to include an updated building evacuation plan, the other a database password reset alert. Other high deceivers were messages about corporate email improvements or corporate voicemails from an unknown caller. The only high rated consumer attack was about online shopping security updates.

The impact of phishing attacks seems to be growing, or is at least being more widely recognised, as it weighs heavily on IT professionals. Almost half of organisations noted malware infection as a consequence of phishing. Compromised accounts and loss of data were the other most significant responses. It was also noted that phishing causes a greater burden on IT in terms of, for example, helpdesk calls, as well as potentially more far-reaching business consequences in terms of cost, time and disruption.

Addressing the issue

The key to mitigating the risk is to change end user behaviour. This is much more than simply making users aware of the consequences, although many organisations do have severe punishments. ‘Repeat offenders’ will face counselling in three quarters of organisations, removal of access to systems in around a quarter and, in one in ten organisations, possible dismissal. (The research was conducted in the US, UK and Germany, and employment regulations may differ significantly.)

As phishing seems not only to be inevitable (around three quarters of organisations say they are experiencing phishing attacks), but also very damaging, the most pragmatic approach would be to tackle the issue head on. Better to avoid any of those unfortunate consequences and address the problem at source by highly targeted training. This would benefit enormously from ongoing reinforcement since both the threats and the workforces having to deal with them will change over time.

This is where training using regular simulated attacks appears to help greatly. The majority of organisations are doing this very frequently, with 40% training quarterly and 35% monthly. Why? Well, an increasing number, now over three quarters of organisations, measure their own susceptibility to phishing attacks. Over half say they have been able to quantify a reduction based on their training activities. Given those metrics and the growing risks and potentially significant business disruption and impact, who wouldn’t do more to avoid being caught in the phishing net?

January 31, 2018  6:27 PM

Immersive Reality – a business opportunity

Rob Bamforth
augmented reality, Virtual Reality

This is the third in a series of articles exploring the challenges and opportunities facing the audio visual (AV) industry, which has its annual European flagship event, ISE2018, in February. This article looks at how innovative and immersive technologies can improve the user experience.

A deeper user experience?

Technology companies used to talk about ‘intuitive’ interfaces and ease of use as if this was an end in itself. It is the overall outcome that really matters. Many still spend too much time and effort making individual elements ‘easier to use’, without improving the user experience and the effectiveness of the entire process.

The result is that individuals have dozens of separately ‘easy to use’ applications and businesses have too many disconnected and siloed repositories of data. This leads to poor use of information; data is incomplete, inaccurate or delivered late from a process perspective. From an individual point of view users feel overloaded and overburdened.

The relationship between data and those who have to use it needs to change. Users need more effective tools to deal with the mass of data they face. Recent innovations in visualisation offer some welcome opportunities.

Mixing Realities

Immersive visual technologies have always generated a lot of hype. While much of the innovation is new and disruptive, many elements have been around for some time. The difference now is that compute power, storage and network capabilities have grown to a level that delivers a high quality experience and removes disconcerting delays. There are several key immersive technologies:

  • Virtual Reality (VR) is probably the best known and longest established immersive experience. It is also perhaps the most intrusive through the use of enclosed headsets, which have only in recent years dropped in size and cost. VR replaces the real world with an entirely digital one.
  • 360 (360 degree imagery) has become possible from advances in both camera and drone technologies. The experience is immersion in either images or videos previously captured in 360 degrees and can be displayed stereoscopically if desired. Headsets are not always required as handheld mobile devices can be almost as effective.
  • Augmented Reality (AR) has moved from sci-fi film screens into people’s hands and became widely popularised by the Pokemon Go game. AR overlays digitally created content onto the real world. It can work using live camera feeds on mobile device displays, as images overlaid onto smart glasses or even as projected holograms. The flexibility and lower levels of intrusion mean AR is a great fit for many application areas.
  • Mixed Reality (MR) is often confused or conjoined with AR. The differences are slight, but important. MR offers a more seamless experience where digital content is not simply overlaid, but it can also react to, and interact with, the real world. There is more sensor and input interpretation used in order to make this happen.


An opportunity being partially addressed

All immersive technologies are being well marketed. Numbers of specialised immersive devices such as headsets are growing, but remain small relative to other areas of visual technology. The total number of VR headsets sold in 2017 was around 12 million units. Similar numbers again were probably shipped as low-cost cardboard headsets, which still offer an immersive experience. Entertainment and gaming play a big part, and the installed base of Sony Playstation VR headsets has passed 2 million.

As an opportunity, this is not something that the AV industry will have to itself. While the early VR industry from the 1990s was very specialised and thus fraught with investment challenges, the current immersive sector is much more mainstream, with significant investments from major IT vendors.

Microsoft launched its HoloLens in 2015 as a gaming oriented consumer product. Now it is businesses that have picked up what is widely regarded as a mixed reality headset. It is even certified for use as basic protective eyewear and there is a hardhat accessory currently in production.

Google’s approach has been more oriented towards VR and AR, but it has been promoting business use in an assistive way. It prefixes typical user experience scenarios with the term “Help me”. These are oriented towards helping the user to learn, create, operate or sell. As an approach this is very pragmatic, focusing much more on the outcome and less on the technology. Google has also been a strong advocate of ‘immersion for all’ through the promotion of cardboard headsets.

Apple’s executives reportedly held meetings with AR device parts suppliers at CES2018 and may be planning to follow up with an update to its ARKit software later this year. If hardware, such as an AR headset, were to arrive next year that would not be a massive surprise. It would also have a big impact, but Apple tends to delay or even cancel products if it does not feel they are ready. Its focus on software and content will in any event give a boost to the AR ecosystem.

Making an immersive experience acceptable and enjoyable

The broad market drive is good, but many technologies flounder through insufficient focus on the user. This is where the experience of the AV sector could be vital. If there is lag or display inconsistencies (such as for people who wear glasses), or even a tendency to cause motion sickness, then most users will find the experience unacceptable, except for short periods of usage. Some companies, such as MagicLeap, are trying to reduce the impact. Its Lightwear goggles promise a more natural and comfortable overlay of digital and physical worlds. It is not yet known how these approaches will be adopted for business use.

The delivery of visually stunning performance on large screen technology is something very familiar to the AV sector, and some have put immersive technologies around users in the physical world. Barco’s multi-sided cave display provides an immersive environment to fully surround people so they can stand inside a VR or 360 space. Realfiction has taken this open immersion a step further with its DeepFrame augmentation of reality through holographic projection. This removes the need for specialised eyewear or goggles to deliver visual information in a way that sparks viewer engagement.

Delivering on outcomes

In many cases visual appeal will not be enough. Achieving business goals from engaging or immersive access to visually stimulating information will often depend upon what can then be done with the information and, perhaps most importantly, with whom. What is required is a shared immersive experience that permits not only access to information but also effective collaborative tools for interaction across a disparate group or team.

It is also vitally important for collaboration technology to enable the sharing of information across multiple devices. Teams bring laptops, tablets and smartphones to meetings to take notes and to reference work in real-time. With immersive collaboration tools, such as Oblong’s Mezzanine, teams can share data instantly for everyone to see and understand.

The most sophisticated and productive collaboration rooms provide participants with the means to share data simultaneously, utilise the most intuitive controls like gesture and/or touch, and instantly access information to make collective decisions. When collaborators share a united workspace, regardless of their location – meetings are more engaging, enlightening and immersive.

These are the outcomes that most businesses are looking to achieve. Better use of data, more effective teams. To get more insight into how different immersive technologies are being applied to improve both user experiences and business outcomes, by a significant number of innovators across the AV sector, visit ISE2018 in Amsterdam in February.

January 15, 2018  12:04 PM

The Express Route to a hybrid Azure platform

Clive Longbottom

Cloud computing is promising much – but is failing in many areas as users get to grips with some of its more complex areas.  An example here is when organisations start to look at how best to use multiple cloud platforms across a private and public environment – what is known as a ‘hybrid cloud’.

In essence, such a cloud sounds easy:  an organisation maintains certain workloads on its own equipment, using public cloud where and when suitable to meet the needs of a process.

However, the devil is in the detail. The first problem comes when the choice of technical cloud platform has to be made. Using different cloud technologies can make workload and data mobility difficult. For example, using an OpenStack private cloud and a Microsoft Azure public cloud means that compromises must be made in certain areas.

However, Microsoft has now addressed this with the launching of Azure Stack – a highly engineered hardware/software system that can be made available to organisations to create a private cloud that is highly compatible with the Azure Public Cloud.

Great – a major step forward. However, the next problem comes with performance. Having an Azure Stack instance in an organisation’s own datacentre and then trying to connect through to the Azure Public Cloud means that data (and, in many cases, application workloads) has to traverse a wide area network (WAN) connection.

If this is a public internet connection, then performance will generally be bounded by a ‘best efforts’ constraint.  If it is a dedicated link, then costs may be a problem.

Again, Microsoft has tried to address this, introducing its own dedicated connections into the Azure Public Cloud.  These connections are offered under the ExpressRoute banner, and provide ultra-low latency, high bandwidth paths into Microsoft’s cloud facilities.

Wonderful – except that these connections do not terminate in a general customer’s premises.  To achieve suitable scalability within cost constraints, Microsoft has struck deals with other facility providers to offer points of presence (PoPs) for ExpressRoute terminations.  End customers then need to strike a deal with these PoP providers, who will then provide dedicated connections using quality of service technologies to connect into the end customer’s facility – so there is still the issue of this last link that must be monitored and managed outside of the Microsoft environment.

Such complexity is hard to avoid, but can lead to problems for those aiming for a seamless logical hybrid cloud platform. Luckily, there is one way around all of this: the use of a colocation provider that is also an ExpressRoute termination point.

Here, the end customer takes space within the colocation facility and places their Azure Stack equipment within it.  Using intra-facility connectivity speeds, they then connect through to the ExpressRoute environment, giving an Azure Stack/Azure Public Cloud experience that is essentially one consistently performing platform.
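
To illustrate why the colocated option tends to feel like one platform, a rough round-trip comparison is sketched below. All of the latency figures are illustrative assumptions, not measured values, and the real balance will depend on the last-mile link and the distance to the Azure region.

```python
# Rough, illustrative comparison of round-trip latency for an Azure Stack
# instance reaching the Azure Public Cloud via (a) a managed last-mile link
# from an owned datacentre to an ExpressRoute PoP and (b) an intra-facility
# cross-connect when the Azure Stack hardware is colocated at the PoP.
# All figures below are assumptions for illustration only.
last_mile_ms     = 15.0   # assumed metro WAN link, own datacentre to the PoP
expressroute_ms  = 2.0    # assumed PoP-to-Azure-region segment
cross_connect_ms = 0.5    # assumed intra-facility cross-connect

own_dc_rtt    = 2 * (last_mile_ms + expressroute_ms)
colocated_rtt = 2 * (cross_connect_ms + expressroute_ms)

print(f"Own datacentre via the PoP : {own_dc_rtt:5.1f} ms round trip")
print(f"Colocated at the PoP       : {colocated_rtt:5.1f} ms round trip")
```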

By choosing the colocation provider carefully, it can be possible to further open up the options around connectivity. Good colocation providers will not only have dedicated, high performance networks underpinning connectivity between their own and other facilities, but will also offer direct connectivity into the main public clouds.

By ensuring that a colocation provider has the right portfolio of services, end user organisations can ensure that they maintain flexibility over how they continue their journey into a fully-functional hybrid cloud, and can make the most of the opportunities that such a platform can provide.

Quocirca has written a short report, commissioned by NGD, on how colocated Azure Stack using ExpressRoute can provide the best option for organisations wanting to utilise a hybrid cloud. It can be downloaded for free here.

January 11, 2018  11:53 AM

Augmenting audio visual with intelligence and automation

Rob Bamforth
Collaboration, wearable

This is the second in a series of articles exploring the challenges and opportunities facing the audio visual (AV) industry, which has its annual European flagship event, ISE2018, in February. This article looks at how artificial intelligence (AI) and sensor data driven automation might transform the use and perceptions of AV.

Increasing complexity

IT connected AV systems have become pervasive. Low cost flat panel displays are replacing projection and can be placed anywhere that workers or customers might congregate. Many already are, or will have to be, connected to the network. Screen sharing, remote participants, unified communications and video conferencing tools are becoming more widely deployed. Organisations are expecting productive collaboration and individuals are increasingly expecting to be on camera and sharing data with colleagues. They will, however, still quickly lose confidence after a bad experience.

AV technology is now as sophisticated as anything else on the IT network, but there remain some fundamental usage challenges. Cabling difficulties or getting a screen or video collaboration system working should no longer be an issue, but total system complexity might be. This raises the question: could intelligence embedded in the AV systems themselves make for simpler and more effective usage?

Intelligent audio visual

Keeping meeting room and display technology under control and working effectively is increasingly a complex IT task, with some asset management challenges thrown in. However, few organisations would be looking to deploy more people just to help support or augment this, despite potential user frustrations from un-integrated or unusable expensive AV displays and equipment. Artificial and automated intelligence needs to be applied.

Automation is already playing increasingly important roles in other areas of connectivity. Networks are becoming smarter with software defined approaches allowing for the intelligent centralisation of control with distribution of impact or power to the edge. Sensors and the internet of things (IoT) are gathering masses of data available for use for machine learning and automation.

The combination of smart edge devices and smart networks means that once-manual processes can now be intelligently automated. For AV, this can be applied to enhance the already important user experience to new levels.

Joined up meetings

Since a large element of AV is about supporting people conveying information to other people, an obvious area to apply AV intelligence is meetings. Many organisations will find that much of the information shared and discussed during meetings is lost or forgotten. Collaboration tools and repositories help, but only if everyone is sufficiently aware and disciplined to remember to use them.

This challenge can be addressed with a bit of joined up thinking. For example, Ricoh and IBM have together created the Intelligent Workplace Solution. Rather than just providing an interactive whiteboard, it includes IBM’s Watson as an active participant capturing meeting notes and action items, including side conversations that might otherwise be missed. It logs attendance using smart badges, allowing participants to keep track of the meeting content, and any participant, whether present or located remotely, can easily control what is on the screen through simple voice commands.

The intelligence can also be used to augment other aspects of more complex meetings. For example, the system can translate speakers’ words into several other languages. These can then be displayed live on screen or in a transcript. Applying automation to integrate the visual, audio and network elements in this way improves the experience for the participants. It also makes the overall meeting process much more efficient.

Smarter command and control

In many mission critical settings AV is already widely used to view live content. This may come from any number of cameras, industrial sources, social media feeds and computer visualisations. Videowalls or large screens complemented by smaller displays are often used to allow large numbers of information feeds to be monitored simultaneously. Increasingly, this centralised command and control approach is becoming a constraint rather than an ideal solution.

There is a risk from having a single location and a single point of failure, but getting all the right individuals to one place to see and absorb the information is also a challenge. Expertise may be widely spread; many employees will choose to work remotely or need to be mobile, and so the control ‘centre’ needs to be distributed.

This more sophisticated model for command and control requires more intelligence in the AV network. Information must be shared with those who need to see it, but too much information could easily overload networks. This is especially true given that video content quality has advanced through high definition (HD) and is now increasingly 4K. Information needs to be shared intelligently.

Higher definition makes it possible to apply advanced recognition systems to the higher quality images. Intelligent analytics can be applied to do more automated monitoring. This means that the total volume of data does not need to be shared and manually monitored. Smart applications of this type of technology will make command and control systems more effective. They will also allow more worker flexibility and ensure that individuals (and networks) are not overwhelmed by unnecessary data.
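
A simple capacity check shows why distributing every raw feed does not scale. The sketch below compares the aggregate bandwidth of pushing all camera feeds to every viewing location against sharing only the streams flagged by analytics; the bitrates are rough, illustrative ballparks rather than measured figures.

```python
# Back-of-envelope check: aggregate bandwidth of distributing every camera
# feed versus sharing only analytics-flagged streams. Bitrates are rough,
# illustrative ballparks for compressed streams, not measurements.
FEED_MBPS = {"HD_1080p": 8.0, "UHD_4K": 25.0}

def aggregate_mbps(num_feeds: int, profile: str) -> float:
    """Total bandwidth needed to deliver num_feeds streams of a given profile."""
    return num_feeds * FEED_MBPS[profile]

all_feeds     = aggregate_mbps(60, "UHD_4K")   # push everything to every operator
flagged_feeds = aggregate_mbps(4, "UHD_4K")    # only streams flagged by analytics

print(f"All 60 x 4K feeds : {all_feeds:7.1f} Mbit/s per viewing location")
print(f"4 flagged 4K feeds: {flagged_feeds:7.1f} Mbit/s per viewing location")
```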

Wearable and IoT data

Technology innovations, such as sensors, IoT and wearable devices will increasingly impact on the AV world. This will add further intelligence and live data feeds to systems already dealing with masses of video and audio content. Coping with the vast array of sources, let alone the levels of data will be an increasing challenge.

As the data variety and volumes increase, intelligent systems are required to automatically capture and analyse this ‘big data’ live. Then, human operators can react quickly and appropriately. For example, combining conventional surveillance cameras or audio capture systems with machine learning or artificial intelligence systems will help to automatically detect anomalous data or abnormal situations.

Human operators, no matter how good or well-trained, become tired or lose focus over time, especially if the task is monotonous. However, augmenting their decision making with automated systems will improve their responses. This is applicable to issues such as security, but also to any type of application where change monitoring is required. This could be changes in an industrial process indicating failure or a drop in quality, or patient vital signs in healthcare. New data sources and AV need to be well integrated within the overall system in order to fully realise the benefits.

Automatic for the people

In all situations, the ability to rapidly visualise and comprehend the implications of changes or trends in masses of data sources highlights how critical AV can be to the user experience and how it needs to be integrated into a broader IT systems approach. Many organisations are already evaluating how AI and IoT can augment and automate some of their business processes. Many of these processes now rely heavily on AV infrastructure and especially the use of video. It will become increasingly important to think about this holistically as an AV/IT integrated architecture, and not one where the visual elements and displays are added as a potentially expensive afterthought. To get more insight into how intelligence is being applied in many different ways to the AV sector, visit ISE2018 in Amsterdam in February.

December 13, 2017  3:52 PM

The digital transformation of AV infrastructure

Rob Bamforth

This is the first in a series of articles exploring the challenges and opportunities facing the audio visual (AV) industry, which has its annual European flagship event, ISE2018, in February. This article looks at how open systems are starting to transform AV infrastructure.


The relentless march of open systems

Open systems and interoperability are vital to deliver communications, and yet proprietary ‘standards’ still linger longer in many technology sectors. Many argued that open standards would be slow, unsafe or unstable. The reality has more to do with protecting supplier revenues than with customer experiences. No sector is immune, including the AV market. It is already on a convergence path with IT and will also feel the dramatic impact of the shift to open standards.

For IT itself, it was only really in the 1990s that universally acceptable standards based around the protocols of the internet transformed computer networking to open systems. This was despite the early adopters starting an open movement a decade earlier. The telecoms industry hung onto its proprietary approaches a little longer. It eventually switched predominantly over to digital networking using IP, the internet protocol, in the 2000s.

Now every ‘thing’ is becoming a digital asset, converged onto and connected to a ubiquitous open wired and wireless network – the Internet of Things (IoT). This is absorbing proprietary sensors, monitors and supervisory control and data acquisition (SCADA) systems as it goes.

This convergence opens up huge opportunities, but also causes massive disruption in the value chains for manufacturers, channel partners and consultants that thought things would carry on as they were.

The demands on AV infrastructure

The AV sector, with its focus on customer experience excellence, high fidelity sound and high definition video distribution, has so far seen only part of what digital transformation means, but the impact on it is likely to be equally significant. Once an open standard has passed the required capability threshold, the arguments for proprietary certainties no longer hold.

Market disruption can then be sudden, despite the vested interests that once held sway. The best approach is to understand what is driving the change and embrace it. The drivers which are already enabling changes in the AV sector are gathering pace:

  • Mobility and location. Users could be anywhere, want to connect simply to anyone and expect to be able to use any device. This often means small and mobile, but also not necessarily using corporate assets or connecting from within a corporate network. AV systems are historically designed to deliver high fidelity over known networks and cables in a well-managed and controlled environment. The received wisdom was that bringing in remote video had to be highly planned in order to be well executed. Now the user expectation is an excellent ad hoc experience anywhere. Dedicated systems, specialised adaptors and wires all get in the way and slow things down.
  • Data and formats. Data once existed in proprietary formats on dedicated networks. Specialised endpoints, encoding and distribution equipment was once the norm in telephony and has been in AV. Universal and commoditised devices mean all media can be combined and are anticipated to be usable anywhere. Once, high definition AV formats, such as HD and 4K, seemed to require dedicated hardware. The expectation now is that all content is data and software driven. The only separation is between control and data planes.
  • Scale and capacity. Dedicated systems mean inflexibility, along with over-capacity where money is no object, but under capacity for most, often only discovered at the most critical moment. A hyperconnected global world means that demand is highly variable. Organisations need to be able to scale up (and down) to adapt. Crucially, they also need costs to scale as OpEx alongside the value being received and there is an expectation of service delivery, not product delivery based on upfront CapEx.

All of this can be intensely unsettling for those involved in building and delivering compelling AV experiences. Manufacturers are including many of the right core technologies, but in their rush to innovate are sometimes failing to recognise that the best experience comes from the entire picture, not just several amazing elements.

For the industry, this not only means closer co-operation, as often discovered in other sectors with competitors (co-opetition), but also with adjacent groups. In this case, it is the already heavily committed IT sector which needs to be more closely involved. Networks, security and the adjustment to an as-a-service model are already a primary focus and lessons already being learned could be shared in a couple of areas.

Opportunities for change

One aspect to address is mobile or wireless connectivity. There have been, and remain, any number of proprietary standards, but those oriented around 802.11 and Wi-Fi have gained most traction and widespread adoption. The trend for greater capacity, spread and individual performance has moved through the alphabet of ‘b’, ‘a’, ‘g’ and ‘n’ to now reach ‘ac’. This delivers sufficient performance for connecting multiple end user devices to AV systems, such as meeting rooms and video conferencing, but probably not enough for the highest performance requirements for video.

For wireless video, there has been a plethora of potential standards from both the AV and IT sectors. WiDi, WHDI and WirelessHD have all thus far mostly failed to convince vendors, integrators and end users that they provide sufficient benefit. 802.11ad, also known as WiGig, looks like a different matter. It might be the standard that finally convinces many to cut the AV cord.
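
As a rough sanity check, the sketch below compares typical top-end single-link Wi-Fi rates against video bitrates. All of the numbers are indicative ballparks chosen for illustration, not measured throughput, and the usable-capacity fraction is an assumption.

```python
# Indicative comparison of typical top-end single-link Wi-Fi rates against
# video bitrates. All figures are rough ballparks for illustration only,
# not measured throughput.
link_rate_mbps = {
    "802.11n":  300,     # common 2-stream configuration
    "802.11ac": 1300,    # wave 1, 3 spatial streams
    "802.11ad": 6700,    # WiGig, 60 GHz, short range
}
video_mbps = {
    "compressed 4K stream (HEVC)":   25,
    "uncompressed 1080p60":        3000,
    "uncompressed 4K60":          12000,
}

USABLE_FRACTION = 0.5   # assume roughly half the nominal rate is usable in practice

for std, rate in link_rate_mbps.items():
    fits = [name for name, need in video_mbps.items() if need <= rate * USABLE_FRACTION]
    print(f"{std}: could plausibly carry {fits}")
```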

Wireless connectivity simplifies matters for the users, but can make systems installation more complex. AV integration is becoming a much more sophisticated and IT-heavy process than the ‘hanging and banging’ installation challenges of old. These issues are not new in IT and most IT departments have learned to reduce their integration challenges in several ways:

Firstly, channel partners are chosen more carefully based on the value add that they can bring to the whole process. It is no longer a matter of just easing selection and procurement, but also of providing support through the entire lifecycle to deployment and usage. Professional integration services and ongoing support are vital. A good choice of channel partner will make installation less of a burden. Some in the existing AV channel will need to adapt to meet the growing integration needs of their customers.

Next, outsourcing the heavy lifting. This is where ad hoc or on demand services can be delivered from the cloud. If infrastructure flexibility can be delivered as an operational cost it can be scaled up or down to meet changing demands.

Finally, automation. From development to deployment, tasks are being automated through smart software architecture choices, application programming interfaces (APIs) and scripting languages. Automating repetitive tasks not only removes effort, but also reduces the risk of error.
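
As a small illustration of the kind of repetitive task that scripting removes, the sketch below polls the health of each networked display over a REST API and flags those needing attention. The URL scheme, device names and JSON fields are entirely hypothetical; real AV devices expose different interfaces.

```python
# Hypothetical sketch: polling networked AV endpoints (displays, codecs)
# over a REST API and flagging those that need attention. The URL scheme,
# device names and JSON fields are invented for illustration.
import requests

DEVICES = ["room-101-display", "room-102-display", "boardroom-codec"]

def check_device(host: str) -> str:
    try:
        resp = requests.get(f"http://{host}/api/status", timeout=3)
        resp.raise_for_status()
        status = resp.json()
        return "OK" if status.get("healthy") else f"ALERT: {status.get('fault', 'unknown')}"
    except requests.RequestException as exc:
        return f"UNREACHABLE ({exc})"

for device in DEVICES:
    print(f"{device:20s} {check_device(device)}")
```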

Time to build new relationships

Altogether this represents a huge disruption to the AV sector. It might seem like a simple matter of moving to open standards, but this enables more complex and sophisticated integration and convergence into many elements of IT. This will also necessitate a shift in relationships.

AV vendors and their increasingly IT integration-aware channel partners will no longer be focusing on selling AV/IP to facilities managers, but to the heart of the IT function as well as the lines of business functions. It’s not about AV switching, encoding and decoding, but delivering a compelling communications and collaboration experience. To get a first-hand view of the companies already tackling the challenge of driving the open transformation of AV infrastructure, visit ISE2018 in Amsterdam in February.

December 3, 2017  10:48 PM

VeloCloud takes aim at outcome-driven networking

Bernt Ostergaard

With corporate network evolution focused on performance, adaptability and lower cost, it raises the question: can the corporate SD-WAN network optimise on outcomes? Can corporate managers translate desired business outcomes into network performance requirements? Some solution elements are well known: abstraction and automation, combined with hybrid on-premise and cloud-based processing and storage. But is it enough to get us to outcome-driven networking?

For larger companies with more complex networks and business needs, the first generation of device-centric SD-WAN didn’t really fit. They need a complete service package that addresses multiple applications and different types of users, as well as security and compliance requirements. This has given telcos and network service providers an opportunity to launch SD-WAN-as-a-Service. All they need are new software-based SD-WAN partners.

SD-WAN catering for telcos

Technology from three young companies dominates this marketplace today:

  • The market leader in SD-WAN for telcos is VeloCloud founded in 2012 (recently bought by VMware – itself bought by Dell). It bills itself as the only SD-WAN company to support data plane services in the cloud. Telco customers include Deutsche Telekom, AT&T, TelePacific and Sprint.
  • Versa Networks (with Verizon as a major investor) has been deployed by carriers like COLT and Verizon. It provides a multi-tenant solution that can seriously scale. This allows telcos to support large customers and retail service providers on a single platform.
  • Viptela (bought by Cisco) has been deployed by major carriers including Verizon and Singtel to deliver managed SD-WAN services. The Viptela Fabric is a purpose-built solution, designed from the ground up to provide secure, scalable and resilient WAN application performance.

From optimisation to outcome

The new element that VeloCloud adds to its Outcome-Driven Networking is self-learning and adaptation. Its latest service wrapper provides customers with self-learning analytics at scale and predictive insights in relation to set business outcomes.

In the hybrid corporate network, that entails dynamic mid-flow, sub-second steering and remediation of WAN traffic to maintain outcomes as defined on a per-application basis. Companies that need to set up new endpoints can on-board a new data centre or a recent acquisition, and create routes for prioritisation, exit points and transit points – all defined in VeloCloud’s GUI. This ensures that traffic is then dynamically routed from branches to the new data centre.

Security is also improved, because users can isolate traffic by segment. Each segment has unique security, business and access policies, allowing for custom policies on a per-segment basis across the network.
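
The per-segment model is easiest to picture as a policy table. The sketch below is a generic illustration of the idea; the segment names, applications and policy fields are invented and are not VeloCloud's actual configuration schema.

```python
# Generic illustration of per-segment SD-WAN policy isolation. Segment
# names, applications and policy fields are invented for illustration;
# this is not VeloCloud's configuration model.
segments = {
    "corporate":  {"apps": ["erp", "email"], "security": "next-gen firewall", "path": "MPLS preferred"},
    "guest_wifi": {"apps": ["internet"],     "security": "cloud web gateway", "path": "broadband only"},
    "pci":        {"apps": ["payments"],     "security": "encrypted tunnel",  "path": "MPLS only"},
}

def policy_for(segment: str) -> dict:
    # Each segment carries its own security, business and access policy,
    # so traffic never needs to mix across segments.
    return segments[segment]

for name in segments:
    print(name, "->", policy_for(name))
```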

Bring on the partners

VeloCloud also brings in a range of partner capabilities as virtual network functions (VNFs). So companies like Zscaler, IBM, Infoblox, Symantec, ForcePoint and Radar can insert security services on the VeloCloud Edge to provide firewalls, VPN tunnelling and more within network function virtualisation (NFV) enabled infrastructures.

AT&T’s Indigo Project

AT&T is using VeloCloud to develop an application-aware concept called ‘Indigo’, which builds on a software-centric core, and creates a network that is not only software-centric but also data-driven. This is a service concept which blends the software defined network (SDN) with AT&T’s ECOMP orchestration platform, big data analytics, artificial intelligence (AI), machine learning, cybersecurity and 5G elements. Together, AT&T believes it will create a new level of outcome driven, data-sharing network for its largest corporate customers.


When virtualisation makes life easier for the WAN customer, it actually shifts complexity to the network and cloud providers. Many of these operators find their progress towards NFV challenged by a lack of technical maturity and the intricacy of operational changes required to virtualise networks. Management of a multi-vendor environment increases complexity dramatically. The shift to NFV requires significant operational changes involving internal processes, culture and redefining the organisational set-up.

Another sticking point with this new approach will be cost. When so many vendors are contributing to the service delivery, how is it going to be priced? Will the distributed services from partners end up as an expensive network architecture for the customers? And, with mature standards still lacking (open standards like ONAP’s Amsterdam are only just emerging), leading edge customers may have a hard time extracting themselves from such a service. Buyers should carefully map out the parts of the service that are proprietary.

November 29, 2017  2:20 PM

The impact of IT incidents on your business

Bob Tarzey

A new research report from Quocirca, Damage Control – The impact of critical IT incidents, shows the scale of the challenge faced by organisations as they struggle to address the volume of incidents that impact their IT infrastructure, especially those considered critical. The research was sponsored by Splunk.

The average organisation logs about 1,200 IT incidents per month, of which 5 will be critical. It is a challenge to wade through all the data generated by the events that lead to these incidents and prioritise dealing with them. 70% say a past critical incident has caused reputational damage to their organisation, underlining the importance of timely detection in minimising the impact.

The mean cost to IT of a critical incident is US $36,326, the mean downstream cost to business is an additional US $105,302. These two costs rise together, suggesting high cost to IT is a proxy for poor event and incident management, which has a knock-on effect for business operations.
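
Taking the report's own averages (about five critical incidents a month, a mean IT cost of $36,326 and a mean downstream cost of $105,302 per incident), a quick calculation shows the scale of the exposure. The sketch below simply applies those survey means to a hypothetical 'average' organisation.

```python
# Quick arithmetic on the survey means quoted above, applied to a
# hypothetical "average" organisation with five critical incidents a month.
critical_per_month    = 5
it_cost_per_incident  = 36_326    # mean cost to IT (US $)
biz_cost_per_incident = 105_302   # mean downstream business cost (US $)

monthly_exposure = critical_per_month * (it_cost_per_incident + biz_cost_per_incident)
annual_exposure  = 12 * monthly_exposure

print(f"Monthly critical-incident exposure: ${monthly_exposure:,}")   # $708,140
print(f"Annual critical-incident exposure:  ${annual_exposure:,}")    # $8,497,680
```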

80% say they could improve their mean time to detect (MTTD) incidents, which would lead to faster resolution times and decrease the impact on businesses. The mean time to repair (MTTR) for critical incidents is 5.81 hours; this reduces if there are fewer incidents to manage in the first place. On average, a further 7.23 hours are spent on root cause analysis, which is successful 65% of the time.

Duplicate and repeat incidents are a persistent problem. 97% say their event management process leads to duplicates, where multiple incidents are created for the same IT problem; 17.2% of all incidents are duplicates. 96% say failure to learn from previous incidents through effective root cause analysis (RCA) leads to repeat incidents; 13.3% of all incidents are repeats.

The monitoring of IT infrastructure to log events and identify incidents could be improved; 80% admit they have blind spots, leading to delayed detection and investigation of incidents. The complexity of IT systems and the tools that monitor them leaves many organisations without an adequate, holistic end-to-end view of their IT infrastructure.

Dealing with the volume of events generated by IT monitoring tools is a challenge. 52% say they just about manage, 13% struggle, and 1% are overwhelmed. Those with event management processes which enable them to easily manage the volume of events have a faster mean time to detect incidents and fewer duplicate and repeat incidents.

Quocirca will be presenting the report findings in a series of webinars, in conjunction with Splunk.

Europe 5th December

Americas 7th December

Asia and Australia 12th December

October 16, 2017  11:40 AM

Augment the business, build on the reality of ‘things’

Rob Bamforth
augmented reality, Internet of Things, Virtual Reality

The ability to mix the virtual world of digital content and information – text, sounds, images and video – with the physical world of three-dimensional space we inhabit has long appealed. Science fiction has envisaged this in two ways; the fully immersive synthetic environment of Virtual Reality (VR), or the digital overlay onto the real world of Augmented Reality (AR).

There is no doubting the excitement of the more immersive VR. Although it has been around for many years, the technology now performs sufficiently well for relatively low-cost headsets (sometimes very low cost cardboard/smartphone combos) to give a truly impressive experience. No wonder VR has been doing well in games and entertainment. It also works well in the ‘infotainment edge’ of selling. This includes pre-visualising high value goods like premium car ranges or luxury yachts, but it could equally have wider appeal.

While there are many other promising uses for VR in the business world – training, guidance, product servicing etc – users are ‘tethered’ to a physical place while immersed in their virtual world. AR is the ‘mobile-like’ experience that can be carried through the real world and which could make it much more universally applicable.

The rise of Augmented Reality

Awareness of the potential of AR has grown significantly in recent years, thanks most recently, from a public perspective, to the success of games like Pokemon Go. Despite this, some still view AR as the slightly less exciting counterpart to VR. Both are now occupying a space increasingly referred to breathlessly (at least by marketing folk) as Mixed Reality (MR).

Most AR applications do not require the user to wear a headset. Simply looking at the real world through a mobile screen – a handheld device like a smartphone or tablet, or a wearable device such as smart glasses – and seeing an overlay of digital information is sufficient. The information presented as an overlay can be sophisticated, three dimensional and animated, or simply a piece of relevant textual data. The value comes from the contextual juxtaposition of the data with the place, moment and gaze of the user. This is all about enhancing the user experience by making data relevant.

Some of the early demonstrations tended to be cute or entertaining. The use of AR by IKEA in its Place application demonstrates the potential for AR in everyday, pragmatic and useful settings. Place allows users to experience what furniture choices from an IKEA catalog of 2,000 items would look like in their own rooms, before purchase. It deals with lighting and shading and the auto scaling of objects as they are placed. The app is built with widely available innovative technology. Key to its appeal is the simple and direct user experience, coupled with the complete catalog of pre-built items. Smart AR software without the right data will not be effective.

The applications for AR in business are significant, but need to be effectively managed, particularly with respect to how data is used. Despite the occasional need for completeness in applications such as IKEA’s, ’less’ is most definitely ‘more’. Otherwise AR risks simply becoming another way to digitally overload individuals trying to make use of it.

Augmented Internet of Things

Curiously then, another technology that fits very well with AR is the growing use of Internet of Things (IoT) applications. Here again there is a risk of data overload. Some of the key skills required are akin to those of a professional librarian – curatorial and editorial. The pragmatic application of machine learning could automate much of this.

However, the combination of IoT and AR holds immediate promise, with or without further automation. With so much potential information available from sensors and devices, visualising useful insights can be difficult. How much better if, for example, looking at physical things or places causes relevant data to appear? At a glance, systems could display their wear and load characteristics. Devices in the workplace might have been running too hot or have higher than expected power consumption; these could be highlighted as facilities staff walk through and gaze at them. Smart cities or smart campuses could display Wi-Fi usage hotspots, making invisible radio network coverage and capacity visible to network engineers. In each case, the ability to tie relevant information to the place and direction it applies to makes it easier to visualise, understand and mitigate its impact.
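
A minimal sketch of the kind of lookup that sits behind such an overlay: the AR client resolves whatever the user is looking at to an asset identifier, then pulls only that asset's latest readings and flags anything out of range. All identifiers, readings and thresholds below are invented for illustration.

```python
# Minimal, hypothetical sketch of the data lookup behind an AR overlay:
# map the asset the user is looking at to its latest IoT readings and
# return only the fields worth overlaying. All values are invented.
latest_readings = {
    "pump-07":    {"temp_c": 81.5, "power_kw": 4.2, "vibration_mm_s": 7.1},
    "ups-02":     {"temp_c": 34.0, "power_kw": 1.1, "battery_pct": 96},
    "wifi-ap-3f": {"clients": 143, "channel_util_pct": 88},
}

THRESHOLDS = {"temp_c": 75.0, "channel_util_pct": 80.0}

def overlay_for(asset_id: str) -> list:
    """Return short overlay lines, flagging readings that break a threshold."""
    lines = []
    for field, value in latest_readings.get(asset_id, {}).items():
        flag = " !" if field in THRESHOLDS and value > THRESHOLDS[field] else ""
        lines.append(f"{field}: {value}{flag}")
    return lines

print(overlay_for("pump-07"))   # e.g. ['temp_c: 81.5 !', 'power_kw: 4.2', ...]
```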

The importance of AR, unlike VR, is the way it is rooted to real places and real things. While there has been a lot of hype in both areas, finding commercially justifiable business use cases has been harder. In combination, IoT and AR are worth more than the sum of their parts. One adds real-time data and connections to the real world; the other places it visually in context to deliver the anticipated benefits. Now would be a great time to explore both in combination.

