Quocirca Insights


June 13, 2018  10:18 AM

Fixing the future of Fax – more important than you might have thought?

Rob Bamforth

There is a term which crops up a lot in the technology sector – ‘legacy’. Occasionally, especially when talking to very long-established companies, the term ‘heritage’ occurs. It is useful therefore to consider the important difference between legacy and heritage:

  • Legacy tends to be used to refer to something previously purchased (probably by someone else) which you wish you no longer had. Or perhaps as part of a sales or marketing message trying to persuade you to think that way. “Rip out your legacy doodads and replace them with our new shiny widgets!” Once the term legacy has been deployed, the outlook turns negative.
  • Heritage is something already in place that you cherish and still use. Typically, in the technology world, this is because it still works effectively, or no one can figure out how to replace it. Nurturing your heritage can be a very pragmatic approach.

For many businesses, pragmatic heritage needs are often ignored by an industry chasing the latest and greatest currently overhyped meme. If there’s no blockchain, AI or virtual reality involved, then it’s probably not worth considering. This means that significant business needs are being overlooked because they are perceived as ‘a bit boring’. These are real opportunities, typically with very convincing business drivers. They are also often fundamental parts of an overall ‘digital transformation’ trend. So, as well as bringing in revenues, addressing them should satisfy any lingering ‘legacy’ need to appear ‘cool’.

One such area of strong heritage within business communications is fax. Some may think that sending facsimiles of pages of paper over networks died out long ago. Perhaps replaced by smartphone photos shared via cloud storage and social media? Not so. IDC’s Fax Survey in February 2017 found that overall 43% of organisations had seen growth in fax usage. Only 19% reported a decline.

Getting the message

The reasons for the traffic are pragmatic and sensible and will not change readily. The same IDC survey predicted a growth in fax traffic going forward. The drivers for using fax are related to sound business needs:

  • Reach – Fax is a globally adopted standard and pretty much every organisation has a fax number and machine somewhere. It can be simply deployed anywhere. It can support diverse communication needs without high performance Internet connectivity, with only a phone line and a piece of paper. Apply it to everything from general purpose to application specific communications, such as meal orders into takeaway kitchens from food delivery aggregators.
  • Verifiable source – the originating organisation is known and cannot be masked. This has given fax messages a legal status likely to be trusted where email will not be. Many organisations will only accept requests via fax.
  • Data security – The content of a fax is encoded and transmitted as a burble of sounds. Interception en route is of no benefit, and images transmitted (including any signatures) are not stored in the network. Physical security at the point of receipt of incoming transmissions is the only requirement.
  • Paper trail – The sender can request confirmation of receipt, which is only sent once a message gets through in its entirety. This does not require human intervention and cannot be vetoed. The claim of “I didn’t receive it” is much harder to make with fax.

Sales of dedicated fax machines are diminishing with a shift towards software, fax servers and multifunction peripheral (MFP) devices with fax. However, despite the arrival of digital networks – with digital telephony (VoIP) and Fax over IP – there remains a significant number of dedicated hardware fax machines.

Respondents to IDC’s 2017 fax survey still expected over a quarter of fax volume to be from standalone fax machines in two years’ time. If traffic from fax servers and MFPs with fax is added to that using dedicated fax machines, over two thirds of all fax traffic will still be based around hardware. This is despite the anticipated strong growth in cloud-based fax services.

Pulling the plug

The reality is that these machines will mostly connect to an analogue phone line. The issue is that analogue telephone networks are being switched off by carriers in their move to roll out fully digital/IP networks. In the UK, BT is scheduled to stop selling any new analogue connections in 2020 and to switch them off altogether in 2025. Telcos across Europe have similar plans, often more aggressive in terms of timing. Businesses have already embraced VoIP for phone calls, but the heritage of fax communication embedded deep within critical business processes now needs attention.

It may not appear immediately urgent, but the impact on business process and the change management required should not be underestimated. Fax is particularly heavily used in conservative and critical sectors such as legal, finance and especially healthcare. In many of these organisations the use of fax messaging is deeply embedded in processes, or was put in place by people who have long since left. The organisation might even be unaware that it uses fax so much; it will probably have been quietly performing so well that nobody took much notice.

With the analogue telco network switch-off looming, it would be wise to take another look at fax sooner rather than later. Identify what use cases are required. Then consider a move to virtualise fax and shift from fax hardware and phone lines to fax software, networks and cloud services.
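As an illustration of what ‘virtualised’ fax can look like in practice, the sketch below sends a document through a hypothetical cloud fax provider’s REST API. The endpoint, fields and credential are placeholders, not any specific vendor’s interface.

    # Hypothetical cloud fax call - the endpoint, fields and key are invented.
    import requests

    API_KEY = "your-api-key"                                 # placeholder credential
    ENDPOINT = "https://api.example-cloudfax.com/v1/faxes"   # placeholder URL

    def send_fax(to_number, pdf_path):
        """Submit a document for faxing and return a delivery-tracking id."""
        with open(pdf_path, "rb") as doc:
            response = requests.post(
                ENDPOINT,
                headers={"Authorization": "Bearer " + API_KEY},
                data={"to": to_number},
                files={"document": doc},
            )
        response.raise_for_status()
        # Cloud fax services typically return a job id so delivery can be
        # confirmed later, preserving fax's 'paper trail' property in software.
        return response.json()["job_id"]

    job = send_fax("+441234567890", "contract.pdf")
    print("Fax queued, confirmation id:", job)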

April 26, 2018  10:43 AM

Commercialising Innovation – scaling tech start-ups

Rob Bamforth

Innovation and innovative people are spread evenly around the globe, but the ability to bring products and services successfully to market is far less evenly distributed. Often this is due to a lack of resources or funding, and sometimes to the lack of a supportive ‘community’. Hence all the ‘Silicon Geography’ models dotted in countries around the world.

However, despite many potentially ‘disruptive’ technologies and propositions, it is often the inertia of the marketplace, aided by the dominance of major players, that holds back the commercial success of technically interesting ideas and concepts.

Perhaps there is another way to tackle this market challenge?

It is an old adage, oft-quoted at a now dispersed Californian open systems hardware hotshot (Sun Microsystems), that the key is not to be first to market, but first to volume.

This requires momentum. The technology, and crucially connectivity, available to today’s tech start-ups removes much of the friction and permits web scale growth, but they still need to focus on ensuring that commercialisation will scale as well.

DevCom 5

Perhaps the Agile and DevOps approaches to software development can offer some alternative thinking to help commercialise technology innovation and scale it more rapidly? Here are five steps to consider:

  1. Keep the value chain short. If an innovation relies on aligning the agendas of too many organisations, it is going to get bogged down. The term ‘herding cats’ is popular for a reason. Get a tight, small and supportive supply chain and over-collaborate within that team.
  2. Ensure everyone gets their cut. Companies used to talk about ‘money being left on the table’ and then try to make sure they picked it up. This is a short term win from an accounting perspective, but rapid scaling of the value chain needs fuel. Make sure that all involved are sharing the revenues fairly.
  3. Deliver real value at the end (user). How do you know? Check. Get feedback, encourage interaction between users, accept what comes back and learn and adapt. “Lessons will be learned” rarely are in large monolithic institutions, but successful rapid scaling operations learn constantly.
  4. Innovate in rapid cycles. It is a worthy goal to get it right first time, but who knows what ‘right’ really is? Customer research may help at times, but only if the context is well understood. Being able to develop, test value to customer, refine and re-release allows innovative improvement to align most closely to customer needs. Innovation for its own sake is rarely going to bring in sufficient rewards to match the effort involved. Technology companies often try to out-innovate each other, when in reality they should be trying to be ‘most relevant’ to customer requirement.
  5. Automate to scale. Once the innovation/value cycle is gathering pace, accelerate the process by optimisation. Look for the opportunities that deliver the greatest speed-up across the whole cycle, rather than within themselves. Halving a 24 hour configuration process is nowhere near as valuable as a 20% reduction in a two week deployment process (see the worked example after this list). Take a holistic and systemic approach.
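A quick check of the arithmetic in step 5 makes the point: optimise where the whole cycle moves most, not where the local percentage looks biggest.

    # Step 5's example, worked through in hours.
    config_hours = 24            # the configuration step
    deploy_hours = 14 * 24       # the two-week deployment step

    saved_by_halving_config = config_hours * 0.5     # 12 hours
    saved_by_trimming_deploy = deploy_hours * 0.2    # ~67 hours

    print("Halving configuration saves", saved_by_halving_config, "hours")
    print("20% off deployment saves", saved_by_trimming_deploy, "hours")
    # The 'smaller' 20% improvement wins by more than a factor of five,
    # because it applies to the dominant stage of the end-to-end cycle.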

This model of tightly-focussed teams innovating rapidly and delivering customer value is not just for small start-ups. The two pizza guideline employed in Amazon (any team should be small enough to be fed by two pizzas) seems to be at the heart of what has helped it to experiment, learn and scale rapidly since July 1994.

Perhaps there is something in the approach?


April 10, 2018  2:51 PM

ThreatQuotient ups the ante for dealing with security incidents

Bob Tarzey


The hardware and software that constitutes the average organisation’s IT infrastructure generates millions of events a day, which are recorded in log files. This is known as machine data. Nearly all such events are benign and of little interest to IT operators. However, some represent anomalies that may indicate problems arising. Dealing with such incidents was the subject of a 2017 Quocirca research report sponsored by Splunk – Damage Control: The Impact of Critical IT Incidents.

Recognising incidents is one thing, understanding what they mean and prioritising how they are dealt with is another. This requires enriching the machine data with information from other sources. Splunk’s operational intelligence platform does this for IT incidents in general but also specifically for security incidents, which Quocirca’s report identifies as the top concern for IT managers.

When it comes to dealing with security incidents the process is known as security information and event management (SIEM). Here Splunk has several competitors including Micro Focus’s ArcSight, LogRhythm, IBM’s QRadar and McAfee’s Enterprise Security Manager.

SIEM tools enrich machine data to provide context. However, any one tool may not provide all the insight needed to deal with and prioritise all security incidents. Some organisations use multiple operational intelligence and SIEM tools. Furthermore, the sources for enriching and guiding the process of dealing with security incidents are myriad (a sketch of the enrichment step follows the list below). These include:

  • Threat intelligence feeds that indicate what a security incident might mean – for example, is there known criminal activity leading to certain types of events? Providers of threat intelligence feeds include Digital Shadows, CrowdStrike, Recorded Future and FireEye’s iSIGHT.
  • Databases of known malware and scams such as Virus Total, Spamhaus and Malware Domain List.
  • Vulnerability management tools which know about current software bugs, the threats they represent and fixes available, such as Qualys and Tenable.
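Here is a minimal, illustrative sketch of that enrichment step: joining a raw machine-data event against a threat intelligence feed. The feed contents and event fields are invented; real feeds (STIX/TAXII, vendor APIs) are richer and delivered very differently.

    # Illustrative only: enrich a firewall log event with invented threat intel.
    threat_feed = {
        "203.0.113.42": {"actor": "known-botnet", "confidence": "high"},
        "198.51.100.7": {"actor": "scanner", "confidence": "medium"},
    }

    def enrich(event):
        """Attach threat context to an event, if any is known for its source IP."""
        intel = threat_feed.get(event.get("src_ip"))
        event["threat_intel"] = intel          # None if nothing is known
        # Prioritisation: events tied to known criminal activity jump the queue.
        if intel and intel["confidence"] == "high":
            event["priority"] = "urgent"
        else:
            event["priority"] = "routine"
        return event

    raw = {"src_ip": "203.0.113.42", "action": "denied", "port": 445}
    print(enrich(raw))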

Bringing together all the information from these sources and applying it to security incidents is a daunting task. That is the challenge ThreatQuotient has taken on with its ThreatQ platform. All the organisations listed above are among the 70 plus partners that integrate with ThreatQ.

ThreatQ was first released in 2013 and launched in Europe in 2016, where ThreatQuotient now has operations in the larger countries and a growing customer base. This week it is upping the ante with the release of a new interface called ThreatQ Investigations.

ThreatQ Investigations supplements ThreatQ’s existing tabular interface with a graphical tool that shows core incidents with links to all the sources of information that may help deal with them. With a few clicks an operator may be guided from an anomalous event on a firewall to news of a recently detected surge in activity by a criminal gang seeking to exploit a newly found software vulnerability. ThreatQ Investigations aims not just to empower individual operators but to improve collaboration across the teams that come together, often war-room style, to deal with security incidents.
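That kind of pivoting can be pictured as a graph of linked records. The snippet below is emphatically not ThreatQ’s data model or API, just a generic illustration with invented node names of how incident, indicator, intelligence and vulnerability records might be linked so that an operator (or a UI) can walk from one to the next.

    # Illustrative only: a toy evidence graph, not ThreatQ's actual model.
    from collections import defaultdict

    graph = defaultdict(list)

    def link(node_a, node_b):
        """Record a bidirectional association between two records."""
        graph[node_a].append(node_b)
        graph[node_b].append(node_a)

    link("incident:firewall-anomaly-8812", "indicator:ip-203.0.113.42")
    link("indicator:ip-203.0.113.42", "intel:gang-x-campaign")
    link("intel:gang-x-campaign", "vuln:CVE-2018-0000")   # placeholder CVE id

    # Walk outward from the core incident to its surrounding context.
    for neighbour in graph["incident:firewall-anomaly-8812"]:
        print("linked evidence:", neighbour)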

As cybercrime becomes ever more widespread and the actors involved diversify, targeted organisations must become more sophisticated and timely in their ability to detect and respond. ThreatQuotient and the tools its ThreatQ platform brings together can help achieve this.

Bob Tarzey is a freelance analyst and writer, formerly of Quocirca:

https://www.linkedin.com/in/bob-tarzey/


March 25, 2018  9:00 PM

Information Security – the next generation

Rob Bamforth

The position of Chief Information Security Officer (CISO) has become well established in recent years, but where is it heading next? For many it is often perceived as an inward directed role more accustomed to saying ‘no’ than anything else. But is this really fair and does it represent the modern CISO?

Most organisations are under intense pressure to be flexible as well as secure to protect their own assets as well as the privacy of customer data. Going forward, a more pragmatic approach has to blend the agile needs of the business with the continuing challenges of security.

All organisations like to base success on results. Part of the challenge is that when some initially look at what this means for security, it is often about preventing bad things from happening, rather than doing good things for the organisation. While this may still be true, it is not a great yardstick for encouraging best behaviours and attitudes. It runs the risk of fostering inaction and retrenchment, rather than moves in a positive direction.

The term ‘Next Gen CISO’ might not be entirely new, but it surfaced again in a recent discussion with LogMeIn CISO, Gerry Beuchelt. This revolved around the evolving relationship between business and security and how by changing behaviours CISOs can add real value to the business as well as keeping it safe.

So what are the attributes of a Next Gen CISO?

Outward facing

The first attribute that a Next Gen CISO needs is to be outward looking. It helps of course to be acutely aware of the challenges faced by other organisations and changes in the market landscape. However, the outward looking CISO needs to be much more than that. They need to be able to engage with, and understand, their organisation’s customers. This should involve working alongside the sales force and channel partners. Why? To understand and appreciate the commercial challenges of any organisational security issue it really helps to see the impact it has on customers.

Risk in context

This outward perspective assists with another attribute for the Next Gen CISO: awareness of the business reward/risk spectrum. Good CISOs will already understand the risks being faced by their organisation and be aware of their vulnerabilities. But it is rarely their responsibility to decide whether those risks are worth accepting, given the consequential impact on the business. Nor should those who do have that business responsibility take the decision without being fully aware of the facts.

The Next Gen CISO should be able to present the risks and consequences of different actions (or inaction) in the context of the consequences they will have on the business. It is no good simply presenting information about speed of patching, number of phishing attacks or level of malware exposure. These may be relevant performance indicators within the security function, but mean little in the context of the business overall. Neither should CISOs hide or diminish the risks being faced. The important thing is to make clear and transparent the impacts that different security issues will have on specific aspects of the business. This is about putting security and risk into a clear and understandable business context.
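One long-established way of doing this, offered here as an illustration rather than anything prescribed above, is the annualised loss expectancy (ALE) calculation, which converts a security metric into an expected cost per year. All figures below are invented.

    # ALE = single loss expectancy x annual rate of occurrence (invented numbers).
    def annualised_loss(single_loss_gbp, events_per_year):
        return single_loss_gbp * events_per_year

    # "We see ~6 successful phishing compromises a year at ~15,000 GBP each..."
    current_ale = annualised_loss(15_000, 6)
    # "...and targeted training should roughly halve that rate."
    projected_ale = annualised_loss(15_000, 3)

    print("Current exposure:    ", current_ale, "GBP/year")
    print("After mitigation:    ", projected_ale, "GBP/year")
    print("Value of the control:", current_ale - projected_ale, "GBP/year")

Framed this way, a patching statistic or phishing count becomes a cost the rest of the board can weigh against other business risks.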

Employee engagement

As well as reaching out to customers and fellow C-level staff, the Next Gen CISO needs to be able to engage with employees from across the organisation. Security is not a pinpoint issue that affects only certain individuals or business processes. All roles have some element of security and risk for which they have to accept some responsibility. It might have seemed fine at one time to concentrate this in the hands of one individual, but that is too onerous. The risk then is that the default reaction of that individual is to be overly defensive and too often say “no”.

The Next Gen CISO needs to be able to understand business processes and empathise with the challenges faced by those who undertake them. This helps spread involvement in, and understanding of, the importance of security and what everyone needs to do, to the widest possible audience. By reaching out and engaging with fellow employees, the Next Gen CISO is also extending their threat intelligence and impact assessment information network.

Embedding understanding

Building understanding and changing behaviour towards security across the organisation then becomes a realistic goal. But this is rarely accomplished with tick box assessments or tedious training courses. Computer based training can play a part in building awareness, but risks downplaying the importance of specific security threats. A Next Gen CISO will enthuse and engage using more pervasive training models. These will include simulation and live role play to ensure the security message hits home and remains embedded in the organisational culture.

Conclusion

The CISO role may be built around information and security. But it is delivered through a passion for protection that aligns and fits closely to the needs of the business. The Next Gen CISO needs hybrid attributes to which many management roles should aspire. That and an ability to assess the value of technical aspects, with a realisation that success will depend on human ones.

 


March 24, 2018  11:18 AM

Why print channel partners must shift gears and build IT expertise

Louella Fernandes

Quocirca’s Global Print 2025 report reveals that print manufacturers are set to lose their influence on customer relationships in favour of IT service providers that deliver print services as part of a broader offering. Businesses are increasingly looking for suppliers that can demonstrate IT expertise and be strategic partners to both IT and various lines of business (LOB). Building IT services capabilities would present print manufacturers with an opportunity in the small and mid-sized business (SMB) market, helping to offset declining legacy revenue. However, to do so manufacturers must ensure the right mix of channel and technology partnerships.

Changing channel dynamics

The changing ways SMBs wish to purchase, consume and pay for their IT is redefining the role of the channel, fundamentally changing business models and relationships. While print channel partners are gradually transitioning to a managed print services (MPS) model, extending this to other aspects of IT will be the key to sustaining growth.

While printing is not set to disappear any time soon – overall 64% of businesses expect to still rely on printing by 2025 – digitisation efforts are also accelerating, and security is a top concern. This convergence demands a new breed of supplier that can support the business transformation needs of SMBs.

In the SMB market, print vendors have an opportunity to offset diminishing revenues from traditional hardware-centric business models by advancing services portfolios. The Global Print 2025 study reveals that by 2025, 26% of SMBs expect their organisations to have the deepest relationship with IT service providers, increasing from 23% today. This is at the expense of print manufacturers, which see their influence drop from 27% today to 13% in 2025.  A further 17% of SMBs expect a stronger relationship with MPS providers in 2025, up from 14%.

The evolving technology needs of SMBs

SMBs are diverse, ranging in scale and ambition from fast-growth start-ups to stable, medium-sized businesses. SMB technology investment plans vary depending on business focus and size, but according to the Global Print 2025 report IT security and cloud top the agenda.

Just like larger companies, SMBs are interested in deploying new technology, but are constrained by budget and limited IT expertise.  This lack of expertise is good news for suppliers which understand their customers’ business and industry needs and have the technical expertise to deliver a broader array of solutions and services.   SMBs are increasingly adopting low-cost, cloud-based services and managed services to reduce operational costs, remain competitive and improve efficiency. Consequently they are placing increased demands on suppliers.

Quocirca’s Global Print 2025 research reflects these changing requirements. In organisations with 100-249 employees, 57% are looking for a provider that can be a strategic partner to both IT and LOB – this rises to 60% in organisations with 500-999 employees. Over half of SMBs expect a supplier to have strong IT security expertise, rising to 65% in SMBs with 500-999 employees. Other top requirements are industry specific expertise, business process automation capabilities and providing analytic insight.

Can the channel shift gear?

Although some print-focussed channel partners have successfully made the transition to managed print services (MPS), the majority remain focused on hardware-centric transactional sales. When it comes to IT services, traditional print partners often lack the skills, experience and capabilities to be credible providers. There may not be the incentives or knowledge in place to sell broader IT solutions, or to provide the consultative sales approach that is a core capability of many broader IT service providers. As a result, many print channel partners may view a move to IT services as high risk, requiring too much investment and time.

SMBs do not typically look to traditional print channel partners or print vendors as a source of innovative services beyond print. They are more likely to turn to existing IT service providers focused on business outcomes, rather than speeds and feeds.

So how can the print channel step up its game, and build IT services credibility and reputation? Consider the following recommendations:

  1. Change the conversation. Channel partners must change their expertise, shifting from the outdated print-centric reselling model to embrace a new role as trusted and strategic advisors to their customers. They must change the nature of the conversation they have with SMBs, engaging with the influential business decision-makers responsible for strategy. The conversation must be around how to drive efficiency and productivity – not just about technology or products. As businesses turn to them for guidance and support, channel partners will need to be able to deliver consultative services and expertise. This also means tapping into adjacencies such as digitisation and security, which are increasingly part of the broader printing proposition.
  2. Partner for IT expertise. Partnering with accredited and experienced IT service providers gives print channel partners access to a broader product portfolio and provides a direct route into the IT services market, supported by specialist technology sales and support resources. For instance, this enables partners to potentially offer print security services and solutions as part of a broader managed security service offering. For manufacturers or large channel organisations, acquiring IT providers can be an effective means of gaining specialised expertise to develop and augment IT services in-house. It is also a direct means of accessing experience in selling or supporting IT services. Some manufacturers, including Konica Minolta, Ricoh and Sharp, have already made the shift, expanding their managed IT service capabilities largely through acquisition.
  3. Become specialised. The shift to margin-rich services means developing industry specific expertise. Invest in the skills needed to deploy and connect a range of technologies – both across hardware and software – and consider developing vertical specific offerings.
  4. Focus on delivering business outcomes. As SMB purchasing decisions are increasingly influenced by non-IT decision makers, channel partners will need to expand their influence to multiple stakeholders. For larger businesses, the channel needs to focus on building skills in delivering business outcomes to LOB buyers, while retaining a strong relationship with the IT department.
  5. Monetise solutions. Channel partners that invest in software development to expand their offerings should consider monetising and building the resultant intellectual property (IP) through delivering applications. Building a portfolio of applications wrapped around the core business – for example, MPS or document workflow – should lead to new opportunities. Consider including assessment or consulting services as well as integration services.

The channel must shift gears and change its business model in order to increase engagement with SMBs. Repositioning as an IT services and solution provider may seem high risk, but by developing credible converged IT offerings, channel partners may be able to increase their relevance, create differentiation and build longer term, more profitable relationships.

Learn more about the changing SMB printing landscape at www.print2025.com.


March 18, 2018  5:39 PM

Meetings with outcomes, not just communication

Rob Bamforth
Collaboration, Meeting

Despite the opportunities for technology to be really disruptive, it is surprising how often it simply digitally replicates existing processes. There is one common business process where the results could be described as patchy – meetings.

Meetings tend not to divide opinion that much. Most would say they have too many, they go on for too long and often appear to accomplish little. Decades of training courses and humorous videos have had some impact, but clearly not enough. Surely technology should be able to make it easier for people to work and collaborate efficiently and effectively in meetings?

Tools supporting collaboration (or claiming to) often try to impose a new working agenda of their own, or YACS (yet another communications stream). This might be innovative messaging based on timelines mirroring those that many have become accustomed to in personal lives via social media. It might include more visual interaction with video and screen sharing. But in most cases the focus is more on the media and ‘unification’ rather than their use. This is more like unified plumbing than unified communications from the individual’s perspective.

Some tools might offer significant improvements, especially if all potential users can be compelled to switch over to them or be encouraged by grass roots adoption. The problem is this rarely occurs smoothly. There are often issues in the edge cases – individuals, processes and data – where the new super collaboration system doesn’t fit well. So people move back to their trusty default approaches, typically email and more meetings.

Moves to reduce email sound good in principle, but the reality often disappoints. However, since meetings probably occupy even more working hours than email, surely it would be a good idea to shift the emphasis to them?

There are many important tasks that occur during meetings: sharing information, discussion, decisions, and allocating actions. Information and communication are important, but only on the path towards beneficial outcomes, i.e. decisions and action. Otherwise the outcome is more, and more, and more meetings….

Before and After, as well as During

So where is the effort actually expended in meetings? It is useful to consider the whole process as meetings have a ‘before’ and ‘after’ as well as ‘during’. While there has been a lot of technology applied to ‘during’, not enough attention has been applied to the efficiency opportunities across the entire process.

‘Before’ has to mean more than firing off a calendar invite, obtuse conference call codes, or directions to a far flung location via a fire-and-forget email. To get the right people together at the right time, even with decent remote audio or video conferencing equipment, requires some intelligent juggling and scheduling.

This requires time and effort. But since the information about potential meeting participants is often already there, the intelligence employed could be ‘artificial’. More auto-scheduling effort to streamline and simplify arrangements would pay dividends in terms of time saved and would be appreciated.
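As a toy illustration of the point, the ‘intelligence’ need not be exotic: even a simple search over participants’ free/busy data can automate much of the juggling. The calendars below are invented, with hours simplified to integers.

    # Find the first slot of a given length that every participant can make.
    def common_free_slot(calendars, day_start=9, day_end=17, length=1):
        """calendars: one set of busy hours per participant, e.g. {9, 10, 13}."""
        for hour in range(day_start, day_end - length + 1):
            needed = set(range(hour, hour + length))
            if all(not (needed & busy) for busy in calendars):
                return hour
        return None

    team = [{9, 10, 14}, {11, 12}, {9, 13, 14}]
    slot = common_free_slot(team, length=2)
    print("First 2-hour slot everyone can make:", slot)   # -> 15 (3pm)

Real scheduling assistants layer preferences, time zones and room availability on top, but the core saving comes from removing the back-and-forth.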

Even during meetings, for many the use of technology has focused on the medium of sharing. Despite this, getting connected (the video adaptor challenge, followed by the function key shuffle) and getting remote colleagues involved (does anyone know the dial in code or where the remote is, or how to contact support?) seem to take more time than they should.

Capturing information during meetings and sharing it accurately afterwards, jogging the memories of those present and informing non-participants, would be hugely beneficial in steering towards these positive outcomes. Technology to record voice and intelligently transcribe it to text would make sharing and searching simpler, and it is readily available. The key is to seamlessly integrate this into the collaboration tools that participants are already using for their meetings.

Shifting beyond collaboration

This involves a shift in thinking from the unified communications and conferencing industry. Most have already made the jump towards a focus on collaboration. This is necessary, but not sufficient. The next step is to recognise that the long-entrenched models of how people work together will be hard to change. The whole lifecycle of meetings needs to be enhanced, and where possible, automated.

Meetings might seem tedious and wasteful, but few organisations are going to replace them entirely with virtual timelines, shared repositories or interactive online realities. There is a need to look at the elements of greatest inefficiency, apply technology to make incremental improvements, assess the results and then repeat.

This looks a lot like the agile and DevOps approaches now being used in software development. These are yielding great results in terms of both speed and quality. Isn’t that an outcome all organisations would like to see for meetings as well? Look for tool vendors that are moving beyond the audio and visual media. The ones that are extracting meaning and understanding from how people are communicating are putting real business value into collaboration.


March 6, 2018  3:54 PM

DevOps – Pragmatic production, not evangelism

Rob Bamforth
DevOps

Despite the potential of ‘digital transformation’ and IT in general, many organisations find the reality a little disappointing. Development takes longer than expected, quality is lacking and what is delivered often fails to match the business requirements. Despite the continuous, rapid innovation and evolution of technology, with faster, cheaper and more readily available hardware, the software process seems to still struggle to deliver.

There have been many attempts to improve matters from computer aided software engineering (CASE) tools and methodologies to object orientation, 4th generation languages and tools for ‘citizen development’. So far, no silver bullets in either tools, methods, or superhero developers. One problem is that the software challenge has expanded and become increasingly monolithic. Roles have become specialised and the process structured, typically around a traditional engineering model with well-defined stages. Fine in principle, but inflexible and slow to adapt.

A break with the model was inevitable and shifted (typically) towards fragmentation, blurring of teams and speed with the Agile ‘manifesto’. This favours: individuals and interactions over processes and tools; working software over comprehensive documentation; customer collaboration over contract negotiation; responding to change over following a plan.

Agile often leads in turn to more fluid and continuous development and deployment processes, generically termed DevOps. Based on how Agile and DevOps are sometimes presented with almost alternative programming lifestyle enthusiasm, it might seem easy to believe they require some sort of cult following.

Certainly, the word ‘culture’ or phrase ‘cultural shift’ occurs very frequently when talking to enthusiasts. So much so, that many organisations become nervous of the almost revolutionary zeal. They wonder how they will be able to cope with adopting the apparently wholesale change.

While there are significant differences in its approach, DevOps should not be seen as an all-or-nothing dangerously disruptive force. It requires a certain way of thinking and organising teams, but this does not have to change and challenge the entire organisation. It might, over time, have more broad-reaching impacts, but these are based on its results, not its initiation.

Adopting a DevOps strategy

The primary requirement is commitment. Management must be bought into the process, and set out the strategic vision and goals. The starting point is – what are we trying to achieve?

While there are always clear end benefits to aim for – increase revenue, reduce costs, mitigate risk – it is the intermediate drivers that shape and derive value from DevOps. Being more responsive to customers, improving software quality, delivering services that meet business needs. The focus for DevOps adoption has to be decided upfront.

Some tasks are far better suited to the potential benefits of DevOps than others. Well-established software and systems that record and manage well-defined data requirements are unlikely to benefit the most. A better place to start would be aspects of customer or external engagement where changes in market, social, political or personal context open up opportunities for innovation. The questions are, what will be required and does anybody have any clear views yet?

So, next is an acceptance that this will require some experimentation. Some may call this ‘fail fast’, but a much better way to view it is ‘learn fast’. The experimentation is being done for a purpose based on the strategy. Small steps are taken, tested, refined and retried in pursuit of the goal.

Doing this requires a tightly-coupled effort based around different skills: design, programming, quality assurance, build, deployment, customer feedback. Rather than distinct stages or disparate teams, these are provided by close collaborators aiming to simplify and streamline the process.
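Schematically, and this is only a sketch rather than a real toolchain, that collaboration can be pictured as one continuous loop in which each function stands in for real tooling (version control, automated testing, deployment pipelines, monitoring):

    # Each function is a stand-in for real tooling; the loop shape is the point.
    def develop(idea):
        return "code for " + idea

    def passes_quality_gate(code):
        return True                      # automated tests in practice

    def deploy(code):
        print("released:", code)

    def gather_feedback():
        return "users want faster search"

    def devops_cycle(idea, iterations=3):
        for _ in range(iterations):
            code = develop(idea)
            if passes_quality_gate(code):
                deploy(code)
            idea = gather_feedback()     # feedback seeds the next small step

    devops_cycle("customer portal")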

Pragmatic DevOps

The culture for accomplishing this does not need to pervade the entire organisation, nor does it have to be applied to every challenge. DevOps has to be done with serious intent and commitment – from the top as well as those directly involved – but only for one simple reason. To address a problem hotspot and get the results that the strategy was put in place to achieve.

The reality is that those who put DevOps in place for sound reasons do enjoy the benefits. The frequency of software delivery for assessment by the user is increased and ‘good enough’ initial solutions can be honed to be better. Plus, with direct feedback the whole team feels and takes on responsibility for quality.

DevOps is not a panacea or a cult requiring blind commitment, but nor is it a passing fad. Used pragmatically it moves software development away from either exacting engineering or casual coding to poised production at pace. Getting better business value from software should be what ‘digital transformation’ is all about. Find the right project, set suitable goals and give DevOps a go. Who knows, your business culture might change once it’s benefited from the results?


February 20, 2018  11:21 AM

The road to hybrid cloud

Clive Longbottom


There is little argument that cloud is having a major impact on how organisations design, provision and operate their IT platforms.  However, there still seem to be major arguments as to what an overall cloud platform actually should be.

For example, should the end result be 100% private cloud, 100% public cloud or something in between?  Where do existing systems running on physical one-server or clustered systems fit in to this model?  Should any private cloud be housed in an owned facility, in a colocation facility or operated by a third party on their equipment?  Who should be the main bet when it comes to public cloud services – and how should these services be integrated together, either with other public cloud services, or with existing functions running on owned platforms?  How about the cost models lying behind different approaches – which one will work best not only at an overall organisation level, but at a more granular, individual workload-based model?

Once a basic platform has been decided upon, then the fun starts.  How should data be secured, both from an organisation’s intellectual property point of view, and also from a legal point of view when it comes to areas such as GDPR?  How do organisations make sure that users are provided with the best means of accessing functions without having to remember a multitude of different usernames and passwords, through the use of single sign on (SSO) systems?

How can data leakage/loss prevention (DLP) and digital rights management (DRM) help to ensure that information is secure no matter where it resides – not only when the organisation has direct control of it on its own network, but also as that information passes from its control to other parts of the hybrid cloud and to partners, suppliers and customers on completely separate networks? It is also important to look at how other security approaches, such as encryption of data at rest and on the move, along with hardware, application and database security fit in with this.
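To make one of those building blocks concrete, here is a minimal sketch of encrypting data at rest using the Python ‘cryptography’ package’s Fernet recipe. Key management, arguably the hard part in a hybrid cloud, is deliberately out of scope here.

    # pip install cryptography
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # in practice, hold this in a KMS/HSM
    f = Fernet(key)

    ciphertext = f.encrypt(b"customer record: account 12345")
    print(ciphertext)                  # safe to store on any tier of the platform

    plaintext = f.decrypt(ciphertext)  # only holders of the key recover the data
    print(plaintext)

The same data can then move between private and public cloud tiers without the storage provider ever seeing the plaintext.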

Once a cloud platform is in place, it can lead to a completely different approach in how functions are created, provisioned and managed.  Many organisations have found that a hybrid cloud is an ideal platform to support and make the most of DevOps, with continuous development and continuous delivery being far easier than using cascade/waterfall project approaches against a less flexible physical or virtualised platform.

However, effective DevOps requires effective orchestration – tooling has to be carefully chosen to ensure that the right capabilities are in place to enable the right levels of intelligence in how functions are packaged, provisioned and monitored in a contextually aware manner such that the overall platform maintains the desired performance and capabilities the organisation demands.

This then brings in a need for better technical approaches in how functions are packaged:  the rise of microservices provided via containers is taking over from large, monolithic packages and virtual machines (VMs).

Systems management becomes a different beast:  not only do IT staff need to monitor and maintain functions that they own within their fully managed private cloud environment, but they also must monitor in a contextually aware manner those functions that are running in a public cloud environment.  Ensuring that root cause analysis is rapidly identified and that remediation can be carried out in real time, with workloads being replatformed and redirected as required is a key requirement of a hybrid cloud platform.

A key area that many have struggled with is that although they know at a technical level that cloud is a better way forward, they have found it difficult to sell the idea to the organisation in ways that the business can understand.  However, a simple approach known as a Total Value Proposition provides technical people with a better means of getting their point across to the business – and so acquiring the funding required to implement such a major change in technical platform.

Hybrid cloud is the future.  No matter where you are in your journey to this, there are many pitfalls to avoid, and many areas of additional possible value that are too easy to miss out on.

A new BCS book, “Evolution of Cloud Computing: How to plan for change” is available that covers all these areas – and more.


February 13, 2018  9:12 PM

Baited breadth – the rise of phishing

Rob Bamforth

Reviews of organisational security can be approached in many positive ways, but all too often they are met with trepidation or resignation. The rise of phishing, where spoof, but increasingly credible, messages try to obtain sensitive information, is a particularly troublesome challenge. It exploits one of the weakest links in digital security – people – to get them to click on something they should not.

The latest (4th) annual State of the Phish report from Wombat Security Technologies is an interesting overview of the issue. This article quotes some of its data. It is based on feedback from infosec professionals, technology end users and Wombat’s customers. The report outlines the breadth and impact of phishing and what organisations might try to do to address it.

The scale of the challenge

The scale of the issue is staggering. Around half of organisations are seeing an increase in the rate of attacks, and the approaches are diversifying. Over half are experiencing spear phishing – targeting specific individuals, roles or organisations – and 45% are experiencing phishing by phone calls or text messages (‘vishing’ and ‘smishing’).

To combat this, companies can go beyond basic training courses and awareness campaigns, and simulate actual phishing attacks to see how employees behave and how they might respond to specific phishing styles. Those in the survey typically use four different styles of email templates to assess how end users react:

  • Cloud emails, relating to accessing documents from cloud storage or using other cloud services.
  • Commercial emails such as simulated shipping confirmations or wire transfer requests.
  • Corporate emails, which look like the sort of messages normally circulated internally such as ‘memos’ from IT or HR departments.
  • Consumer emails related to social media, retail offers or bonuses and gift cards notifications.

Although individuals are starting to get wise to the issue and average click rates fell from 2016 to 2017, sophisticated attacks can be very effective. Two particular simulated corporate templates had almost 100% click rates – one pretending to include an updated building evacuation plan, the other a database password reset alert. Other high deceivers were messages about corporate email improvements or corporate voicemails from an unknown caller. The only highly rated consumer attack was about online shopping security updates.
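To show how such figures are derived, here is some illustrative arithmetic, with invented numbers rather than Wombat’s data, turning raw simulation results into the per-template click rates the report compares:

    # Invented results: template -> (emails sent, links clicked).
    results = {
        "corporate: evacuation plan update":  (200, 196),
        "corporate: password reset alert":    (200, 194),
        "consumer: shopping security update": (200, 88),
        "cloud: shared document waiting":     (200, 41),
    }

    for template, (sent, clicked) in results.items():
        rate = clicked / sent * 100
        flag = "  <-- high risk" if rate > 90 else ""
        print("%-38s %5.1f%% click rate%s" % (template, rate, flag))

Tracked over successive simulation rounds, the same calculation shows whether training is actually changing behaviour.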

The impact of phishing attacks seems to be growing, or at least is being more widely recognised, as it weighs heavily on IT professionals. Almost half of organisations noted malware infection as a consequence of phishing. Compromised accounts and loss of data were the other most significant responses. Phishing was also noted to place a greater burden on IT in terms of, for example, helpdesk calls, as well as having potentially more far-reaching business consequences in terms of cost, time and disruption.

Addressing the issue

The key to mitigating the risk is to change end user behaviour. This is much more than simply making users aware of the consequences, although many organisations do have severe punishments. ‘Repeat offenders’ will face counselling in three quarters of organisations, removal of access to systems in around a quarter and in one in ten organisations it may result in being sacked. (The research was conducted in the US, UK and Germany, and employment regulations may differ significantly.)

As phishing seems not only to be inevitable (around three quarters of organisations say they are experiencing phishing attacks), but also very damaging, the most pragmatic approach would be to tackle the issue head on. Better to avoid any of those unfortunate consequences and address the problem at source by highly targeted training. This would benefit enormously from ongoing reinforcement since both the threats and the workforces having to deal with them will change over time.

This is where training using regular simulated attacks appears to help greatly. The majority of organisations are doing this very frequently with 40% training quarterly and 35%, monthly. Why? Well, an increasing number, now over three quarters of organisations, measure their own susceptibility to phishing attacks. Over half say they have been able to quantify a reduction based on their training activities. Given those metrics and the growing risks and potentially significant business disruption and impact, who wouldn’t do more to avoid being caught in the phishing net?


January 31, 2018  6:27 PM

Immersive Reality – a business opportunity

Rob Bamforth
augmented reality, Virtual Reality

This is the third in a series of articles exploring the challenges and opportunities facing the audio visual (AV) industry, which has its annual European flagship event, ISE2018, in February. This article looks at how innovative and immersive technologies can improve the user experience.

A deeper user experience?

Technology companies used to talk about ‘intuitive’ interfaces and ease of use as if this was an end in itself. It is the overall outcome that really matters. Many still spend too much time and effort making individual elements ‘easier to use’, without improving the user experience and the effectiveness of the entire process.

The result is that individuals have dozens of separately ‘easy to use’ applications and businesses have too many disconnected and siloed repositories of data. This leads to poor use of information; data is incomplete, inaccurate or delivered late from a process perspective. From an individual point of view users feel overloaded and overburdened.

The relationship between data and those who have to use it needs to change. Users need more effective tools to deal with the mass of data they face. Recent innovations in visualisation offer some welcome opportunities.

Mixing Realities

Immersive visual technologies have always generated a lot of hype. While much of the innovation is new and disruptive, many elements have been around for some time. The difference now is that compute power, storage and network capabilities have grown to a level that delivers a high quality experience and removes disconcerting delays. There are several key immersive technologies:

  • Virtual Reality (VR) is probably the best known and longest established immersive experience. It is also perhaps the most intrusive through the use of enclosed headsets, which have only in recent years dropped in size and cost. VR replaces the real world with an entirely digital one.
  • 360 (360 degree imagery) has become possible from advances in both camera and drone technologies. The experience is immersion in either images or videos previously captured in 360 degrees and can be displayed stereoscopically if desired. Headsets are not always required as handheld mobile devices can be almost as effective.
  • Augmented Reality (AR) has moved from sci-fi film screens into people’s hands and became widely popularised by the Pokemon Go game. AR overlays digitally created content onto the real world. It can work using live camera feeds on mobile device displays, as images overlaid onto smart glasses or even as projected holograms. The flexibility and lower levels of intrusion mean AR is a great fit for many application areas.
  • Mixed Reality (MR) is often confused or conjoined with AR. The differences are slight, but important. MR offers a more seamless experience where digital content is not simply overlaid, but it can also react to, and interact with, the real world. There is more sensor and input interpretation used in order to make this happen.

 

An opportunity being partially addressed

All immersive technologies are being well marketed. Numbers of specialised immersive devices such as headsets are growing, but remain small relative to other areas of visual technology. The total number of VR headsets sold in 2017 was around 12 million units. Probably similar numbers again shipped as low-cost cardboard headsets, which still offer an immersive experience. Entertainment and gaming play a big part, and the installed base of Sony Playstation VR headsets has passed 2 million.

As an opportunity, this is not something that the AV industry will have to itself. While the early VR industry from the 1990s was very specialised and thus fraught with investment challenges, the current immersive sector is much more mainstream, with significant investments from major IT vendors.

Microsoft launched its HoloLens in 2015 as a gaming oriented consumer product. Now it is businesses that have picked up what is widely regarded as a mixed reality headset. It is even certified for use as basic protective eyewear and there is a hardhat accessory currently in production.

Google’s approach has been more oriented on VR and AR, but it has been promoting business use in an assistive way. It prefixes typical user experience scenarios with the term “Help me”. These are oriented on helping the user to learn, create, operate or sell. As an approach this is very pragmatic, focusing much more on the outcome and less on the technology. Google has also been a strong advocate of ‘immersion for all’ through the promotion of cardboard headsets.

Apple’s executives reportedly held meetings with AR device parts suppliers at CES2018 and may be planning to follow up with an update to its ARKit software later this year. If hardware, such as an AR headset, were to arrive next year that would not be a massive surprise. It would also have a big impact, but Apple tends to delay or even cancel products if it does not feel they are ready. Its focus on software and content will in any event give a boost to the AR ecosystem.

Making an immersive experience acceptable and enjoyable

The broad market drive is good, but many technologies flounder through insufficient focus on the user. This is where the experience of the AV sector could be vital. If there is lag or display inconsistencies (such as for people who wear glasses) or even a tendency for motion sickness, then most users will find this unacceptable, except for short periods of usage. Some companies, such as MagicLeap, are trying to reduce the impact: its Lightwear goggles promise a more natural and comfortable overlay of digital and physical worlds. It is not yet known how these approaches will be adopted for business use.

The delivery of visually stunning performance on large screen technology is something very familiar to the AV sector, and some have put immersive technologies around users in the physical world. Barco’s multi-sided cave display provides an immersive environment to fully surround people so they can stand inside a VR or 360 space. Realfiction has taken this open immersion a step further with its DeepFrame augmentation of reality through holographic projection. This removes the need for specialised eyewear or goggles to deliver visual information in a way that sparks viewer engagement.

Delivering on outcomes

In many cases visual appeal will not be enough. Achieving business goals from engaging or immersive access to visually stimulating information will often depend upon what can then be done with the information, and perhaps most importantly, who with? What is required is a shared immersive experience that permits not only access to information but also effective collaborative tools for interaction across a disparate group or team.

It is also vitally important for collaboration technology to enable the sharing of information across multiple devices. Teams bring laptops, tablets and smartphones to meetings to take notes and to reference work in real-time. With immersive collaboration tools, such as Oblong’s Mezzanine, teams can share data instantly for everyone to see and understand.

The most sophisticated and productive collaboration rooms provide participants with the means to share data simultaneously, utilise the most intuitive controls such as gesture and/or touch, and instantly access information to make collective decisions. When collaborators share a united workspace, regardless of their location, meetings are more engaging, enlightening and immersive.

These are the outcomes that most businesses are looking to achieve. Better use of data, more effective teams. To get more insight into how different immersive technologies are being applied to improve both user experiences and business outcomes, by a significant number of innovators across the AV sector, visit ISE2018 in Amsterdam in February.


