Quocirca Insights

April 20, 2017  7:27 AM

Winning the Domain Game

Bob Tarzey Profile: Bob Tarzey

Over the last quarter century the Internet has become a fundamental utility that businesses, governments and consumers rely on; being offline is less and less acceptable. And yet, a 2017 Quocirca research report, Winning the Domain Game (sponsored by Neustar), shows that 72% of UK businesses face internet downtime regularly or occasionally; 61% suffer performance problems.

The problems blamed for this range from server downtime to DDoS attacks, with around one third citing the domain name system (DNS) for at least some of their internet access woes: DNS itself suffers from downtime, attacks and other inefficiencies. DNS is the Internet’s own fundamental utility, linking users with online resources by mapping meaningful names (e.g. bbc.co.uk) to hard-to-remember internet protocol (IP) addresses.
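
To make that translation concrete, the following Python sketch asks the local resolver for the addresses behind a name, using only the standard library (the output varies by network and over time):

    import socket

    # Ask the local DNS resolver which IPv4 addresses sit behind a host name.
    for info in socket.getaddrinfo("bbc.co.uk", 80, family=socket.AF_INET, type=socket.SOCK_STREAM):
        print(info[4][0])  # one line per address returned; results vary by network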

DNS problems are probably worse than these figures suggest. By its very nature, DNS is transparent to users, so the role it plays in internet access problems may go unreported. That users do not recognise DNS issues is unsurprising, but IT managers are also likely to overlook them; 55% report poor visibility in at least one aspect of DNS management.

This is partly due to a lack of tools (the majority lack many DNS management capabilities), but also due to the complex way in which DNS services are provisioned. More than three quarters of organisations have five or more different ways of accessing DNS, ranging from in-house servers to internet registrars and service providers. Correlations within Quocirca’s research show that they tolerate this for a reason: having multiple paths to DNS improves availability – but it also degrades overall internet performance.
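
The performance side of that trade-off can be seen by timing the same lookup against each DNS path in turn. A rough sketch, assuming the third-party dnspython package is installed (the nameservers listed are public resolvers chosen purely for illustration):

    import time
    import dns.resolver  # third-party: pip install dnspython

    for nameserver in ["8.8.8.8", "1.1.1.1"]:  # illustrative public resolvers
        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = [nameserver]
        started = time.perf_counter()
        resolver.resolve("example.com", "A")
        print(f"{nameserver}: {(time.perf_counter() - started) * 1000:.1f} ms")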

Due to the varied needs of users and the profusion of online services, few organisations expect to end up with a single management point for all their DNS needs. However, those that have committed to a specialist DNS service provider reduce DNS complexity. This has a big impact, improving visibility in all areas and providing access to a wide range of value-added DNS features, from routing internet traffic to blocking unwanted content.

No organisation can manage for long without reliable internet access, so it follows that reliable DNS services are needed too. Poor management of the latter is likely to be responsible for problems with the former more often than is currently understood.

Quocirca’s report, Winning the Domain Game is free to download HERE.

April 19, 2017  3:16 PM

Open for business: Hortonworks aims for open source profitability

Bernt Ostergaard Profile: Bernt Ostergaard

It used to be the Hadoop Summit, but the strategic focus at Hortonworks, the enterprise-ready open source Apache Hadoop provider, has evolved. So, this year it was renamed the DataWorks Summit. The company now encompasses data at rest (the Hortonworks Data Platform, now in version 2.6), data in motion (Hortonworks DataFlow) and data in the cloud (Hortonworks Data Cloud). Hortonworks aims to become a multi-platform and multi-cloud company. The focus is on the data in data-driven organisations. Just a few years ago Hortonworks connected with IT architects. Today it is launching conversations with lines of business and chief marketing officers.

The company

Since its launch in 2011, backed by Yahoo, Hortonworks has grown to over a thousand employees in 15 countries, with customers in sixty countries. Its European presence is run out of the UK, with sales staff across Northern and Central Europe. It is a young organisation with many newly graduated employees, strong on technology but lacking business domain insights. Many have maintained their links with universities to address big data and IoT issues. Hortonworks is involved in several joint R&D projects, in what Hortonworks co-founder Owen O’Malley terms the ‘community over code’ approach. One such project is the Digitisation of Energy, aiming to connect 1 million electric car batteries to the grid to act as a sustainable energy reservoir.

Where’s the money coming from?

Sustained strong growth still evades Hortonworks. In response, it is shifting product focus from selling converged Hadoop systems to IT departments to selling data platforms to lines of business. Of its two main competitors, MapR remains a VC-backed private company, while Cloudera is in the IPO funnel, touting its hybrid open source software (HOSS) model, which ties open source elements to proprietary software in its enterprise-grade platform. Hortonworks may therefore be tempted to add more proprietary elements to its open source Hadoop platforms to increase profitability.

Critical to maintaining an open source focus are the fast-expanding fields of artificial intelligence and machine learning. Hortonworks is investing substantial resources in developing open source code, and sees significant revenue opportunities across all business verticals. This is exemplified by its Hadoop data lake developments, which encompass data analytics, mobility and IoT using the Hadoop Distributed File System (HDFS) and persistent memory data structures. With increasing legal requirements for data to reside in specific geo-locations, computing must come to the data. This requires data tiering across ‘hot’, ‘warm’ and ‘cold’ data storage to optimise local computing power requirements.
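
Such tiering often comes down to simple rules over access recency. A purely illustrative Python sketch (the thresholds are invented for the example, not Hortonworks’ own):

    from datetime import datetime, timedelta

    def storage_tier(last_accessed):
        """Assign a dataset to a storage tier by access recency (illustrative thresholds)."""
        age = datetime.utcnow() - last_accessed
        if age < timedelta(days=7):
            return "hot"    # fast local storage, queried constantly
        if age < timedelta(days=90):
            return "warm"   # cheaper storage, queried occasionally
        return "cold"       # archival storage, rarely touched

    print(storage_tier(datetime.utcnow() - timedelta(days=30)))  # warm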

Who’s helping?

Partners on the Hortonworks Data Platform include IBM, HPE, Dell EMC, Pivotal, Teradata and Microsoft. DataFlow partners have not been named yet, but several major carriers are Hortonworks customers and may soon become partners – especially if the Federal Communications Commission (FCC) under Trump abandons its net neutrality stance and allows carriers to offer different Internet QoS (quality of service) levels; Hortonworks would then help them develop differentiated services for their customers. Data Cloud partners are the two majors, AWS and Microsoft Azure. Hortonworks also has domain expertise alliances with Accenture, Capgemini and Deloitte to roll out industry-wide IoT and cyber security offerings.

Where’s the future for Hortonworks?

Hybrid cloud, IoT, hyper-convergence, big data and AI all point to massive data accumulation and the need for mobile and multi-tier data processing. These are all areas where Hortonworks is active. This was exemplified by an automotive case study. Mercedes, a front-runner in the automotive market, operates with five levels of development, from yesterday’s ‘assisted driving’ to today’s ‘partially automated’ and tomorrow’s ‘conditional automation’. Then follows ‘high automation’ in 2021, and finally ‘full automation’ in 2025. Today’s top-of-the-line cars generate around 500GB of data per day. In ‘full automation’ mode, data volumes will go up to 50TB a day. That requires intelligence at the edge and real-time hand-off to cloud computing processes.
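
A back-of-the-envelope calculation shows why 50TB a day forces intelligence to the edge: shipping it all to the cloud would imply a sustained uplink of several gigabits per second per car (decimal units, ignoring compression and burstiness):

    TB = 10 ** 12                  # decimal terabyte, in bytes
    seconds_per_day = 24 * 60 * 60
    bits_per_second = 50 * TB * 8 / seconds_per_day
    print(f"{bits_per_second / 1e9:.1f} Gbit/s sustained")  # ~4.6 Gbit/s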

Hortonworks wants to be on that journey, not just with the automotive industry, but across many other verticals. The company believes that only open source can evolve fast enough and create the standards needed to keep up with the data frenzy.

April 7, 2017  10:18 AM

Bad-bots and CNP fraud

Bob Tarzey Profile: Bob Tarzey

The use of bad-bots to further payment card-not-present (CNP) fraud.

According to Trustwave’s 2016 Global Security Report, 60% of cybercrime incidents target payment card data. Half involve magnetic stripe data (generally stolen via point-of-sale devices) whilst the other half involves card not present (CNP) data: data stored by organisations that transact online.

Of course, any organisation that deals with CNP data should be PCI-DSS (Payment Card Industry Data Security Standard) compliant. Followed to the letter, this should put CNP data beyond the reach of cybercriminals. However, the real-world experience of many consumers suggests that all too often CNP data is being compromised and used fraudulently.

One of the reasons for this is that thieves do not need to rely on stealing complete and up-to-date payment card records. A CNP data record should consist of just three data items: the cardholder name, the 16-digit primary account number and the expiry date (there is also a service code with magnetic stripe data). The CVV code, which is needed to complete many CNP transactions, should never be stored.

With a substantial heist, criminals can waste a lot of time trying to use card details that are no longer valid. However, they have a few tricks up their sleeve, such as using software robots (bots) to enrich their data. Three such techniques are described by OWASP (the Open Web Application Security Project) in its Automated Threat Handbook: carding, card cracking and cashing out.

Carding works through long lists of payment card data, checking each card number against a target merchant’s online payment process to find which ones are still valid. There are even specialist card-checking sites for this. Card cracking enables missing or out-of-date expiry dates and CVV codes to be added by testing the range of possible values (which is small) against target sites. Cashing out helps with the monetisation of completed payment card records, often using multiple micro-payments.
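
Quite how small that range is can be enumerated directly. Assuming a three-digit CVV and expiry dates up to five years out, the whole space is tiny by bot standards:

    # Card-cracking search space: expiry month/year plus CVV.
    expiry_candidates = 12 * 5      # any month in the next five years
    cvv_candidates = 10 ** 3        # 000-999
    print(expiry_candidates * cvv_candidates)  # 60000 -- trivial for a distributed botnet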

Any of these techniques can turn even the most PCI-DSS compliant organisation into a victim. Sites may be targeted for validation purposes, impacting performance for other users, or may be targeted for monetisation. These payment card bots are just three of a broader set of automated threats listed by OWASP that can impact online resources. Fortunately, there is a range of bad-bot mitigation techniques, described in a series of e-books written by Quocirca and sponsored by Distil Networks.
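
Many such techniques amount to velocity checks: no human shopper retries card details at bot speed. A minimal sketch (the window and threshold are illustrative only, not a recommendation):

    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 60
    MAX_ATTEMPTS = 5                  # illustrative threshold
    attempts = defaultdict(deque)     # source IP -> timestamps of recent card attempts

    def is_suspicious(source_ip):
        """Flag a source exceeding MAX_ATTEMPTS payment attempts per window."""
        now = time.monotonic()
        recent = attempts[source_ip]
        recent.append(now)
        while recent and now - recent[0] > WINDOW_SECONDS:
            recent.popleft()
        return len(recent) > MAX_ATTEMPTS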

Quocirca’s Transaction Fraud eBook can be viewed at this link:


For a full list of the Cyber-security threat Series of e-books follow this link:


April 4, 2017  2:39 PM

The focus on IoT in the arable food chain – and the worries around its use

Clive Longbottom Profile: Clive Longbottom

The arable food chain, consisting of farms, logistics/warehousing, food processing and retail, is a complex one with a major focus on food hygiene and pest management. In research carried out by Quocirca for Rentokil Initial in late 2016, those responsible for managing these areas gave their views on where the internet of things (IoT), cloud computing and big data could help them.

When first asked where their technology investment focus lay, most put end-to-end traceability of goods as a top priority, with predictive analytics of the data coming a way back (see Figure 1).

Both of these areas would seem to be ideal candidates for the use of IoT devices – the capability to add, for example, radio frequency identification (RFID) or near field communication (NFC) tags to foodstuffs as they move along the processing chain would make sense.

Likewise, the way that multiple different types of IoT devices along that chain can create data that is then aggregated and analysed via cloud platforms would make sense.
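
Each tag read along the chain then becomes a small, timestamped event for cloud-side aggregation. A sketch of what one such event might carry (the field names are hypothetical):

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class TagReadEvent:
        """One RFID/NFC read as a batch of food moves along the processing chain."""
        tag_id: str           # identifier on the foodstuff, case or pallet
        location: str         # farm, warehouse, processing line, store...
        read_at: datetime
        temperature_c: float  # ambient reading at the point of scan

    event = TagReadEvent("PALLET-0042", "chilled-warehouse-3", datetime.utcnow(), 3.5)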

Figure 1

However, the research also showed that few organisations were planning large-scale adoption of the IoT in the near – or even far – future. A degree of this was undoubtedly based on a lack of knowledge of what the IoT really is (as covered in a previous blog here). It was also apparent that respondents had worries in other areas that were keeping them away from using the IoT.

Figure 2 shows the analysis of responses where interviewees were asked to rank their top three issues when it came to implementing connected technologies. As can be seen, data privacy was the top issue for them, with a perception that the IoT would create a greater number of process vulnerabilities a close second.

Somewhat surprisingly, the costs of implementing an IoT project barely registered in respondents’ top three issues.

Figure 2

This would seem to be bad news for those technology vendors and service providers trying to push IoT systems into the market. The interviewees for the research were not technology people – and the gap between this research and other Quocirca research carried out among technology people (such as the findings for ForeScout here) could not be more stark.

How the technology community bridges this chasm and makes sure that the business value of the IoT is seen and understood by those in the business itself will be its next challenge. Technology vendors need to try to prove to those holding the purse strings that the IoT is a valid direction across a whole value chain.

Certainly, the research does show that the market is ready for the IoT – as long as it is demonstrably fit for purpose, results in the desired business outcomes, and the perceptions around its shortcomings have been dealt with.

In the specific case of the arable food chain, there is not only a business need for the IoT, but a sustainability one. Only by dealing effectively with pest and hygiene issues can a growing population’s need for foodstuffs be adequately met.

Time to stop focusing on the technology of IoT and major on the business benefits.

The full report on the findings of the research carried out for Rentokil Initial can be accessed for free here.

March 22, 2017  3:26 PM

The key to application success? Usability…

Clive Longbottom Profile: Clive Longbottom

The business has made a request to IT for something to be done. IT has done all its due diligence and has come up with a system that meets every technical requirement laid down by the business. IT acquires the software, provisions it and sits back waiting for the undoubted thanks from the business for a job well done. Instead, the business gets quite irate – what does IT think it was doing in forcing such a half-baked system on the end-users?

What tends to be the case here is that IT (and often the business as well) forgets the one really important issue – any system has to be as intuitive and transparent in use to the users as possible. Anything that is seen as getting in the way of the user will be worked around. And this working around can make the original problem worse than it was.

Amongst the more technical areas where this tends to be the case are data aggregation, analysis and reporting, along with many areas of customer relationship management (CRM) and enterprise resource planning (ERP). However, an area that should be of very high importance is one that impacts pretty much everyone involved in the business – document management or, more to the point, information management. Many enterprise document management (EDM) systems require documents to be placed in the system through either an import or a specific export mechanism. During this process, the user is expected to input a lot more information about the document, such as its classification level, tags describing what it is about, and so on.

Instead, users either take default settings or just don’t bother to put the document in the system at all. Certainly, such a lack of transparent usage means that only the ‘really important’ documents are deemed worth all that trouble.

Just what is ‘really important’ though? Sure, those documents that the organisation is mandated by law to submit to a central entity are. Anything that is to do with mergers and acquisitions probably are as well. How about that document that Joe down the corridor has been working on looking at the future pricing of raw materials used in the organisation’s products? How about the results of the web search that Mary has done looking at the performance of the organisation’s main competitors?

Further, what about all the extra people who are key contributors to the organisation’s value chain these days – suppliers, customers, contractors, consultants, etc – how can information be shared by and to them in an efficient and secure manner?

In comes enterprise information management (EIM). By managing information assets from a much earlier stage of their lifecycle, the business gets the control and management of the assets that it requires.

However, the system must not make usage harder for users: any extra input required by the user must be offset by the overall value that they perceive coming out from the system.

So – rather than a system that requires the user to make a conscious decision to put the information asset in the system, start with templates. Use metadata around these templates so that document classification is decided as soon as the user starts work on the document. As such, a document on a general subject – say, ‘Summary of discussions on usage of tea bags in the canteen’ – can be worked on by opening a ‘Public’ template. One on ‘Expected future pricing of raw materials from suppliers’ can be created from a template with a ‘Commercial in confidence’ classification. And so on.
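
The template-to-classification mapping itself can be trivially simple; the point is that the user never fills in a classification field by hand. A minimal sketch (the template and classification names are illustrative):

    TEMPLATE_CLASSIFICATION = {                 # illustrative mapping
        "general-note": "Public",
        "supplier-pricing": "Commercial in confidence",
        "board-paper": "Private",
    }

    def new_document(template):
        """Classification is set the moment work starts, not at check-in time."""
        return {"template": template,
                "classification": TEMPLATE_CLASSIFICATION[template],
                "versions": []}

    print(new_document("supplier-pricing")["classification"])  # Commercial in confidence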

As the document is worked on, versioning can be applied. Through the use of a global namespace, the documents do not need to be stored in a single, large database – they can be left where they are created with logical pointers being stored in the system to provide access to them. The documents can be indexed to provide easy search and recovery capabilities across the whole enterprise.

Those in the extended value chain can be invited to work on the documents through the provision of secure links – and their activities around the information asset logged at every stage.

At every stage, the user is helped by the system, rather than hindered. The value to the individual and the business is enhanced with very little, if any, extra work involved from the user. The business gains greater governance, risk and compliance (GRC) capabilities; the individual gains through having greater input into decisions being made.

Ease of usage in any system is key. Hiding the complexities of enterprise systems is not easy, but without it being done, even the most technically competent and elegant system is bound to fail.

Quocirca has authored a report on how an EIM system must adhere to the KISS (Keep It Simple, Stupid) principle, commissioned by M-Files. The report can be downloaded here.

March 16, 2017  8:33 AM

Bad-bots, the new charlatans of healthcare

Bob Tarzey Profile: Bob Tarzey

Healthcare providers have many challenges, but if you stick with the mainstream, you can usually still expect a reassuring bedside manner from healthcare professionals; you have to actively seek out charlatans in the 21st Century! However, healthcare professionals are busy and consultations are often hurried. Anything that can help them save time is welcome and, as in many other industries, the healthcare sector is turning to automation.

In healthcare, automation often takes the form of software robots (or bots) that can automate certain tasks. Admin bots make appointments, provide access to clinical records, answer billing queries and process payments. Chat-bots can deal with routine ailments, freeing healthcare professionals to deal with more complex ones. Artificial intelligence (AI) will see the field move forward apace with advanced symptom checkers like Babylon Health, and there are already a number of healthcare projects based around artificial intelligences such as IBM’s Watson and Google’s DeepMind.

However, there is a downside: charlatans may find their way back into mainstream healthcare in the form of automated threats, or bad-bots. These bots can be used to gain access to online healthcare systems, either via brute-force entry of personal accounts or by seeking out and exploiting software vulnerabilities. Once in, the criminals that drive the bad-bots steal valuable data (a full US Medicare record sells for around $500) or perpetrate insurance and payment card fraud.

These bad-bots may not be harming patients by dishing out poor medical advice like the charlatans of old, but their effects can be just as harmful. They impact the availability of healthcare applications, invade privacy and undermine confidence in what should be a brave new round of automation in the sector, one which frees healthcare professionals to deal with complex problems.

Fortunately, there are ways to identify, control and, when necessary, block bots, which are now estimated to be responsible for 46% of all online interactions. Quocirca has written a series of e-books on the problem in conjunction with Distil Networks, a provider of direct bot detection and mitigation technology. The latest e-book in the series, The ultimate guide to how bad-bots affect healthcare, can be viewed HERE.

March 15, 2017  3:33 PM

Information management – when metadata is king

Clive Longbottom Profile: Clive Longbottom

Consider a document.  It makes no odds whether it is a Microsoft Word document, an Adobe PDF file, an Autodesk file or whatever.  Just what can you find out about it?

Well, every file has a digital fingerprint associated with it:  an operating system can look at more than just the file extension to identify just what type of file it really is.  Within the zeroes and ones of the binary content of the file on the disk is a ‘wrapper’, a set of details that describe what the file is.

Once the wrapper is understood, the contents of the file can be indexed, so that systems can search this index as well as the actual document contents.  For example, on my Windows device, a search for the term ‘information management’ would pull this document (and many other files) up in Windows File Explorer.

However, although this has uses, there are some problems.  Much metadata is not immutable.  As an example, open a Microsoft Word document.  Click on the ‘File’ tab and then look at the right-hand pane marked ‘properties’.  You should see an author marked there.  However, if you click on the ‘properties’ marker itself, you can choose ‘advanced properties’ – here, you can change the author to anything you want.

Likewise, much of the metadata associated with the document can be changed.  Someone with very basic knowledge and the right tools can change the content of a document, along with its dates and make it look to all intents and purposes that it was the original document.  As such, should a conflict arise between the actual creator of the file and the recipient of the same file who has then changed it, it becomes a case of one person’s word against another.
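
Quite how easily such properties can be rewritten is worth seeing. With the third-party python-docx package, changing a Word document’s recorded author takes a few lines (a sketch, assuming a file called example.docx exists):

    from docx import Document  # third-party: pip install python-docx

    doc = Document("example.docx")
    print(doc.core_properties.author)             # whatever the file currently claims
    doc.core_properties.author = "Anyone At All"  # rewritten with no audit trail
    doc.save("example.docx")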

However, if immutable metadata is used, then things change.  By storing the file with extra information where all modifications are logged, such content changing is no longer possible.  By ensuring that the original author is logged and held against the document, along with all dates and times that the document has had an action taken against it (opened, edited, emailed, printed, whatever), full governance, risk and compliance (GRC) needs should be covered.

Let’s just start with document classification.  By assigning a simple set of metadata tags, such as ‘Public’, ‘Commercial’ and ‘Private’ to documents, a lot of process flows can be made more intelligent.  A Public document can be left unencrypted and moved along a process flow with very little interruption.  It can also be passed through email systems without too much scrutiny, apart from a content check to ensure that certain types of data or alphanumeric strings aren’t found within the document for data loss prevention purposes.  A Private document may need to be encrypted, and can only be made available to certain named individuals or discrete roles within the organisation.  The credentials of the sender and receiver of such a Private document should also be checked before it can be sent as an attachment to an email.
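
Expressed as code, the handling rules those tags drive are simple branching logic. A sketch (the policy details are illustrative, not prescriptive):

    def outbound_policy(classification):
        """Decide how an email attachment is handled from its classification tag."""
        if classification == "Public":
            return {"encrypt": False, "content_scan": True, "verify_recipients": False}
        if classification == "Commercial":
            return {"encrypt": True, "content_scan": True, "verify_recipients": False}
        if classification == "Private":
            return {"encrypt": True, "content_scan": True, "verify_recipients": True}
        raise ValueError(f"unknown classification: {classification}")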

Enterprise information management (EIM) systems make extensive use of metadata, as it enables so much more to be done.  It can do away with folder and file constraints, as pointers to the documents are metadata in themselves and the documents can reside anywhere.  Rather than taking an old-style enterprise content management (ECM) approach of pushing files into a relational database as binary large objects (BLObs), EIM content can stay where it is, using the EIM index and global namespace (the database of the pointers and all the metadata held on the files) to find the files themselves.

With EIM, when an individual searches for something, the system searches the metadata.  When they want to read or edit a document, the pointer shows the path to the file and enables access to it.

This provides a much more flexible information management approach, and by copying the metadata store across multiple different locations, provides a level of high availability without the need for expense on dedicated systems using synchronised content databases.

A metadata-driven EIM system also improves security.  A cyclic redundancy check (CRC) can be carried out on each file as it is embraced by the system.  This creates a unique code based on the content of the file.  Should anyone change that file outside of the system, for example by using a hex editor at the hard drive storage level, the EIM system will know that this has happened, as the CRC check will identify that something has changed.
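
The check itself is cheap to implement; Python’s standard library includes CRC-32. A minimal sketch (worth noting that a CRC catches changes, but against a deliberate attacker a cryptographic hash would be the stronger choice):

    import zlib

    def file_crc32(path):
        """CRC-32 of a file's content, computed in chunks."""
        crc = 0
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                crc = zlib.crc32(chunk, crc)
        return crc

    # Store file_crc32(path) with the metadata at ingest; recompute on access.
    # A mismatch means the file was changed outside the system.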

All told, in the new world of highly open information sharing chains, immutable metadata is a need, not a nice to have.

Quocirca has authored a report on the subject, commissioned by M-Files, which can be downloaded here.

March 14, 2017  12:17 PM

Real-world views on the role of IoT in the farm-to-fork food chain

Clive Longbottom Profile: Clive Longbottom

Late on in 2016, Quocirca carried out primary research for Rentokil Initial, looking at perceptions about the current and future impact the internet of things (IoT) will have on organisations.  The respondents were from large companies in the farm, logistics/warehousing, food processing and retail industries in Australia, China, the UK and the USA.  None of the respondents was in a technical position – they were all chosen because they had responsibility for food hygiene within their organisation.

And herein lay a wake-up call for all those technology companies that believe the IoT is fully understood within their target organisations – for example, when it comes to the likely number of devices involved.  In research carried out by Quocirca for ForeScout earlier in 2016, where the respondent profile was senior IT decision makers in German-speaking countries and the UK, the average number of IoT devices expected to be in use within an organisation within 12 months was 7,000.

Figure 1: What quantity of IoT devices do you expect to deploy in the coming 24 months? (From Rentokil Initial research)

Compare this to the Rentokil Initial research, where only 10 respondents out of the 400 expected to have more than 1,000 devices – with nearly half expecting “very few” (fewer than 10) (see Figure 1).

Why the discrepancy?  A more granular drill-down into the data hints at the reasons.  Within the farm-to-fork food chain, the logistics function is already a big user of IoT devices.  Chilled transport uses temperature detectors and cab-based GPS, generally linked to central control systems; some advanced logistics companies are using multi-function systems that not only monitor temperatures, but also things like when and where the lorry or container’s doors were opened; the G-forces on the food packed in transit; CO2, nitrogen and other gas levels, and so on.
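
A single reading from one of those multi-function units might look something like this (a sketch; the field names are hypothetical):

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class TransportReading:
        """One telemetry sample from a chilled lorry or container."""
        vehicle_id: str
        recorded_at: datetime
        latitude: float          # cab-based GPS position
        longitude: float
        temperature_c: float
        doors_open: bool
        g_force_peak: float      # shocks to the food packed in transit
        co2_ppm: float           # gas levels inside the container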

It would have been expected that, amongst the 100 logistics and warehousing companies interviewed, more would have had such capabilities – and therefore that the number of IoT devices already in use would have exceeded 1,000.

Food processing lines also tend to be full of IoT devices – for example, devices that monitor the quality of food and the temperature of blanching or cleansing water, or that look for any problems along the line.

Rentokil Initial carried out some roundtables with some of its customers to drill further into perceptions around the IoT.  One respondent stated that they had never even heard of the term IoT.  Others stated that they had specific needs – but did not see things such as the monitoring of how employees dealt with personal hygiene as an IoT issue.

It becomes apparent that whereas technical staff see all of these as areas where the IoT is of use, less technical staff see them as general tools of the trade – something that is part and parcel of what is needed, but not part of a more coherent, joined-up environment.

Figure 2: In the context of managing food safety within your environment, how important are the following pieces of information?

“Having enough data to rapidly and effectively deal with an infestation/hygiene incident” was the number one concern.  However, other parts of the research showed that respondents were not yet thinking about the more standardised IoT platform that would be needed to pull such data together.

It is apparent that there is a deep chasm between those in the technology space who are building up a knowledge of the IoT and those in the line of business who are actually trying to deal with the day-to-day problems.

Vendors with an interest in the IoT approaching IT departments may well find that they are shouting in an echo chamber – the people that they are targeting will agree with what they say, but will not be able to raise the funds necessary for meaningful IoT projects.

Instead, these vendors must construct solid business messages around why the IoT matters to the business; they must have solid use-case scenarios that use the right language to empathise with the line of business’s needs.

Otherwise, IoT projects will be carried out in silos of usage, leading to the age-old IT problems of islands of data that cannot be pulled together easily and analysed.  This minimises the value of the IoT and fails to deliver the distinct benefits the business seeks.

March 13, 2017  12:20 PM

Collaboration innovation – re-thinking the workplace

Rob Bamforth Profile: Rob Bamforth

Most organisations are looking for ways to foster collaboration and grow team productivity. How this is achieved is less obvious. For a while it has been assumed that if you throw sufficient communications media (ideally unified into a single tool) at people, then they will spontaneously collaborate. This is rarely the case. What happens is either over-communication and information overload if there is a sharing culture, or siloed, secretive business as usual if there is not.


More radical approaches employ smart use of facilities or create collaboration spaces within the working environment. These might simply be comfortable seating in a relaxed and accessible part of the workplace for a few people to ‘huddle’ (such as this novel idea from Nook), or some forced Californian cool of beanbags, table football, bright décor and a limited-edition coffee served by an on-site barista.

Walking or standing

While a comfortable working environment plays a part, there is something about the posture of participants that affects how they collaborate too. Are they walking, standing or sitting?

For those walking, the chances of meaningful collaboration are low. Already multi-tasking, their communication tends to be focused: issuing commands, some information sharing, but complex interaction between multiple participants? Unlikely. All useful, responsive and timely, but it is not collaboration – it tends to be more command and control.

Standing keeps people (literally) on their toes, and has been suggested as a way of holding shorter meetings. Attendees are less able to relax, so more likely to participate and reach decisions quickly. But does it lead to more or better collaboration?

One area where meeting-space technology has advanced and become more widely available does support the notion of collaboration while standing around. The success of tablets has led to wider availability of touch-screen displays. What started as recording and copying whiteboards has evolved into large touch-enabled interactive screens. These are often smart and connected to the network, enabling remote as well as local interaction and access.

Is this the solution to collaboration?

It depends.

While this will work well for sharing information – presentations, classrooms – it is not necessarily collaboration. One person presents or shares at a time. They might have their back to their audience while they interact with the screen. It works very well in a one to many scenario, and of course presenters can take turns. But this is not really multiple people working together and at the same time in free-flowing collaboration. Ideas may occur to individuals, but by the time they get their ‘turn’ the momentum has been lost or the discussion has moved elsewhere.

Sitting around

Most meetings involve attendees sitting. Keeping people engaged, especially when their email and favourite social media site are only a glance away, is a challenge. Sit them in remote places with only an audio connection on a conference call and the temptation to be distracted in boring moments might be too great. Being there in person or holding shorter meetings might be better, but that is not always possible. Adding video to the connection might help, but in a group setting, with everyone in the room looking at a distant screen, the situation is similar to that of a standing presenter.

Two recent product developments put their own distinct twists on how to do it differently and improve interaction.

One is Polycom’s portable video unit, the RealPresence Centro. Four screens with integral cameras and microphones make this connected Dalek the centre of attention in a meeting. Those involved sit, or stand, and talk to each other facing the device, which can be connected to remote participants on any other video device. It might seem quirky, but rapidly feels natural and engaging for everyone, who can participate locally and remotely, facing everyone else across the unit or across the network. With the concept of ‘huddle’ spaces proving popular, the RealPresence Centro might have found an interesting niche.

The other is more unusual, but familiar to anyone who has seen the film Minority Report. Oblong’s Mezzanine employs a series of large screens and a wand pointing device (not yet holograms and hand gestures, Tom Cruise fans) to share and interact. Participants are surrounded by, and therefore immersed in, information presented on the screens. These are replicated remotely for those beyond the room.

Content on screen can be interacted with, inserted, moved, parked and, crucially, visually presented using a third dimension of depth or distance away. Moving it closer makes it larger, moving in front of other content in a satisfying application of perspective. Everything is coordinated via the wand, but participants can bring and use their own devices to share and integrate into the experience – locally or remotely. Inserting new content, comments and flags is simple and seamless.

Mezzanine definitely has a different feel compared to other systems, and does need the room to be suitably equipped. The approach allows for much more free flowing interaction, avoiding stalling or interrupting thought patterns.

Getting everyone engaged

Technology companies and products have made it much easier to communicate. But this does not always make the process collaborative, engaging or ultimately effective at reaching a desired conclusion. All too often, new communications media are dominated by those who ‘shout loudest’ or are restricted to those in senior or special positions. Thinking differently about the process and environment from a human perspective might provide the impetus to make collaboration something that everyone wants to, and can, participate in, with their contributions recognised and valued.

March 8, 2017  1:50 PM

From ECM to EIM: the need for control

Clive Longbottom Profile: Clive Longbottom

Enterprise content management (ECM) has long been necessary for organisations in highly regulated industries.  From SoftSolutions through to Documentum and OpenText, companies have implemented systems that control the flow of, and access to, information within their business.

However, the problem is that these products tend to be used purely to manage a small subset of an organisation’s content.  Most organisations wait until an information asset has gone through some of the early stages of its lifecycle before it is entered into the system.  This may be for reasons of cost (per-seat licensing for ECM systems tends to be high) or process: an ECM system may have been put in place to manage a single set of processes, such as those required by the Food and Drug Administration (FDA) in the pharmaceutical industry or the Civil Aviation Authority (CAA) in aviation.

Whatever the reason, putting only a subset of information into a system is dangerous.  When an individual carries out a search across the system, they will (unsurprisingly) only get returned what is in that system. If they are then going to make a decision on what is returned, they could be missing out on pertinent information that is still outside the system: documents that are still in the early stages of their lifecycle.

These early-stage documents will be the ones that contain information that is most up-to-date, as they are the ones that are still being worked on.  These could therefore carry the information that can make or break the quality of the decision. Increasingly, such documents may not even be stored within the direct control of the organisation – they may be held in the cloud using services such as Dropbox or Box; they may be elsewhere in the chain of suppliers and customers the organisation is dealing with.  As these assets are not in the ECM system, they are less controlled – access rights are not managed; information flows are not monitored and controlled.

Rather than converging on a system that fully manages information, organisations seem to be struggling to control the divergence of information types and locations – and this can be damaging.

A rethink of ECM that moves through to an enterprise information management (EIM) system is required.  EIM is an approach where information is captured as close to the point of creation as possible and managed all the way through its complete lifecycle to secure archiving or disposal.

Based around an underpinning of metadata, large amounts of information can be controlled.  Rather than pulling all the documents themselves together into a massive binary large object (BLOb) database, the files can be left where they are and only the metadata needs to be managed.  Such a metadata system will be a fraction of the size of the overall information; plus, it can be mirrored and replicated across the overall technology platform, providing high availability for searching and retrieving single items from the information asset base.

Through these means, all information sources can be included, so enhancing an organisation’s governance, risk and compliance capabilities.  It makes decision-making more complete and accurate, enabling an organisation to be more competitive.  It also provides better capabilities for collaboration around content, as single sources of original information combined with versioning and change management can be managed through the metadata.

In the first of a series of short reports on the subject, Quocirca looks in more depth as to how an organisation needs to readdress its needs around information management in the light of increasingly diverse information assets and growing GRC constraints.  The report is available for download here.
