A recent spate of targeted denial of service attacks on organisations such as Spamhaus and Bitcoin exchanges serves as a reminder that such attacks are widely used. Denial of service remains the easiest way for those with a motive to halt or slow key internet-based services. Many IT managers probably look on, shrug their shoulders and say, “why would they target us? We are not a high profile internet service.”
Maybe so; however, recent Quocirca research has shown how reliant all organisations now are on the internet to communicate with both customers and partners (free report here: “Digital identities and the open business”). This is a double-edged sword. The internet has become key to enabling high-speed automated transactions for many businesses, but from an IT security perspective it also means that those who want to can more easily disrupt the activity of a given business for any number of reasons. This can have both tangible and intangible consequences, for example slowing or stopping business, or damaging reputation.
Denial of service is just one vector of attack. Another recent Quocirca research report shows that many European businesses have been impacted by a range of other network-related attacks. Often these are aimed not at service disruption or damaging reputation but at the theft of personal and/or financial data, in particular that relating to payment cards (see free report here: “The trouble heading for your business”).
“Low profile” businesses that do not deal much with personal data may still feel they are unlikely to be targeted. Don’t be so sure. Quocirca was talking with a small engineering firm the other day that held just such a view. Later in the conversation it mentioned it would be bidding for some work on the controversial proposed High Speed 2 (HS2) rail link. Hacktivists see small suppliers working on such projects as weak links and target them as a way of undermining the overall project. Any organisation can unexpectedly become a target.
Quocirca’s recent research shows a growing awareness of the dangers of both cybercrime and hacktivism. Organisations are starting to invest in the measures necessary to defend themselves. This includes better understanding what is happening on the networks they rely on, especially as the formal network edge has dissolved into a virtual perimeter that cannot be policed using traditional measures such as firewalls and intrusion prevention systems (IPS).
How European businesses are going about this and the degree of success they are having will be the subject of a webinar on Wednesday April 17th titled “It’s time for a new perimeter – protecting your IT infrastructure from malicious attacks”, hosted by network defence specialist Corero; for more information and to register click HERE.
Naming a company you founded after yourself can be problematic. OK, no one tries to place the blame for HP’s recent woes on Bill Hewlett or Dave Packard (and anyway, according to HP’s current management, a big turnaround in fortune is underway – http://www.hpnext.com). However, the ups and downs of Dell are still closely associated with its eponymous founder Michael Dell, especially as he bids to take the company private again, a battle The Economist believes he may lose. For McAfee, the recent antics of its founder, John McAfee, were mainly embarrassing (he went into hiding after being linked to a murder enquiry).
So, it was a brave decision back in 1997 when Eugene and Natalya Kaspersky named the anti-virus company they founded, Kaspersky Lab, after themselves. The name sounds, and is, Russian and although the company now operates as a UK legal entity, it originates from Russia and many of its functions are still based there. Russia is perceived as a hotbed of organised crime and cybercrime, so why would you trust one of its companies with your online security?
In fact, compared to the examples listed in the first paragraph, Kaspersky is not widely known outside IT security circles (except in Russia itself, where it is a well-known consumer brand). There are two reasons for this. First, although its revenues, in excess of £600M, put it in the top 10 IT security companies, only the biggest, namely Symantec and McAfee, are that well known (which is why the recent story about John McAfee was so widely covered).
Second is the way Kaspersky goes to market (outside of Russia). It has created a widespread network of OEMs (original equipment manufacturers) and ISVs (independent software vendors) that embed its anti-virus in their own products to provide that particular capability for their own offerings. OEMs and ISVs do not always reveal what is under the bonnet unless asked; however, the long list of technology partners on Kaspersky’s web site includes IBM, Alcatel-Lucent, Cisco, Juniper, Blue Coat, Check Point and D-Link.
Such prestigious partners have underlined the pedigree of Kaspersky’s anti-malware products and convinced many others to place their trust in the vendor; worldwide there are now over 400 million end-points under Kaspersky’s protection. Technology partners now account for just 20% of its business with a further 30% coming from businesses across Europe and beyond via 5,000 plus resellers. The balance comes from consumers.
If Kaspersky relied on just selling anti-malware its long term future would be in doubt. As two recent free Quocirca research reports have shown, traditional IT security is no longer good enough on its own to defend against the growing numbers of targeted attacks and other emerging threats (see these links The trouble heading for your business; Advanced cyber-security intelligence). All IT security vendors have had to adapt and Kaspersky has done so with a number of additions and modifications to its product set over the years.
Bringing it all together is the Kaspersky Security Network, a global network that gathers data from over 60 million end-points from contributing Kaspersky customers, providing rapid protection by keeping all users’ devices up to date with the latest information about malware and dangerous network links. However, such a capability is table-stakes for any IT security vendor and does not in itself defend against previously unseen (zero-day) threats.
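As a loose illustration (not Kaspersky’s actual protocol – the verdict store, function names and samples below are invented), the core of any such cloud reputation network is a shared, hash-keyed verdict lookup: a sample analysed on one participant’s machine protects every other participant.

```python
# Illustrative sketch of the cloud reputation-lookup pattern: endpoints
# contribute verdicts keyed by file hash, and other endpoints query the
# shared store before running an unknown sample. All names here are
# invented for illustration.
import hashlib

CLOUD_VERDICTS = {}   # sha256 hex digest -> "malicious" or "clean"

def report(sample, verdict):
    """An endpoint contributes a verdict for a sample it has analysed."""
    CLOUD_VERDICTS[hashlib.sha256(sample).hexdigest()] = verdict

def lookup(sample):
    """Another endpoint checks the same sample before running it."""
    return CLOUD_VERDICTS.get(hashlib.sha256(sample).hexdigest(), "unknown")

# The first victim's endpoint reports the sample...
report(b"EVIL_PAYLOAD", "malicious")
# ...and every other participant now gets a "malicious" verdict for it.
```

The key design point is that only the hash travels over the wire, which keeps lookups cheap and avoids shipping customer files to the cloud.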
So, the latest release of Kaspersky Endpoint Security for Business (KESB) includes a set of features designed to counter zero-day attacks. These include sandboxing, virtual keyboards, whitelisting, blacklisting, and behavioural and heuristic analysis. The range of end-points protected has been extended to include tablets, smartphones and virtual devices. There is also an overall device management tool to manage patching, usage policy and so on.
In addition, Kaspersky System Watcher introduces a context aware security capability by combining information from Kaspersky’s firewall, behaviour analyser and cloud-based reputation server to provide a broader overall risk assessment of suspected malware.
Kaspersky admits it is often not first to market but says this is to the long term benefit of its users as all of its technology is built in-house and therefore tightly integrated. Customers might not agree if they get caught out by some new threat whilst Kaspersky’s innovations are still in its Lab. That said, many may be unaffected if, as is often the case, Kaspersky is used alongside other security technology.
Kaspersky is an important player in the IT security industry and with its continuing innovation it seems set to remain so. It is likely protecting your organisation against various security threats somewhere, even if you do not know it. It is one of the few Russian software companies with a global footprint and has achieved a level of trust many western business would envy; a jewel indeed.
There is a principle that internet service providers (ISPs) and governments should treat all data crossing the internet equally. It should not matter what type of device is being used, who the user is, or what site or application the data is coming from or going to – net neutrality should mean no difference in charging models and no discrimination between different use cases.
The arguments go back and forth as to whether this should be enshrined in legislation as a right, or allowed to drift in a competitive open market.
Despite the arguments and the capacity of technology to advance, the laws of physics impose restrictions, and certain resources are therefore limited. This might not be too much of an issue with the massed bundles of fibre optics at the heart of fixed-line networks, but wireless networks have to balance range, capacity, power and the frequency spectrum in an increasingly ‘noisy’ environment – ideally without ‘frying’ anything en route.
While the resources are constrained, the boundless enthusiasm and appetite to access mobile data and applications is not. Nor, given the numbers of subscribers and devices, is the number of endpoints diminishing. In fact, with a re-awakened interest in machine-to-machine communications (M2M), or an ‘internet of things’, this is likely to accelerate further.
So what about unwired net neutrality?
There are already differential services that break the spirit, if not the letter, of the principle. To see how this happens, consider how the way hotels offer Wi-Fi is changing. Initially it appeared to be a new revenue stream, but then establishments realised it was costly to get right. As more venues started to offer it, the differentiation was lost, and free Wi-Fi became a ‘table-stakes’ offering once hoteliers realised that they really made their money from renting out rooms and selling food and drink.
Not all have reached this point yet, but the more progressive organisations have already gone a step further. They offer ‘basic’ Wi-Fi for free, but have a premium service that offers greater bandwidth, improved latency etc – what might be described as ‘professional’ Wi-Fi, compared to currently simple ‘hotspots’. Basic allows a bit of email and gentle browsing, but the premium service would be good enough for consumers’ IP telephony, gaming and video streaming or virtual desktops and unified communications for the enterprise user.
Then there are cellular networks. Some carriers are premium-pricing their higher speed 4G offerings compared to the tariffs on their 3G networks. Of course with differential caps on usage it also gets a little confusing as to which is the best service for an individual user. In countries where only one or a few of the mobile networks are offering 4G today, there will be rapid pricing changes as operators switch between land grab, maximising revenue and maintaining network quality modes.
Given that users have different needs – from M2M applications that might only require a few guaranteed kilobytes to video streaming gamers who need high bandwidth and low latency – there will have to be different types of services offered. Setting caps on how many minutes of communication or megabytes of capacity will be bundled and then charged for will no longer be sufficient.
Different qualities of service will need to be differentially priced. This might require application bundling, e.g. all the social media you can eat, but video is charged by the megabyte or guaranteed service levels, e.g. all gaming traffic in sub XYZ latency, but email transmitted as ‘best efforts’.
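To make the rating challenge concrete, a minimal sketch of such class-based charging might look as follows. The traffic classes, prices and charging models are purely illustrative assumptions, not any operator’s real tariff:

```python
# Hypothetical per-application-class rating rules for a mobile tariff.
# Classes, prices and models are invented for illustration only.
RATING_RULES = {
    "social": {"model": "unlimited",   "price_per_mb": 0.0},    # all you can eat
    "video":  {"model": "metered",     "price_per_mb": 0.02},   # charged per MB
    "gaming": {"model": "sla",         "price_per_mb": 0.01},   # guaranteed latency tier
    "email":  {"model": "best_effort", "price_per_mb": 0.005},  # no guarantees
}

def charge(traffic_class, megabytes):
    """Return the charge in pounds for one usage record."""
    rule = RATING_RULES.get(traffic_class, RATING_RULES["email"])
    if rule["model"] == "unlimited":
        return 0.0
    return round(megabytes * rule["price_per_mb"], 2)

# Example usage records for one billing period: (class, MB transferred)
usage = [("social", 500), ("video", 300), ("gaming", 120)]
total = sum(charge(cls, mb) for cls, mb in usage)
```

Even this toy version shows why rating gets harder: the biller must first classify traffic reliably before any such rule can be applied.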
It will be a real challenge for rating, billing and marketing, but there is no dark fibre in the sky and all the innovative use of spectrum has its eventual limit, which with ever more users and usage is close by.
The superfast mobile net is unlikely to be very neutral, but that might work out to be beneficial in the long run.
Anyone with a personal mobile phone will have seen the odd big bill, perhaps as a result of roaming or with tariffs where the bundled time, text and megabytes did not match the actual usage, or maybe just too many international calls. But most people only get stung once or twice. Once they understand the consequences of their usage, they can use less hungry data apps, perhaps get a different tariff, or switch from voice calls to text.
Or just pay, after all, tariffs are getting cheaper, bundles bigger and there’s always free Wi-Fi, right?
However, as soon as you introduce business use – whether on a work supplied device or a ‘bring your own device’ (BYOD) – the picture gets a little murky.
First, who pays, and for what? In the recent halcyon days, when all work-related mobile devices were corporate-supplied on business tariffs, most businesses would deal with the contract side, covering all costs, with some recovering the cost of personal calls from employees – if they could identify them.
Many employees would never see their individual bills, and in some organisations only the finance department would have any idea – until things got really expensive. But hey, these mobile phones boosted the productivity of traders, sales and field service people, so was it really a big deal? Not really, until many more mobile users appeared, usage patterns changed, bills went up, budgets became tighter and organisations started to think about telecoms expense management (TEM). There are also legislative and tax issues surrounding the question of who pays for personal usage.
Now with heavy data usage and employees as consumers wanting to, willing to and doing just about everything on their personally owned mobile phones and other devices, the business/personal usage line is almost impossible to draw.
These devices typically come with Wi-Fi, so that’s a free option? No, it may be free in certain quarters, but according to the latest research from enterprise mobility provider iPass, almost 60% of mobile workers have had to pay $20 or more for one-time Wi-Fi access. While some mobile and internet accounts have Wi-Fi access or minutes bundles, more often than not with a disjointed cacophony of providers, limited Wi-Fi account ‘roaming’ and quirky logins, much Wi-Fi usage outside the office is going to be paid for in an ad hoc manner, expensed and not tracked.
Does BYOD take the issue away? Not necessarily, as it depends whether there is BYOC (contract) as well, and even here the costs do not fall clearly.
Everything appears fine if the employee wants to pay for everything – business and personal use – on their own contract and tariff.
But that may not necessarily reduce costs overall. For a start, the organisation, especially if large or multi-national, would probably have a good deal on its corporate tariff that personal tariffs just cannot match, so employees are likely to be paying higher rates than when contributing to business contracts.
Business tariffs will also be with one provider and might link into the fixed phone system so that ‘internal’ or ‘on net’ calls would be free or very low cost. With employees bringing their own contracts it is likely that multiple operators would be involved and inter-employee calling made more expensive than otherwise.
Employees may also balk at paying for business use, or at having business use take them closer to their personal data usage caps – but how are they going to claim? One-off claims for Wi-Fi and the like may be easy, but this again often goes under the radar from the enterprise perspective if it only shows up on expenses rather than in a telecommunications budget, so it is not really acceptable longer term. Finally, if business use starts to dominate, then changing behaviours to limit business usage for personal cost reasons undermines the whole idea of using mobile technology to enhance productivity.
The alternative of “employee choice with BYOD, but employer picks up the tab” is also fraught with challenges, as personal usage could go completely unchecked, incurring not only a direct cost on the monthly bill, but also the indirect cost of time spent not working. This is always a risk, but if the employer is paying for everything on a personally chosen device, it could easily become a big problem.
The reality is even more complex as employees will increasingly have a clutch of devices – smartphone, tablet, laptop – each with some element of work and personal use, some of which may be corporate supplied, others not. It may not be sensible or even possible anymore for employers to lock this whole situation down, but it is necessary to understand what is going on in order to keep some control of costs.
More thoughts about mobile expense management are in this recently revised and re-published Quocirca report.
The impact of new mobile devices such as tablets and smartphones might not altogether remove the need for desktop computers, but it does open up the potential for a really radical shift in how workplaces of the future might look.
For a start, consider the subtle way that even simple mobile phones increase flexibility in the working environment, even inside its boundary – no one needs to return to their personal desk to make or receive a call. With smartphones and tablets, all forms of communication can be achieved on the move – voice, text or video – and can be ‘unified’ around a corporate platform or ‘social’ around a consumer (or perhaps enterprise) platform.
The concept of ‘in’ and ‘out’ trays therefore seems a little dated, although most would admit the paperless office is still a distant dream. So, does everyone need their own personal desk while in the building?
Since many now have working practices (and technology) that allow them to be productive outside the office environment – at home or on the move – is there a case for revisiting the concept of shared desks to cover the odd time when someone is in?
This idea of flexible working, hot-desking or ‘hoteling’ is not new, but advances in mobile technologies, the ubiquity of wireless networks and the personal appetite for working on the move, with the office seen as a place for occasional use, all give it an extra boost.
So too does the potential for cost saving.
The cost of providing a typical desk in a city like London can easily run to over £10,000 per year, and the average across the UK is almost £6,000. Providing one for every employee, whether they are going to use it all the time or not, starts to look like an unnecessary extravagance, especially if all it is doing for many working hours is acting as a support for a few personal photos, memorabilia from past training courses and a never-inspected pile of (often unnecessary) paperwork.
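The arithmetic behind the potential saving is simple. A back-of-envelope sketch, using the desk costs quoted above but with an assumed headcount and desk-sharing ratio (both illustrative, not research figures):

```python
# Back-of-envelope hot-desking saving. Desk costs are the figures quoted
# in the text; headcount and sharing ratio are illustrative assumptions.
DESK_COST_LONDON = 10_000   # GBP per desk per year (quoted London figure)
DESK_COST_UK_AVG = 6_000    # GBP per desk per year (quoted UK average)

def annual_saving(headcount, cost_per_desk, sharing_ratio):
    """Saving from provisioning sharing_ratio * headcount desks
    instead of one desk per employee."""
    desks_removed = headcount * (1 - sharing_ratio)
    return desks_removed * cost_per_desk

# 200 London-based staff at six desks per ten employees frees 80 desks,
# roughly GBP 800,000 a year:
saving = annual_saving(200, DESK_COST_LONDON, 0.6)
```

The point of the toy calculation is that even modest sharing ratios translate into six-figure annual sums for a mid-sized office.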
Despite this, many companies as well as individuals find it difficult to kick the mahogany (or aluminium and chipboard) habit. According to recent research conducted for Vodafone, just over a third of companies had not even considered flexible working to reduce costs, thought reducing desks was ‘inappropriate’ for their business or thought it would have a negative impact on teamwork.
A lot of the people-related preparatory work for switching to a flexible office can be a bit daunting and de-humanising. Terms such as ‘stacking density’ do little to boost morale, and while most organisations and individuals would like to think they measure success by results rather than time in the office, presenteeism still prevails and being seen in the office is perceived to have promotional value.
Technology can help with this, especially as so many consumers have been ‘converted’ to mobile, but it still needs careful management.
First the devices. Now that so many expect to BYOD (Bring Your Own Device) to use at work, there are more types of devices to deal with, all with different and personal applications. User expectations are high, but still the organisation needs to secure its assets, especially data. Controls, policies and procedures need to be applied and although user education has to be at the heart of it, automated management controls are vital to avoid costs spiralling, otherwise everyone might as well be given a desk.
Next come the networks. Most organisations have an infrastructure designed around people sitting in fixed and known locations, and even desk swapping raises issues – “that’s my PC!” or “why can’t this phone ring with my incoming calls?”. Wireless networks, where they are present, are often oriented around laptops. So connectivity may be available in the places where people can sit and ‘de-camp’, but there may be insufficient coverage and capacity to deal with the lower-powered radios in devices such as smartphones and tablets.
Network capacity will also need to be increased, and in a flexible, dynamic and automated way. Increased use of video and ‘chatty’, more social collaboration – good for bringing diverse and dispersed teams closer together – impacts the network, especially if users are mobile and video usage is ad hoc and unpredictable.
In a flexible office, even the traditional desktop (yes, they’re unlikely to disappear completely just yet) is affected. The network needs to be able to cope with delivering services to different users in different places at different times. User authentication and delivery of their services to the spot they’re currently occupying requires sophisticated and predictable management.
The working world may be becoming much more mobile, but in the flexible office one thing is still fixed – the need to manage everything as simply, seamlessly and automatically as possible.
For many years, technology vendors have promised companies systems that provide the “one true view” of their customers. CRM vendor PeopleSoft had the 360° View (somehow lost during the acquisition by Oracle); other CRM vendors provided insights into past customer behaviour, and analytics vendors touted clever ways of predicting future behaviours based on visualising past activities through graphical and interactive dashboards.
The main problem with such systems lies in their dependence on having enough past information to work against, and on analysing large data sets to provide the required visualisations – which can require large compute farms and data warehouses. That future predictions take time to come through can also be a problem – the aim is to capture customer activity in real time and make the most of it.
Some approaches have managed to get close to delivering real-time value by using pattern matching – if a given customer is doing this, then based on past behaviour, we should point them in this direction. This makes sense, but requires deep analytics of past data (again) and the formalisation of the rules that will need to be in place.
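That rule-based approach can be sketched in a few lines. The rules and event fields below are invented for illustration, standing in for the formalised output of the historical analysis:

```python
# Toy "next best action" engine: match the customer's current event
# against rules formalised from historical behaviour analysis.
# Rule contents and event fields are invented for illustration.
RULES = [
    # (condition on the current event, recommended action)
    (lambda e: e["action"] == "cart_abandon" and e["basket_value"] > 50,
     "offer_free_delivery"),
    (lambda e: e["action"] == "browse" and e["pages_viewed"] > 10,
     "show_personalised_picks"),
]

def next_best_action(event):
    """Return the first matching rule's action, or None if no rule fires."""
    for condition, action in RULES:
        if condition(event):
            return action
    return None

# A customer abandons a high-value basket right now:
action = next_best_action({"action": "cart_abandon", "basket_value": 80.0})
```

The weakness the text describes is visible here: every rule has to be mined and maintained by hand, which is exactly what a self-learning approach avoids.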
Quocirca recently spoke with Featurespace, a Cambridge-based company started in 2005. The company currently positions itself as a customer retention and fraud identification and management company – but there seems to be a lot more under the bonnet.
Featurespace uses real-time data streams as its main feeds. It is self-learning and can work with minimal historical data. By using advanced algorithms to analyse online (or other – see later) behaviour, fraudulent activity can be identified at a very early stage and action taken to curtail it. Yes, this has value to a business, but it will only tend to be seen as massively valuable by the Chief Risk Officer (or equivalent). Customer churn is an accepted occurrence in most markets, and as long as a company sees its churn as being no worse than the industry average, it is likely to stick with what it has.
The trick for Featurespace is to take what it has and create messages that have better value to businesses. For example, behavioural analysis not only identifies bad behaviour, but also good behaviour. In real time, customers can be encouraged in their good behaviour, spending more in the process and ensuring that shopping carts are completed and the customer-to-cash process is fully optimised.
Also, bad customers can be easily identified – the bane of markets such as telecoms, where the top 20% of customers make 80% of the profits and the bottom 20% make 80% of the losses. Behavioural analysis can identify whether there is any hope of turning a customer around to profitability – if not, then bidding them a fond “farewell” (maybe even offering them a £5 voucher to go to the competition) can improve profitability and lower churn, as many of these bottom 20% are the ones that hop from deal to deal.
Such cluster analysis can identify interesting opportunities that many analytic approaches miss – and if supplemented with other data, such as the (somewhat outdated, but still widely used) ACORN scoring, it can further be used to optimise offers both at an ad hoc, immediate level and at a strategic, future product or services level.
Featurespace can help in the online retail space in optimising customer behaviour, but it is also showing how it can operate outside of the “standard” markets. For example, it can analyse video streams. Imagine an airport: the average traveller is doing all the “normal” things – gawping at shops as if they have never seen them before; coming to a halt at the bottom of escalators and causing others to fall over behind them.
Now consider someone who is not a normal traveller – a terrorist, say. No matter what they do, their state of mind will not make it possible for them to look as relaxed or normal as the average passenger. Tracking all behaviours enables differences to be picked up very rapidly – and the system doesn’t have to be hidden in how it is used. No matter how aware the person is of the system, they cannot work around it: their behaviour patterns will just look more false the more they try to appear normal.
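The underlying idea can be illustrated with the simplest possible deviation-from-baseline measure. Featurespace’s actual models are proprietary and far more sophisticated, so the z-score below is only a toy stand-in, with invented data:

```python
# Toy behavioural anomaly score: how many standard deviations an observed
# behaviour sits from the learned "normal" baseline. Real systems model
# many features jointly; this single-feature z-score is illustrative only.
import statistics

def anomaly_score(observed, baseline):
    """Standard deviations between an observation and the baseline mean."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return abs(observed - mean) / stdev

# e.g. dwell time (seconds) in one camera zone for typical passengers:
typical_dwell = [30, 42, 35, 38, 33, 41, 36, 39]

# Someone loitering for over five minutes scores far above the baseline:
score = anomaly_score(310, typical_dwell)
```

The point the text makes falls out naturally: trying to “act normal” just shifts more features away from the baseline, raising the score rather than lowering it.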
Featurespace has to change its messaging, and the new(ish) CEO, Martina King, knows this and is going to be making a big push positioning Featurespace around behavioural analytics.
There are competitors out there – the big one that springs to mind is IBM, with the work that Jeff Jonas has been doing for some years. However, there is more than enough room for other players, and Featurespace looks like it could well be one to watch.
Dropbox has been a pretty good success, and it is difficult to do it down as an easy way for an individual to put information in one place for their own use across multiple devices. Dropbox sparked off a raft of “me-toos” trying to do things just differently enough to create a market for themselves – companies such as SugarSync or Ubuntu’s One – as well as bigger players trying to retain control of their customers, such as Apple with iCloud and Microsoft with SkyDrive.
Consumer services are one thing, but there are problems when it comes to the business use of such services; the individual cannot be king here. To the organisation, information is the basis of its intellectual property, and if that information is spread around the cloud, this can be a major issue.
Dropbox was originally aimed purely at individuals, and as they started to use it for work-related documents, enterprises had a couple of major worries. Firstly, they had no visibility of what information was being stored in Dropbox (or any other cloud-based consumer service) and secondly, it was not being shared across a team in an effective manner.
Dropbox is addressing this through its “business” plans and Microsoft is working through its plans for SkyDrive Pro – but are they doing enough? A look at what other providers such as Box are beginning to put in place, including additional team and organisational functionality, points towards the availability of a well-rounded business information-sharing system.
One interesting company taking things to the next level is Perforce Software. Perforce is best known for its on-premise software configuration management (SCM) tools. These provide the levels of control and ownership that many organisations are looking for, and that cloud-based systems may be perceived to lack.
Within SCM, teams work together, creating and working on digital assets that need to be managed and controlled at a granular level with high levels of security.
Hang on – isn’t this what’s needed for team working on business information as well?
This is exactly what Perforce thought. However, the existing Perforce SCM system was not something that could just be re-badged and thrown over the wall in the hope that users would flock to it and change the world. Perforce is a tool aimed at technical developers and its front end would appear very complex to business users. Even so, Perforce has seen it being used by non-technical users to manage other digital assets.
Perforce could have taken the approach of cutting out all the functionality it didn’t need from what it had. This may well have worked, but would have left it with two sets of underlying code to manage, two products to support, and so on.
What Perforce decided to do was take the existing Perforce SCM system, keep the engine as it is, and create a new skin over the top: Perforce Commons. Starting from the “keep it simple, stupid” school of thought, it began with the very basics – what would users want to do? Well, dragging documents from their device into the system seemed like a good place to start. Once the documents are in, what next? Previewing them would seem like a good idea. Putting them in folders would keep things clean. Sharing them between people inside and outside the organisation. Commenting on them to create a stream of activity – you get the picture. Start simply and let the interface make all this happen in the simplest way possible.
However, Commons also allows some advanced features – for example, individuals can work on documents at the same time and three-way comparisons can be carried out to aggregate and resolve comments and changes in an easy manner through an intelligent merge. Ideal when working as a team against the same information assets – parallel work can be carried out, helping to compress timescales.
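The three-way comparison at the heart of such a merge can be sketched in grossly simplified form: keep whichever side changed each part relative to the common ancestor, and flag a conflict when both sides changed the same part differently. Real merge engines diff at a much finer grain; this toy version simply keys on paragraph position:

```python
# Grossly simplified three-way merge over paragraphs. Real merge engines
# align regions with a diff algorithm rather than assuming positional
# correspondence; this is an illustration of the decision rule only.
def three_way_merge(base, ours, theirs):
    """Merge two edited lists of paragraphs against their common ancestor."""
    merged = []
    for b, o, t in zip(base, ours, theirs):
        if o == t:            # both sides agree (or neither changed it)
            merged.append(o)
        elif o == b:          # only 'theirs' changed this paragraph
            merged.append(t)
        elif t == b:          # only 'ours' changed this paragraph
            merged.append(o)
        else:                 # both changed it differently: conflict
            merged.append("<<CONFLICT: %r vs %r>>" % (o, t))
    return merged

base   = ["intro", "body",    "summary"]
ours   = ["intro", "body v2", "summary"]
theirs = ["intro", "body",    "summary v2"]
result = three_way_merge(base, ours, theirs)
# → ["intro", "body v2", "summary v2"]
```

Because each side’s non-overlapping edits survive automatically, parallel work on the same document only needs human attention where the conflict marker appears.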
What Perforce is ending up with is the proven strengths of its SCM product, completely re-skinned so that a business person can use it in a business environment to put documents in a controlled environment so that they can access them from any device wherever they are, share them within their teams and with those outside their teams and enable social collaboration via comments and tagging. Full versioning is there too – and users can send links to people that will always link to the latest version – or to a specific version if the user wants.
This approach takes things beyond where some of the other shared file providers are looking. And for Perforce, it has the luxury of being able to rapidly introduce new capabilities through just surfacing the underlying functionality of the Perforce SCM engine.
There are problems for Perforce, though. Where it is known, it is for SCM – and trying to persuade its SCM users to allow Commons to be used across an organisation may not be easy, although Perforce itself says that its customers are quite open to the proposition. Where it is not known, it has the problem of messaging – does it want to sell SCM or Commons – or both? Each needs different messaging to different groups – but any one sale could cloud the sale of the other. Perforce also has to decide how it works with its channel – the SCM channel will not be well positioned to sell Commons.
It also has to decide what it really is – is it a Dropbox for the enterprise? Is it an evolution of where others such as Box are going? Is it an alternative to SkyDrive Pro? There will be those who want to stay with an on-premise deployment, and Perforce fits the bill well against all these cloud-based services. Indeed, it would be relatively easy for Perforce to create a cloud-based offering and take on these other vendors head-to-head.
However, to start with, it will be on-premise only. But there are other on-premise products available – should Perforce be aiming to be SharePoint with bells on, or maybe even Documentum for the masses?
Its future probably lies somewhere towards SharePoint with bells on – and it has an interesting business model where small groups can use it indefinitely, with no constraints, for free: an interesting offer to the SMB market, but one which, if it becomes Perforce’s main market, will produce little in the way of revenues while carrying considerable cost overheads.
Overall, Commons looks promising. Quocirca expects Perforce to struggle to start with, but it has the capability to react rapidly to users’ wishes and wants and, as long as it sorts out the channel and creates a sustainable business model, Commons could well be a success.
Things change, but recent advances in technology coupled with social changes are altering the work/life balance, and not in the way that was once expected. Shorter days and more leisure time were a twentieth century dream for the twenty-first century world of work, but the reality is somewhat different.
At one time, information and communications technology (ICT) for the working environment was only made accessible to a select few, controlled by central diktat and superior to anything you were likely to see at home. Now the complete opposite is true and consumerised IT not only extends the working day into individuals’ personal lives, but also allows them choices and to bring their personal devices (BYOD) and activities – especially social communications – into the main hours of the working day.
While this blurring may not be an issue, provided employees do not let personal activity become a detriment to their work, it does create other challenges.
One in particular is related to another change, this time instigated by the organisation: the increasing need to open up business applications to communicate and share information with users outside the organisation. This means sharing beyond physical boundaries with employees on the move or working from home, but also beyond corporate boundaries with contractors, third-party suppliers, business customers and even consumers. The reasons are to improve relationships with customers, transact directly with them and integrate the supply chain more tightly.
Organisations are themselves also increasingly using social media to do this as they feel that it will make it easier to identify, communicate with and retain customers.
The problem then is how and what to share, and will it be safe?
Up until recently the main method of sharing information remotely with anyone external would either be physical media – CD, memory stick, etc – especially for large volumes of data; or, more often for smaller volumes, email. Most organisations are relatively confident they can secure email sharing, and there are certainly many tools to support this and minimise data leakage.
Physical media is trickier, and the growing prevalence of mobile devices increases the physical device risk further. Data may leave via a direct USB connection – memory sticks, or the connected iPods that gave rise to the term ‘podslurping’ for downloading gigabytes of data – or over the air through a cellular or Wi-Fi connection.
The risks this brings through the potential loss or theft of a device are well known and understood, with mobile device management (MDM) protections often put in place to lock or wipe and, sometimes – though not frequently enough – with on-device encryption. There are also those who avoid data residing on the device at all through virtual connections that leave no permanent data footprint.
However, a greater risk comes from user behaviours related to the increasing use of social media – posting or sharing something ‘out there’ on the internet. This might be as an update to ‘friends’ via a social media site or a dedicated cloud storage provider.
Either way it is potentially out of sight from an enterprise perspective, as employees will be using their own preferred tools to create a Bring Your Own Cloud or Collaboration (BYOC) experience. If this casual and informal usage translates into how official or formal information is shared with third party businesses and consumers, the organisation is not in control, making the demonstration of compliance virtually impossible and increasing security risks.
It might be that enterprise IT has its own set of endorsed tools for information sharing via cloud based services, but the blurring of boundaries in employee behaviour may make the use of these difficult to enforce, especially if employees have been allowed or even encouraged to BYOD in an uncontrolled manner. One way or another, lax behaviour may need to be reined in, monitored or checked.
Technology vendors and industry pundits take great delight in announcing that “this time it’s different!”. There are paradigm shifts, unstoppable trends, ground-breaking changes and disruptive innovations.
Mobile technologies are no exception, yet a short look back in time tells us that things are not always as revolutionary as first perceived. For a while, mobile email was something special. There were dozens of software vendors, although not typically the major email players, offering email on the move. Then there was the BlackBerry – the must-have email gadget for former-Yuppy executives looking to replace their Filofaxes. In fact, mobile email itself was so special that senior folk demanded special exceptions to security policies – but only for themselves.
Now the edge has worn off, it turns out that email is just email, but you can also access it on the move, i.e. while mobile. BlackBerry has lost some of its shine and the need for dedicated mobile email software vendors has evaporated. There are certain things that make mobile email more complicated – such as being careful how much is downloaded to keep data costs down, and watching out for the risk of loss or theft if private attachments are on the mobile device – but these are management challenges, not reasons to say that mobile email is radically different.
The broader needs of complete mobile working also seem to be following similar lines.
What started out as a special tool for certain roles and only with certain devices has exploded into a consumer-led boom of a huge diversity of smartphones and tablets. These devices might be operated differently with touchscreens instead of keyboards and connect over public wireless rather than private fixed networks, but they are essentially doing the same job – allowing their users to communicate and interact with data.
Extra risks occur because of the use of open and public networks, a greater variety of devices and increasingly that employees want to be told ‘you can bring your own devices’ (BYOD) and use them for work. These things are not necessarily unique to mobile devices and some businesses will have had employees connecting in from domestic desktop computers over the last couple of decades, but the consumer mind-set towards IT has really gathered most of its momentum from mobile devices.
The risks this varied mobile usage brings do need managing, but it is not enough to think it is simply about mobile device management (MDM), because actually the things that need protecting are sensitive assets that belong to the employer and the employees’ ability to get their work done efficiently without incurring considerable extra costs.
There are several areas beyond the devices themselves that could do with further attention.
First to consider is applications. How will these be deployed, installed and correctly configured now that the concept of a standard corporate build on a standard corporate device is out of the window? It needs to be done in a simple, flexible, self-service manner, delivered over the air with enforcement to ensure critical apps are installed, and unapproved ones are not, or are at least contained. Application versions and configurations need to be managed over the complete usage lifecycle and secured for access control and data leakage prevention. The whole thing needs wrapping with tracking and monitoring of performance, usage and compliance.
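The enforcement loop described above – push missing or outdated critical apps, remove or contain unapproved ones – can be sketched in a few lines. This is a purely illustrative example; the app names, version lists and action labels are invented and do not come from any particular MDM/MAM product.

```python
# Hypothetical app-policy check for one device. All names and lists
# here are illustrative assumptions, not a real product's API.

APPROVED = {"mail": "2.1", "crm": "4.0"}   # critical app -> minimum version
BLOCKED = {"filesharer"}                   # apps that must not be present

def check_device(installed: dict) -> list:
    """Return (app, action) enforcement decisions for one device."""
    actions = []
    for app, min_version in APPROVED.items():
        if app not in installed:
            actions.append((app, "push-install"))
        elif installed[app] < min_version:  # naive string compare, fine for the sketch
            actions.append((app, "push-update"))
    for app in installed:
        if app in BLOCKED:
            actions.append((app, "remove-or-contain"))
    return actions

print(check_device({"mail": "2.0", "filesharer": "1.0"}))
# → [('mail', 'push-update'), ('crm', 'push-install'), ('filesharer', 'remove-or-contain')]
```

In practice such decisions would be driven over the air by the management platform, with the results feeding the tracking and monitoring layer mentioned above.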
The next area that most companies consider is data. The knee-jerk reaction of the most paranoid security manager will be to lock everything down and encrypt everything. Most users will rebel against this at some level if it makes work too complex or difficult, especially if the data sits on their own BYOD phone or tablet. The organisation – and this is the line of business’s responsibility, not IT’s – has to determine the value and risk of data in order to decide how much security to apply. Access controls can be based on users, roles and the capabilities or risks of classes of device; some data may be ‘geo-fenced’ so that it can only be accessed in certain locations; other data may be accessible only from a cloud service, never residing on the device. The important thing is to ensure that the right controls can be exerted on data of known value or risk, without removing the flexibility that mobile brings – otherwise employees will work around the issue, potentially introducing greater risks.
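A risk-based access decision of the kind described – combining data classification, device class and geo-fencing – reduces to a simple rule check. The classifications, device attributes and locations below are invented for illustration; a real deployment would source them from its own classification scheme and device inventory.

```python
# Illustrative risk-based access rules. Classifications, device flags
# and location names are assumptions made for this sketch.

RULES = {
    # data classification -> constraints that must all hold
    "public":       {},
    "internal":     {"managed_device": True},
    "confidential": {"managed_device": True, "locations": {"HQ", "branch"}},
}

def can_access(classification, managed_device, location):
    rule = RULES[classification]
    if rule.get("managed_device") and not managed_device:
        return False
    allowed_locations = rule.get("locations")
    if allowed_locations and location not in allowed_locations:
        return False  # geo-fenced: only accessible in approved locations
    return True

print(can_access("confidential", managed_device=True, location="cafe"))   # → False
print(can_access("confidential", managed_device=True, location="HQ"))     # → True
```

The point of structuring it this way is that the business sets the rules (the `RULES` table), while enforcement is mechanical – matching the earlier observation that valuing data is a line-of-business decision, not IT’s.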
Beyond protecting those tangible digital assets, the next question is what are employees doing? For managing the mobile enterprise, this breaks into two areas of interest – behaviour and expenses. These areas might often be related, and both are greatly challenged by the move to BYOD. However, the relationship between employers and employees with communications technologies – desk phones, internet access etc – has always been one of trust and consequences. If trust seems to be failing, monitor what employees are doing and block what is not allowed. Little changes.
All together, effective IT management requires an enterprise to consider all aspects – devices, applications, data and users – and apply suitable controls based on the risks. These might be elevated by mobile, but should be assessed based on value and risk to the business.
While all sorts of powerful tools can be readily deployed, it should always be remembered that their goal is to automate the hopefully sensible procedures and policies that an organisation has put in place to support its strategy. This is still true of mobile, just as it is with other technologies. Disruptive? Yes, but ultimately not that different to other innovations in that its implementation needs to fit with the business.
Sellers of computer security products and services sometimes fret that their messaging is too scary as they go on about risk, data loss and regulatory fines. To get around this, every so often they like to remind potential buyers that their wares are also business enablers. The case is easier to make in some areas than others; one such area is identity and access management (IAM).
In the old days (pre-business use of the internet) IAM was mainly about providing identities to employees (and the odd contractor) to give them access to various in-house applications. This was generally from PCs and dumb terminals situated on premise and owned by the business; all was restricted to private networks. How things have changed.
A recent Quocirca report, Digital identities and the open business, shows that the majority of European organisations now open up their applications to external users – business customers, consumers or both. This is done entirely for positive business reasons, the top drivers being direct transactions with customers, improved customer experience, smoother supply chains and revenue growth.
However, this requires a level of IAM to be put in place that enables the quick capture and on-going authentication of identities. One of the challenges this throws up is the need for federated identity management.
Organisations that only need to worry about their own employees can put in place a single directory for centralised storage and rely solely on this to underpin IAM requirements. Microsoft Active Directory is by far the most common “internal directory”. However, when it comes to users from external organisations, a whole range of other identity sources comes into play.
For users from business customers and partner organisations, it will often be the target organisation’s own directory (so may be another instance of Active Directory). However, identities may also be sourced from the membership lists of professional bodies (e.g. legal and accounting associations), government databases and social media sites.
When it comes to dealing with consumers, social media tops the list as a source of identity. Many of us will already be familiar with the option of using our Facebook identities to log in to sites like Spotify or JustGiving. Wherever an identity is sourced from, it is clear that for external users there is a growing concept of BYOID (bring-your-own-identity).
Some may frown at this and wonder how secure it can all be. The answer to that is down to the IAM system in place. This is where the different sources of identity are federated and policies about who can access what are enforced.
Banks would clearly be taking a great risk by allowing a user to move large sums of cash around based on a Google identity, but it may be good enough to answer an enquiry about opening a new account and capturing some basic details to kick the relationship off. If things go further the expense of creating a more secure identity and means of authentication can go ahead and the details updated in the IAM system.
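The bank example amounts to a risk-tiered, step-up policy: each identity source carries an assurance level, each action a required level, and a shortfall triggers stronger authentication rather than outright denial. The sketch below makes that concrete; the levels and action names are assumptions for illustration, not from any IAM product or from Quocirca’s report.

```python
# Hypothetical step-up authentication policy. Assurance tiers and
# action thresholds are invented for this illustration.

ASSURANCE = {"social": 1, "corporate_directory": 2, "bank_issued": 3}
REQUIRED = {"enquire": 1, "view_balance": 2, "transfer_funds": 3}

def authorise(identity_source: str, action: str) -> str:
    have = ASSURANCE[identity_source]
    need = REQUIRED[action]
    if have >= need:
        return "allow"
    return "step-up"  # ask the user to establish a stronger identity first

print(authorise("social", "enquire"))         # → allow: a social login is enough
print(authorise("social", "transfer_funds"))  # → step-up: stronger identity needed
```

The “step-up” outcome is the key business point: a low-assurance BYOID identity is enough to start the relationship, and the cost of issuing stronger credentials is only incurred once the relationship justifies it.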
Quocirca’s report shows that when IT and IT security managers think about IAM, they still think primarily in terms of achieving certain security goals. However, its use for achieving business goals is creeping up the list of priorities. Furthermore, in the past IAM may have been seen as affordable only by large enterprises. It is now widely available as an on-demand service (IAM as a service/IAMaaS) and open to businesses of all sizes.
The majority of respondents to Quocirca’s survey report that their business managers are taking an interest in IAM. This is not for security reasons but for its power as a business enabler. Now that’s not too scary – is it?
Quocirca’s report Digital identities and the open business is freely available to download here: https://www.ca.com/us/register/forms/collateral/quocirca-european-research-digital-identities-and-the-open-business.aspx