Quocirca Insights


March 14, 2017  12:17 PM

Real-world views on the role of IoT in the farm-to-fork food chain

Clive Longbottom Profile: Clive Longbottom

Late in 2016, Quocirca carried out primary research for Rentokil Initial, looking at perceptions about the current and future impact the internet of things (IoT) will have on organisations.  The respondents were from large companies in the farming, logistics/warehousing, food processing and retail industries in Australia, China, the UK and the USA.  None of the respondents was in a technical position – they were all chosen because they had responsibility for food hygiene within their organisation.

Herein lay a wake-up call for all those technology companies that believe the IoT is fully understood within their target organisations – for example, when it comes to the likely number of devices involved.  In research carried out by Quocirca for ForeScout earlier in 2016, where the respondent profile was senior IT decision makers in German-speaking countries and the UK, the average number of IoT devices expected to be in use within an organisation within 12 months was 7,000.

Figure 1: What quantity of IoT devices do you expect to deploy in the coming 24 months? (From Rentokil Initial research)

Compare this to the Rentokil Initial research, where only 10 respondents out of the 400 expected to have more than 1,000 devices – and nearly half expected “very few” (fewer than 10) (see Figure 1).

Why the discrepancy? A more granular drill-down into the data hints at the reasons.  Within the farm-to-fork food chain, the logistics function is already a big user of IoT devices.  Chilled transport uses temperature detectors and cab-based GPS, generally linked to central control systems; some advanced logistics companies are using multi-function systems that not only monitor temperatures, but also track things like when and where the lorry or container’s doors were opened; the G-forces on the food packed in transit; CO2, nitrogen and other gas levels; and so on.

It would have been expected that, amongst the 100 logistics and warehousing companies interviewed, more would have had such capabilities – and therefore that the number of IoT devices already in use would exceed 1,000.

Food processing lines also tend to be full of IoT devices – for example, devices that monitor the quality of the food and the temperature of blanching or cleansing water, or that look for problems along the line.

Rentokil Initial carried out roundtables with some of its customers to drill further into perceptions around the IoT.  One respondent stated that they had never even heard of the term IoT.  Others stated that they had specific needs – but did not see things such as the monitoring of how employees dealt with personal hygiene as an IoT issue.

It becomes apparent that whereas technical staff see all of these as areas where the IoT is of use, less technical staff see them as general tools of the trade – something that is part and parcel of what is needed, but not part of a more coherent, joined-up environment.

Figure 2: In the context of managing food safety within your environment, how important are the following pieces of information?

“Having enough data to rapidly and effectively deal with an infestation/hygiene incident” was the number one concern.  However, other parts of the research showed that respondents were not thinking about the corresponding need for a more standardised IoT platform where such data could be pulled together.

It is apparent that there is a deep chasm between those in the technology space who are building up a knowledge of the IoT and those in the line of business who are actually trying to deal with the day-to-day problems.

Vendors with an interest in the IoT approaching IT departments may well find that they are shouting into an echo chamber – the people they are targeting will agree with what they say, but will not be able to raise the funds necessary to fund meaningful IoT projects.

Instead, these vendors must construct solid business messages around why the IoT matters to the business; they must have solid use case scenarios that use the right language to empathise with the line of business’ needs.

Otherwise, IoT projects will be carried out in silos of usage, leading to the age-old IT problem of islands of data that cannot easily be pulled together and analysed.  This minimises the value of the IoT and fails to deliver the distinct benefits the business needs.

March 13, 2017  12:20 PM

Collaboration innovation – re-thinking the workplace

Rob Bamforth Profile: Rob Bamforth

Most organisations are looking for ways to foster collaboration and grow team productivity. How this is achieved is less obvious. For a while it has been assumed that if you throw sufficient communications media (ideally unified into a single tool) at people then they will spontaneously collaborate. This is rarely the case. What happens instead is either over-communication and information overload if there is a sharing culture, or siloed, secretive business as usual if there is not.


More radical approaches employ smart use of facilities or create collaboration spaces within the working environment. These might simply be comfortable seating in a relaxed and accessible part of the workplace for a few people to ‘huddle’ (such as this novel idea from Nook), or some forced Californian cool of beanbags, table football, bright décor and a limited-edition coffee served by an on-site barista.

Walking or standing

While a comfortable working environment plays a part, there is something about the posture of participants that affects how they collaborate too. Are they walking, standing or sitting?

For those walking, the chances of meaningful collaboration are low. Already multi-tasking, their communication tends to be focused: issuing commands, some information sharing – but complex interaction between multiple participants? Unlikely. All useful, responsive and timely, but it is not collaboration – it tends to be more command and control.

Standing keeps people (literally) on their toes, and has been suggested as a way of holding shorter meetings. Attendees are less able to relax, so more likely to participate and reach decisions quickly. But does it lead to more or better collaboration?

One area where meeting-space technology has advanced and become more widely available does support the notion of collaboration while standing around. The success of tablets has led to wider availability of touch-screen displays. What started as recording and copying whiteboards has evolved into large touch-enabled interactive screens. These are often smart and connected to the network, enabling remote as well as local interaction and access.

Is this the solution to collaboration?

It depends.

While this will work well for sharing information – presentations, classrooms – it is not necessarily collaboration. One person presents or shares at a time. They might have their back to their audience while they interact with the screen. It works very well in a one to many scenario, and of course presenters can take turns. But this is not really multiple people working together and at the same time in free-flowing collaboration. Ideas may occur to individuals, but by the time they get their ‘turn’ the momentum has been lost or the discussion has moved elsewhere.

Sitting around

Most meetings involve attendees sitting. Keeping people engaged, especially when their email and favourite social media site is only a glance away, is a challenge. Sit them in remote places with only an audio connection on a conference call and the temptation to be distracted in boring moments might be too great. Being there in person or holding shorter meetings might be better, but that is not always possible. Adding video to the connection might help, but in a group setting, with everyone in the room looking at a distant screen, the situation is similar to a standing presenter.

Two recent product developments put their own distinct twists on how to do it differently and improve interaction.

One is Polycom’s portable video unit, the RealPresence Centro. Four screens with integral cameras and microphones make this connected Dalek the centre of attention in a meeting. Those involved sit, or stand, and talk to each other facing the device, which can be connected to remote participants on any other video device. It might seem quirky, but rapidly feels natural and engaging for everyone, who can participate locally and remotely, facing everyone else across the unit or across the network. With the concept of ‘huddle’ spaces proving popular, the RealPresence Centro might have found an interesting niche.

The other is more unusual, but familiar to anyone who has seen the film Minority Report. Oblong’s Mezzanine employs a series of large screens and a wand pointing device (not yet holograms and hand gestures, Tom Cruise fans) to share and interact. Participants are surrounded by, and therefore immersed in, information presented on the screens. These are replicated remotely for those beyond the room.

Content on screen can be interacted with, inserted, moved, parked and, crucially, visually presented using a third dimension of depth or distance away. Moving it closer makes it larger, moving in front of other content in a satisfying application of perspective. Everything is coordinated via the wand, but participants can bring and use their own devices to share and integrate into the experience – locally or remotely. Inserting new content, comments and flags is simple and seamless.

Mezzanine definitely has a different feel compared to other systems, and does need the room to be suitably equipped. The approach allows for much more free flowing interaction, avoiding stalling or interrupting thought patterns.

Getting everyone engaged

Technology companies and products have made it much easier to communicate. But this does not always make the process collaborative, engaging or ultimately effective at reaching a desired conclusion. All too often, new communications media are dominated by those who ‘shout loudest’ or are restricted to those in senior or special positions. Thinking differently about the process and environment from a human perspective might provide the impetus to make collaboration something that everyone wants to, and can, participate in, with their contributions recognised and valued.


March 8, 2017  1:50 PM

From ECM to EIM: the need for control

Clive Longbottom Profile: Clive Longbottom

Enterprise content management (ECM) has long been necessary for organisations in highly regulated industries.  From SoftSolutions through to Documentum and OpenText, companies have implemented systems that control the flow and access of information within their business.

However, the problem is that these products tend to be used purely to manage a small subset of an organisation’s content.  Most organisations wait until an information asset has gone through some of the early stages of its lifecycle before it is entered into the system.  This may be for reasons of cost (per-seat licensing for ECM systems tends to be high) or process – for example, where an ECM system has been put in place to manage a single set of processes, such as those required by the Food and Drug Administration (FDA) in the pharmaceutical industry or the Civil Aviation Authority (CAA) in aviation.

Whatever the reason, putting only a subset of information into a system is dangerous.  When an individual carries out a search across the system, they will (unsurprisingly) only get returned what is in that system. If they are then going to make a decision on what is returned, they could be missing out on pertinent information that is still outside the system: documents that are still in the early stages of their lifecycle.

These early-stage documents will be the ones that contain information that is most up-to-date, as they are the ones that are still being worked on.  These could therefore carry the information that can make or break the quality of the decision. Increasingly, such documents may not even be stored within the direct control of the organisation – they may be held in the cloud using services such as Dropbox or Box; they may be elsewhere in the chain of suppliers and customers the organisation is dealing with.  As these assets are not in the ECM system, they are less controlled – access rights are not managed; information flows are not monitored and controlled.

Rather than converging on a system that fully manages information, organisations seem to be struggling to control the divergence of information types and locations – and this can be damaging.

A rethink of ECM that moves through to an enterprise information management (EIM) system is required.  EIM is an approach where information is captured as close to the point of creation as possible and managed all the way through its complete lifecycle to secure archiving or disposal.

Based around an underpinning of metadata, large amounts of information can be controlled.  Rather than pulling all the documents themselves together into a massive binary large object (BLOb) database, the files can be left where they are and only the metadata needs to be managed.  Such a metadata system will be a fraction of the size of the overall information, and it can be mirrored and replicated across the overall technology platform, providing high availability for searching and retrieving single items from the information asset base.
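
To illustrate the idea – a minimal sketch, assuming a simple in-memory catalogue, with field and function names that are purely illustrative rather than drawn from any particular EIM product – the metadata records point at documents wherever they live, so search and governance can work without moving the files themselves:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class DocumentMetadata:
    doc_id: str                # stable identifier for the information asset
    location: str              # where the file actually lives (file share, Dropbox, SharePoint...)
    owner: str                 # responsible party for governance purposes
    lifecycle_stage: str       # e.g. "draft", "in review", "published", "archived"
    version: int = 1
    access_groups: list = field(default_factory=list)   # who may see or edit it
    last_modified: datetime = field(default_factory=datetime.utcnow)
    tags: list = field(default_factory=list)

# A catalogue of metadata is tiny compared with the content it describes,
# so it can be replicated for high availability and searched quickly.
catalogue: dict[str, DocumentMetadata] = {}

def register(meta: DocumentMetadata) -> None:
    """Capture a document into the catalogue as close to creation as possible."""
    catalogue[meta.doc_id] = meta

def search(tag: str, user_groups: set) -> list:
    """Return only the documents the caller is entitled to see, wherever they are stored."""
    return [m for m in catalogue.values()
            if tag in m.tags and user_groups & set(m.access_groups)]
```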

Through these means, all information sources can be included, so enhancing an organisation’s governance, risk and compliance capabilities.  It makes decision-making more complete and accurate, enabling an organisation to be more competitive.  It also provides better capabilities for collaboration around content, as single sources of original information combined with versioning and change management can be managed through the metadata.

In the first of a series of short reports on the subject, Quocirca looks in more depth at how an organisation needs to reassess its needs around information management in the light of increasingly diverse information assets and growing GRC constraints.  The report is available for download here.


March 6, 2017  11:10 AM

Colocation HPC? Why not.

Clive Longbottom Profile: Clive Longbottom

High performance computing (HPC) used to only be within the reach of those with extremely deep pockets.  The need for proprietary architectures and dedicated resources meant that everything from the ground up needed to be specially built.

This included the facility the HPC platform ran in – the need for specialised cooling and massive power densities meant that general purpose datacentres were not up to the job. Even where the costs of the HPC platform were just within reach, the extra costs of building specialised facilities counted against HPC being something for anyone who needed that extra bit of ‘oomph’ from their technology platform.

Latterly, however, HPC has moved from highly specialised hardware to more of a commoditised approach.  Sure, the platform is not just a basic collection of servers, storage and network equipment, but the underlying components are no longer highly specific to the job.

This more standardised HPC platform, built on commodity CPUs, storage and network components, is within financial reach. But this still leaves the small issue of how an organisation can countenance building a dedicated facility for a platform that may be out of date in just a couple of years.

For organisations with a more generic IT platform, colocation has become a major option.  Offloading the building and its maintenance has obvious merit, especially for an organisation that is struggling to understand whether its own facility will grow or shrink in the future as equipment densities improve and more workloads move to cloud platforms.

However, the use of colocation for HPC is not so easy.  The power, emergency power and cooling requirements of HPC will be beyond all but certain specialist colocation providers.

Power

Hyper-dense HPC equipment needs high power densities – far more than your average colocation facility provides. For example, the power for a ‘standard’ platform rarely exceeds 8kW per rack – indeed, the average in colocation facilities is more like 5kW per rack.

Now consider a dense HPC platform with energy needs of, say, 12kW per rack. Can the colocation facility provide that extra power?  Will it charge a premium price for routing more power to your system – even before you start using it?  Will the multi-cabled power aggregation systems required provide power redundancy, or just more weak links in an important chain?

Also consider the future for HPC.  What happens as density increases further?  How about 20kW per rack? 30kW? 40kW?  Can the colocation facility provider give guarantees that not only will it be able to route enough power to your equipment – but also that it has access to enough grid power to meet requirements?
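
As a rough illustration of the arithmetic involved – a back-of-the-envelope sketch using made-up figures rather than data from the report – the jump from a typical colocation rack to a dense HPC rack quickly multiplies the grid, UPS and generator capacity a facility must secure:

```python
def facility_power_kw(racks: int, kw_per_rack: float, pue: float = 1.5,
                      redundancy_factor: float = 1.25) -> dict:
    """Estimate the power a colocation facility must provision for a deployment.

    pue: power usage effectiveness - total facility power / IT power (cooling, losses etc.)
    redundancy_factor: headroom for UPS/generator failover and growth (illustrative value)
    """
    it_load = racks * kw_per_rack
    facility_load = it_load * pue
    provisioned = facility_load * redundancy_factor
    return {"it_load_kw": it_load,
            "facility_load_kw": facility_load,
            "provisioned_kw": provisioned}

# 20 racks at a 'standard' 5kW versus the same footprint at a dense 20kW:
print(facility_power_kw(20, 5))    # ~100kW IT load, ~190kW to provision
print(facility_power_kw(20, 20))   # ~400kW IT load, ~750kW to provision
```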

Emergency power

What happens if there is a problem with grid power?  With a general colocation facility, there will be some form of immediate failover power supply (generally battery, but sometimes a spinning wheel or possibly – but very rarely – supercapacitors), which is then replaced by auxiliary power from diesel generators.  However, such immediate power provision is expensive, particularly when there is a continuous high draw, as is required by HPC.  Make sure that the provider not only has an uninterruptible power supply (UPS) and auxiliary power system in place, but that it is also big enough to provide power to all workloads running in the facility at the same time, along with overhead and enough redundancy to deal with any failure within the emergency power supply system itself.  Also, make sure that it is not ‘just a bunch of batteries’: look for in-line power systems that smooth out any issues with the mains power, such as spikes, brown-outs and so on.

Cooling

Remember that a lot of power also gets turned into heat.  Hyper-dense HPC platforms, even where they are using solid state drives instead of spinning disks, will still produce a lot of heat.  The facility must be able to remove that heat effectively.

An old-style approach of volume cooling – where the air filling the facility is kept at a low temperature, sweeps through the equipment to remove heat, and is then extracted outside the facility – is unlikely to be good enough for HPC.  Even hot and cold aisles may struggle if the cooling is not engineered well enough.

A colocation facility provider that supports HPC will understand this and will have highly targeted means of applying cooling to equipment where it is most needed.

HPC is moving to a price point where many more organisations can now consider it for their big data, IoT, analysis and other workloads.  There are colocation providers that specialise in facilities able to support the highly specialised needs of an ultra-dense HPC platform.  It makes sense to seek these providers out.

Quocirca has written a report on the subject, commissioned by NGD and Schneider.  The report is available for free download here: http://www.nextgenerationdata.co.uk/white-papers/new-report-increasing-needs-hpc-colocation-facilities/

 

 


March 6, 2017  9:01 AM

Growth of mobile-owning individuals drives need for innovation

Clive Longbottom Profile: Clive Longbottom

According to the GSMA, there are nearly 5 billion active individual mobile phone contracts on the planet at the moment.  Sure – many of these will still be for individuals who have more than one device, but it is still felt that by 2020, around 75% of the world’s population will have some form of mobile device.

With global and local handset manufacturers moving from the provision of low-end, voice-only handsets for emerging markets to making cheap smartphones available, this can lead to a whole new approach in how such markets operate on a social and economic basis.

As mobile connectivity increases in these countries and the use of 4G and 5G overtakes the old 2G and 3G connections originally put in place for the major conurbations, relatively high-speed, universal wireless connectivity becomes the norm.  The smartphone can become a personal hotspot, allowing other devices to connect to the wider world as needed.  But what sort of things could this bring in?

Firstly, consider health.  Low-cost wearable sensors could be provided to monitor such things as blood pressure, blood sugar levels and so on.  For patients who have been seen by a travelling doctor and diagnosed with, say, a fever, cheap disposable digital thermometers can measure and send back data via the mobile device at regular intervals, so that the doctor can respond on a more ‘as needed’ basis.

The same goes for pregnancy – rather than hoping that nothing untoward will happen between visits when the doctor/midwife just happens to be in the area, wearables can send back data as needed so that the health of the mother can be monitored centrally on a regular basis.  Many issues can then be dealt with directly over the mobile device, via voice or video call; others through the sending of links to the phone; others still by scheduling a visit from a lower-skilled local healthcare professional.  Only where a real emergency is obvious does the doctor have to go to the patient directly.

Now consider the economic basis.

As these smartphones all have browser capabilities, individuals can now cooperate and trade with each other far more easily.  A farmer in one area of the country can use cloud-based systems to find customers in other areas – or can input details of crop availability to food processing companies that may wish to buy the crops.  Issues such as a locust infestation or an impending drought can be quickly logged so that plague tracking can be initiated and dealt with far more effectively.  The farmer can also keep a closer eye on what is happening across their farm through the use of internet of things (IoT) devices connected to the mobile device.

Small, local farmers can let villagers know when they will be in the area with specific crops, and what price they would like for them.  They can then take orders and adjust prices as necessary to ensure that the entire crop is sold at a good margin in the minimum number of journeys required.

Farmers can also form cooperatives.  They can come together to provide a more complete offer – one lorry can pick up supplies from multiple farms and deliver packages of, say, maize, milk, meat, vegetables and fruit to markets, or even directly to customers.  Smartphones can provide mapping and geo-analytical systems to ensure that the lorries take the optimum route, minimising the costs of fuel and stress on the vehicle itself.

Coming together as a cooperative also gives the farmers greater collective bargaining power when dealing with downstream food processing and wholesale companies. Offers of crops can be sent to multiple prospective customers at the same time, getting them to compete with each other for delivery of the crops.

Individuals can create their own businesses.  Goods that sell well to richer foreigners, such as ethnic art and jewellery, can be advertised directly via the web, using the mobile device as a means of inputting the goods into cloud-based retail systems.  On the sale of an item, the monies paid by the customer can be cleared via, for example, PayPal into an easily accessible account; the individual can arrange for the items to be picked up by a courier or sent for first-stage delivery to a more central place via train, boat or plane as required.

For the countries involved, the rise in personal mobile device ownership must be seen as a major chance for individual, local and central innovation.  However, contract prices need to be managed to ensure that the cost equation to the individual is obvious.

Governments may need to provide community systems, where a few mobile devices are made available to a community on the understanding that the devices will be shared among individuals on an as-needed basis.  However, this is a minor issue, as the figures show such major growth in device ownership.  Where real help will be required is in creating and providing low-cost access to the cloud-based services involved. It may be that data contracts are subsidised under a country’s health budgets, as the returns in this area can be so significant.  Healthcare-based cloud services can also be funded the same way – or via foreign aid or non-governmental organisation (NGO) funding projects.  If the device and data contracts are so covered, the individual and their community can then work on building the additional services themselves.

In the early stages, governments may find that providing grants or prizes for individuals and groups that create innovative cloud-based services – helping a specific group of people or dealing with a specific general need – will drive innovation in how mobile devices can be used.

A mobile device-first approach to social and economic success will be different to that which has already occurred in more mature markets.  It is far more of an opportunity, as there is little existing technology that must be accommodated.  Such an environment gives massive opportunities to those involved.


March 1, 2017  10:46 AM

Collaboration – where AV and IT meet?

Rob Bamforth Profile: Rob Bamforth

There were plenty of amazing products launched and on display at ISE2017 in Amsterdam in early February. But in the background buzz there was a common theme of an industry in transition. While many talked about convergence between AV and IT, some fear the risk that it will actually be more of a ‘collision’. This will have a consequential impact on jobs and revenues.


None of this restrained the exuberance of showcasing the best of the audio visual (AV) sector. The event brought in a record number of over seventy-three thousand attendees. In many quarters, there was also a more upbeat assessment of the new opportunities that might be created as the AV and IT sectors move closer together. There was also an acknowledgement that this would require some work.

Now the dust has settled and the exhibition paraphernalia is dismantled for another year, it is possible to take a pragmatic view of where the opportunities may lie.

The AV industry is undoubtedly undergoing change, but the IT sector is by no means static or settled. There has been a significant and ongoing shift towards the utility or ‘as-a-service’ model, which some find unsettling for both job security as well as data security. There has also been the liberation of IT into the hands of consumers. Mobile, wearables and the internet of things (IoT) have seen IT shift from the easily managed desktop into a voracious hydra of access options. Great for users and customers, but adding to the already challenging IT operational burden.

Is now a good time then for IT to work more closely with AV?

Historically, the focus of AV could be characterised as the experience within the room and an increasingly spectacular ability to convey information. For many, that meant presentations, and over the years the technology this encompasses has grown in capability and usability. It has also become more connected.

This is where the overlap with IT, with its focus ‘beyond’ the room and across the network, becomes more apparent.

AV is all about the user experience and supporting media-rich communication. With recent advances in large touch-screen and interactive display systems – mirroring the advances in mobile IT with tablets and smartphones – this user experience has expanded into the important, but often elusive, area of collaboration.

This is high on the agenda for IT. The word ‘collaboration’ has been added onto the end of the term Unified Communications, and peppered liberally across many PowerPoint presentations. Making it a reality that delivers its anticipated value has proved difficult.

Making collaboration a reality

IT is very used to tackling the challenges of integration, security and resilience. It has also been unifying the communications plumbing with the help of major IT vendors. But turning this into seamless simple experiences that people delight in using every day is rarely a core competency. Here is where a closer relationship with AV would be beneficial to both sectors – collaboration rather than collision.

Tools for enhancing communications – by unifying or incorporating different strands of media such as video – are only one of the areas where the AV world is moving away from point products toward solutions and building broader relationships in open ecosystems of partners. The industry is now showcasing integrated systems aimed at specific business problems: not just collaboration, but also omni-channel commerce solutions for retail, tools for education and smart buildings, as well as the more obvious sectors focused around entertainment.

This was evident at ISE2017, not only in the way that halls were oriented around these business topics as themes, but also in that the discussions and presentations on stands and in the conference had moved on from form and features to addressing business needs and challenges. With this positive attitude, the AV industry does not need to fear convergence with IT, but can embrace it as something that will be good for both sectors.


February 28, 2017  12:52 PM

Seclore – DRM 2.0 revisited

Bob Tarzey Profile: Bob Tarzey

In October 2016 Quocirca reported on a new breed of digital rights management (DRM) tools which have emerged in recent years. These tools have security built in to their core and are designed to support the growing use of cloud stores and mobile computing (DRM 2.0). The post looked in detail at three vendors: Vera, FinalCode and Fasoo. Some others were mentioned in passing, including Seclore, a California-based vendor with origins in India and some major European customers.

Perhaps the most striking thing about Seclore is its claimed DRM market share for its Rights Management product, which it says is second only to Microsoft’s (the latter embeds DRM in certain of its other offerings). Seclore says its own directly managed customer base of 470 enterprise customers accounts for 4.5 million end users. However, via OEM partners it claims another ten thousand customers with 8-9 million users.

In many cases, partners are using Seclore Rights Management to extend the scope of existing content management or productivity products to ensure protection continues beyond the scope of the base product. For example, Citrix ShareFile enables security to ‘follow the file’ through integrating Seclore; this was required to extend DRM to cloud and mobile use. Seclore has been integrated with IBM’s FileNet content management for the same reasons.

It is not just content management systems. Data loss prevention (DLP) systems were originally designed to deal with content moving around within an organisation’s network and to police what left it. This has become too limited an approach with the growing use of cloud stores, and Seclore claims both Symantec’s and McAfee’s DLP products are being extended to enable the external use of DLP using its product.

As well as being designed to address the need for external sharing, Seclore ensures it remains independent of device types and document formats to support as wide a range of use cases as possible.

Another intriguing initiative is that, wherever possible, it aims to inherit rights and policies from the original systems – for example SAP and Microsoft SharePoint – rather than having to re-write them. However, policy can be modified, and Seclore Rights Management also enables policy to change as documents progress through a workflow, for example as financial results move from confidential to public domain. These capabilities are key to making Seclore’s OEM partnerships work.
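
A generic illustration of the principle – emphatically not Seclore’s actual API or data model, just a sketch of how inherited rights might be relaxed as a document moves through a workflow, such as financial results going from confidential to public:

```python
# Purely illustrative sketch; class and method names are invented,
# not taken from Seclore or any OEM partner's SDK.
WORKFLOW_POLICIES = {
    "draft":        {"view": {"finance-team"},          "edit": {"finance-team"}},
    "board-review": {"view": {"finance-team", "board"}, "edit": set()},
    "published":    {"view": {"everyone"},              "edit": set()},
}

class ProtectedDocument:
    def __init__(self, name: str, inherited_policy: dict):
        # Rights start out inherited from the source system (e.g. SharePoint or SAP)
        # rather than being re-written by hand.
        self.name = name
        self.policy = dict(inherited_policy)
        self.stage = "draft"

    def advance(self, stage: str) -> None:
        """Move the document along the workflow, updating its rights as it goes."""
        self.stage = stage
        self.policy = WORKFLOW_POLICIES[stage]

    def can_view(self, group: str) -> bool:
        allowed = self.policy["view"]
        return "everyone" in allowed or group in allowed

results = ProtectedDocument("Q4-results.xlsx", WORKFLOW_POLICIES["draft"])
results.advance("published")
print(results.can_view("press"))   # True once the results reach the public stage
```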

If, like many, your organisation has reached the point where the management of rights needs to be extended to cloud stores and mobile users, then Seclore should be added to the list of products for consideration. Better still, it may be possible to simply upgrade some of your existing technology if there is an existing Seclore integration that allows you to do so.


February 26, 2017  5:22 PM

Car transport? There’s an app for that

Bernt Ostergaard Profile: Bernt Ostergaard

The impact of self-driving technology, whether it be Uber-style driverless ride sharing vehicles, automated long-haul lorry driving or drone transport, will be felt across all transport sectors.

The next 20 years will see a steady uptake of driving automation. It will increase real-time communications in order to minimise travel time and cost. Legislators must grapple with standardisation, liability and security issues, while the industry is adding more and more driver-assistance services under the hood without significantly increasing the price point. But what about connectivity requirements, and will driverless cars actually reduce travel time?

Automation in the works

High-end cars today are more than semi-autonomous. Many hundreds of metres of wiring connect sensors to computers that directly interface with the engine and steering systems. The development of car automation technology is a multi-billion-dollar race – a mix of competition and co-operation between the IT and automotive industries.


Major players on the IT side include Google with its Waymo driverless car technology, and Amazon, Microsoft and Apple with their navigation technologies. On the car manufacturing side, GM has acquired Cruise to create a range of driverless cars, and Mercedes is developing its Car-to-X technology that lets the car exchange information with the surrounding infrastructure, like traffic lights, and other connected vehicles. Ford is partnering with Amazon to provide its driverless cars with Alexa, Amazon’s smart voice assistant technology, allowing drivers to communicate with the car’s systems by voice.

Automated traffic infrastructures

In a fully automated road traffic scenario (something the airlines are pretty close to in the sky), the speed and course of driverless vehicles are optimised by a city-wide computing system. That requires fast and secure active-to-active WAN connectivity between cars and traffic management systems. The automated – and ultimately driverless – cars will need network connection capabilities to handle in-car IoT communication between sensors and computers, as well as external wireless 4G LTE and WiFi connectivity. The cars may also need satellite connectivity in rural environments.

Advanced navigation systems already have network connectivity to check weather and traffic conditions ahead. Intelligent mapping systems like HERE supply information to control self-driving cars equipped with street-scanning sensors to measure traffic and road conditions. This location data can in turn be shared with other map users.

Ultimately, driving will be left entirely to computers – in cars without steering wheels. We will all be passengers or freight. Mobile connectivity must be maintained using dedicated roadside Wi-Fi networks as well as the existing mobile data services. The ability to switch, select and bond with constantly changing wireless base stations will be crucial for success. This is where SD-WAN routers from vendors like Peplink, which can handle multiple connections as a single virtual connection, are needed across a wide range of mobile environments.
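
A simplified sketch of the idea behind such multi-link routers – the selection logic and link attributes here are invented for illustration, not how Peplink’s products actually work – in which traffic is steered to whichever of the constantly changing wireless links is currently usable:

```python
from dataclasses import dataclass

@dataclass
class WanLink:
    name: str          # e.g. "roadside-wifi", "4g-operator-a", "satellite"
    up: bool
    latency_ms: float
    loss_pct: float

def pick_link(links: list[WanLink]) -> WanLink:
    """Choose the best currently usable link; in a real SD-WAN device this
    runs continuously so the 'single virtual connection' survives link changes."""
    usable = [l for l in links if l.up and l.loss_pct < 5.0]
    if not usable:
        raise RuntimeError("no usable WAN link - buffer or degrade gracefully")
    return min(usable, key=lambda l: l.latency_ms)

links = [WanLink("roadside-wifi", True, 8.0, 0.5),
         WanLink("4g-operator-a", True, 45.0, 1.0),
         WanLink("satellite", True, 600.0, 2.0)]
print(pick_link(links).name)   # roadside-wifi while in range, 4G otherwise
```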

With the driver gone, next to go may be the privately-owned car. The Singapore government estimates that replacing today’s 700,000 private vehicles with network-connected, driverless vehicles would reduce the Singapore car pool to 300,000. It would simultaneously reduce transport times and the need for parking spaces. It would generally lower pollution levels and improve road safety.

Reduced travel time?

The Singapore scenario, and similar assessments of driverless traffic, factor in the advantages of much higher traffic density and the reduced need for parking spaces. With central management of in-city transport, users will buy transportation services – not vehicles. What these scenarios do not factor in is traffic increases if transport becomes as easy as using your smartphone. When every child, disabled, elderly or drunk person can order driverless transport, we risk a physical traffic volume explosion. Just look at the traffic increases the smartphone caused. So maybe queuing is not going away just because we automate it.


February 23, 2017  8:30 AM

Durable and circular for a more sustainable mobile device strategy

Rob Bamforth Profile: Rob Bamforth

Consumerisation of mobile technology has had many benefits. It has driven down the prices of devices and improved the user experience to the benefit of non-technical users. Plus, awareness, and crucially acceptance, of mobile devices has soared. All of this might seem good for the business, but there is a significant drawback. Consumer attitudes to technology can lead to a throwaway culture, with the obvious impact on wastage. This is not sustainable.

Mobile technology uses precious materials which are becoming scarce, expensive to find or politically harder to gain access to. Devices also include hazardous waste materials, and lots of energy is consumed during production. No surprise that governments are increasingly introducing legislation to decrease waste, lower carbon emissions and penalise polluters.

This is being felt by organisations already, and will have further impact in future as regulations tighten. A much larger direct effect of throwaway technology is the disruptive impact on business processes. This has commercial and not just environmental consequences. Low cost consumer devices might seem simple and cheap to replace, but device failures and unexpected changes affect and interrupt the business process.

Durability

While few working environments are really ‘hazardous’, they can be unpredictable and unforgiving if devices are mishandled. So, a better approach is to make mobile device design fit for purpose and sufficiently durable. This means considering not only the device itself, but also the peripherals, accessories and software that will be used with it over its life.

More durable ‘whole life’ design should ensure devices can be maintained in the field. It could also take into account that devices should be compatible over several generations with replaceable components such as batteries and other ecosystem elements such as printers, scanners, cases etc. This would keep costs down and address some of the environmental concerns about wastage from replacing items that still work but have been made obsolete because of changes to the core device.

Whole life design would also offer better support for business continuity where workers rely on mobile devices.  If devices are not sufficiently durable there is always the risk of something breaking. Failure of any single element of the system causes downtime, aborted processes and user frustration.

Circular economics

A different economic model, which takes the whole life approach, has been suggested from the work of the Ellen MacArthur Foundation. This followed the “cradle to cradle” concepts established by William McDonough and Michael Braungart. It has at its core the term ‘circular economy’, and its approach is to replace the current largely linear approach of ‘take, make and dispose’ with one in which resources circulate at high value, avoiding or reducing the need for new resources.

There are many environmental benefits to a more circular economy and a ‘circular’ or sustainable approach to mobile devices – from reducing greenhouse gas emissions and other pollutants to relieving pressure on raw materials and energy consumption. Such circularity could also be directly beneficial to businesses and the mobile workforce:

  • Durable design. Products are built to last and survive the day to day knocks and challenges of an active working environment. Products are increasingly built for energy efficiency with simpler in-field replaceable components e.g. Batteries.
  • Sustainable Supply Chain. Products are designed for in-field upgrade and re-use at end of life. This provides opportunities for remanufacturing and refurbishment across the supply chain.
  • Recycling and Recovery. Manufacturers operate comprehensive warranties and take-back programmes at the device end of life, making it easy to return and responsibly recycle hardware.
  • Product life extension. Manufacturers are able to extend the product life through software and firmware updates. This allows for long-term compatibility with peripherals and accessories, ensuring all elements of the mobile device ecosystem remain in active use as long as possible.

The key principle is that while a more circular approach offers environmental benefits, it also provides benefits to the business: direct cost savings in the total cost of ownership of devices, and indirect savings when looking at the reduction of disruption to the mobile business process. This approach to mobile device durability is further explored, with guidelines for how to build a sustainable mobile device strategy, in “All mobile, still working, becoming sustainable”.


February 16, 2017  10:25 AM

Forget intelligent cities – how about a more intelligent planet?

Clive Longbottom Profile: Clive Longbottom

Gin Lane by William Hogarth

The year is 1750 – just before the industrial revolution. The overall population of the UK is around 6.5m. London has a population of 675,000: the second most populated city is the sea port of Bristol, with 45,000.

Roll on to 1850, when the industrial revolution has passed its peak. The UK population is now 26m. London has a population of over 2.5m. Liverpool, Manchester, Birmingham, Leeds and Sheffield have all overtaken Bristol, all with populations of over 150,000. Whereas London has remained at around 10% of the country’s population, the next 5 cities have moved from being around 2% of the population to around 6%.

The transformation of the northern cities pulled in people from the outlying areas. Moving to the cities ‘paved with gold’ was seen as the way to become rich via working in the new mills. What it really led to was the London that moved from Hogarth to Dickens, and northern cities where malnutrition, illness and the poor houses led to high rates of death in these expanding, coal-fired conurbations. The countryside suffered – there were fewer workers available to work in the fields, and this led to less fresh food being available to feed the growing population of the cities, leading to diseases such as rickets and scurvy. Even where workers remained outside of the cities, they were increasingly pulled in to working down the mines to fuel the growth of the cities.

Is an equivalent happening as technology creates the digital revolution?

The wrong focus

The focus to date has been on the intelligent city. This has a degree of sense, in that it provides constraints around a vision. Those creating the intelligent city can focus on specific boundaries, a specific population of people and specific desired outcomes. However, if the desired value of the intelligent city is forthcoming, then that city becomes more attractive as a place to live than other cities, towns and villages around it. This has been seen in cities where hyperspeed internet has been introduced, and where integrated citizen services have improved accessibility of certain services. The massive growth of cities such as Pune in India (10-year population growth 40%) and Shenzhen in China (25%) shows how people are still being sucked into high-growth centres.

London now has a population of around 8m – more than the whole UK population back in 1750. Even with massive improvements in technology, it is struggling – many organisations find that advertised internet speeds are rarely (if ever) achieved; housing costs are driving people to live outside the city and travel in; transport and utilities are struggling to cope with demand; homelessness is on the increase. London is far from being an intelligent city.

This is where the internet of things (IoT) may be able to help. Instead of focusing on an individual city, governments, organisations and communities should start to focus on citizens across the whole of the country, and even beyond. Each citizen has their own needs, whether they be a city banker or a country farmer. Prioritising one above the other leads to increasing friction and feelings of ‘us and them’ between individuals and groups. Providing a level playing field leads to a more cohesive community, which then leads to greater success as a country.

Better internet

As a starter, providing good levels of internet access to villages means that more people can work from these areas, moving away from the large second-home model that is prevalent in the UK. Many of these homes are empty for large parts of the year, meaning that few fully local businesses can survive. Enabling people to spend more time in the villages can revitalise such local businesses – the butchers, bakers and greengrocers, for example. However, the government’s target of a minimum of 10Mb/s as a universal service obligation (USO) by 2020 is not going to help much here.

The UK is already way down the global internet speed rankings. With some countries already working against USOs of 1Gb/s and some cities, such as Chattanooga in the US already stating a USO of 10Gb/s by 2030, 10Mb/s is looking a little like wet string and baked bean cans.

As consumers move increasingly to a digital economy, the need for faster broadband speeds is pressing. Sure – basic browsing, buying and information seeking can be done on 10Mb/s, but telephony, music, video conferencing and HD TV streaming will be constrained by such speeds. 5G could help here by providing high-speed connectivity without dependence on ageing copper and aluminium infrastructure.

The broader use of the internet of things (IoT) also needs good connectivity. Farmers can use IoT devices on their farms to optimise the value chain from the farm to the fork – but the data being gathered by thousands of devices on the farm needs to be dealt with adequately. Some of that can be managed through intelligent filtering at the farm itself, but true high-speed internet will help enormously in the capability to aggregate, analyse and report on the data.
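
As a small, hypothetical example of the kind of intelligent filtering that could be done at the farm itself – the threshold values and field names are invented for illustration – only readings that matter need to travel over the limited uplink for central aggregation and analysis:

```python
# Hypothetical soil-moisture readings from field sensors; in practice these
# would arrive continuously from thousands of devices across the farm.
readings = [
    {"sensor": "field-3/probe-12", "soil_moisture_pct": 34.0},
    {"sensor": "field-3/probe-13", "soil_moisture_pct": 12.5},  # unusually dry
    {"sensor": "field-7/probe-02", "soil_moisture_pct": 31.8},
]

LOW, HIGH = 15.0, 45.0   # illustrative thresholds for this crop

def needs_uplink(reading: dict) -> bool:
    """Forward only the readings that fall outside the expected band."""
    value = reading["soil_moisture_pct"]
    return value < LOW or value > HIGH

to_send = [r for r in readings if needs_uplink(r)]
print(to_send)   # only the unusually dry probe is sent over the farm's internet link
```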

Public transport can also benefit from such connectivity – citizens can ensure that full itineraries are created and managed in real time, linking buses, taxis, trains and so on, lowering the need to own cars. Indeed, as autonomous vehicles come through, solid connectivity to clouds where the vehicles can exchange and act on data becomes a necessity.

The right skills at the right time

Identifying skills requirements in real time can also be better enabled. Need someone to come to you and fix your printer? Maybe there is a mobile engineer not too far away at the moment – GPS tracking and mobile work ticketing can mean the expert is there in a matter of minutes or hours. Likewise, need someone to fix your machinery on the farm? Don’t wait until tomorrow – either have the IoT pre-identify the problem before it becomes a major issue and call in an engineer to swap things out, or get the nearest engineer on site as soon as possible – without needing to pick up the phone.

The whole of the UK can benefit from an IoT based around better connectivity – moving from these overly focused intelligent city projects to an intelligent, and far more productive, country model. If countries would then start to use connectivity in a positive manner to break down the bureaucratic and nationalistic walls between nations, then we may – just may – be able to move toward the intelligent planet.


