IoT Agenda


October 11, 2017  10:48 AM

5G alone won’t power IoT

Greg Najjar Profile: Greg Najjar
5G, Connectivity, Internet of Things, iot, IOT Network, Redundancy, RF, spectrum, Wireless

While the internet of things has the potential to be transformative across industries, businesses and enterprises share grave concerns about connectivity. In fact, a new Inmarsat report surveying 500 senior management respondents from major IT organizations found that 54% cited connectivity as the biggest obstacle to IoT deployment. That’s today, before Gartner’s famous prediction of 20 billion connected devices by 2020 becomes a reality. Ericsson and Cisco go further still, each forecasting 50 billion devices in the same timeframe.

It’s also before the nationwide rollout of 5G, which has been misidentified as some sort of connectivity panacea for devices of all types. Sure, 5G will absolutely provide faster data transfer for select devices. It will not, however, be a single source of wireless connectivity for every connected tablet, sensor, car or refrigerator.

There are three key reasons that 5G will not be the sole source of IoT connectivity:

1. Business and connectivity models: Today, companies typically pay a monthly subscription fee for each of the devices that they connect. It seems unlikely that an enterprise can continue this approach with hundreds of thousands or even millions of new devices coming online. Consider a logistics company that places a sensor on each of the million packages it ships every day. That’s potentially millions of new monthly subscriptions paid out to a mobile carrier.
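To put that in numbers, here is a rough back-of-the-envelope sketch; the per-device rate and the 30-day billing window are hypothetical figures, purely for illustration:

```python
# Back-of-the-envelope: per-device cellular subscriptions at logistics scale.
# The $2/month rate is an assumed figure for illustration only.
packages_per_day = 1_000_000
monthly_fee_usd = 2.00                   # hypothetical carrier rate per sensor

# If each sensor stays active (and billed) through a 30-day transit window:
devices_online = packages_per_day * 30   # rolling population of live sensors
monthly_bill = devices_online * monthly_fee_usd

print(f"Sensors online at any time: {devices_online:,}")
print(f"Monthly connectivity bill: ${monthly_bill:,.0f}")
# -> 30,000,000 sensors and a $60,000,000 monthly bill
```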

2. Uses of various spectrum: In tandem with the idea of today’s connectivity models, it’s important to consider how connectivity is delivered. The national approach to wireless spectrum grants various companies the ability to use certain swaths of bandwidth. In some cases, other swaths are reserved for specific uses (for example, public safety) and still more bands are set aside as white spaces. This approach aims to ensure that everything that needs to be connected today can be connected; however, scalability could become an issue as the need for more connected devices increases.

For instance, a connected oil pipeline may only need a low-power sensor that identifies whether oil is flowing correctly. The amount of data being transmitted may be very small in this case, whereas its origin may be a remote area with very little connectivity. The oil company would likely want that signal transmitted over a very low frequency band, because lower frequencies propagate farther.
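The physics behind that preference can be sketched with the standard free-space path loss formula, where loss grows with the square of frequency; the 10 km link below is an illustrative assumption:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Illustrative: a remote pipeline sensor 10 km from the nearest tower.
for f_mhz in (600, 2500):
    print(f"{f_mhz} MHz: {fspl_db(10, f_mhz):.1f} dB free-space loss")
# 600 MHz:  108.0 dB
# 2500 MHz: 120.4 dB -- about 12.4 dB more loss, one reason low bands
# suit remote, low-data-rate telemetry.
```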

At this time, carriers are testing different spectrum options for 5G applications, from lower bands such as 600 MHz to higher bands such as 2.5 GHz, 3.5 GHz and 5 GHz, across both licensed and unlicensed spectrum. Finally, through various acquisitions, some wireless carriers are also exploring even higher spectrum in the 24 GHz, 28 GHz and 39 GHz bands for 5G. With all of the spectrum options being tested, the likelihood of “one size fits all” seems remote, and it will take a variety of technologies and devices for businesses to support their differing IoT needs.

3. Redundancy: It’s critical for a device to have multiple, redundant connections that ensure continuous connectivity. This ranges from airlines having multiple ways to track airplanes, to healthcare companies being able to monitor the signals coming from heart monitors. However, each frequency can require its own antenna, which potentially makes devices bigger, while different frequency bands have their own challenges, such as the inability to penetrate concrete. In these instances, we could be discussing multiple monthly subscription fees for each device, which is untenable, if not impossible, to sustain.

IoT is coming, and there are companies looking to solve some of the challenges laid out above. But for enterprises hoping for the confluence of high-speed mobile connection provided by 5G and the data created by IoT, it’s important to take a step back and realize that there is a wide range of issues on the horizon as we get closer to 2020.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

October 10, 2017  3:33 PM

Mobile and IoT drive better outcomes in healthcare

Brian Lubel Profile: Brian Lubel
Connected Health, Healthcare, Internet of Things, iot, IoT applications, MDM, Mobile, mobile device management, Mobility, SDN

The healthcare and life sciences industries continue to accelerate the adoption of mobile devices and internet of things technology to simplify the patient experience, improve quality of care and create a positive impact on patient outcomes. Remotely connected medical devices, such as cardiac monitors, insulin pumps and CPAP machines, provide valuable real-time patient information without requiring the patient to be tethered to a hospital or healthcare facility. The ability to remotely monitor patients means shorter hospital stays or, in some cases, avoiding the hospital altogether.

IoT provides physicians with more accurate data, which can be consolidated in one location accessible to all caregivers. For example, a weight scale or blood pressure monitor that transmits data automatically removes the possibility of human transcription error. A recent survey by Applied Clinical Trials found improvement in data quality (35.2%) to be one of the biggest benefits of mobile health technology in clinical trials.

Furthermore, IoT provides better quality care by opening a constant line of communication with the primary care physician, who can proactively intervene upon seeing troubling trends in the patient data. This simplified experience is economically beneficial for the patient, as it saves costly trips to the emergency room and requires fewer trips into the office, thus giving them a better quality of life.

However, for remote patient monitoring — or any kind of connected healthcare application, for that matter — to be effective, the technology has to be both seamless and secure. Device and network security in this increasingly connected industry are huge concerns. In fact, according to a recent study by Synopsys, 67% of medical device manufacturers and 56% of healthcare organizations believe that an attack on medical devices built or used by their organizations will happen within the next year. Cybercrimes are generally financially driven: medical data can be used for tax fraud and identity theft, or to obtain prescription medication.

Healthcare providers have begun to take steps to prevent future attacks by securing existing devices and protecting new ones; however, they aren’t going it alone.

Many organizations struggle with the end-to-end IoT experience — including the complexities associated with deploying edge IoT devices, ensuring they are reliably connected to a company’s cloud infrastructure, and administering ongoing device management, network security and logistics support. These companies are turning to third-party managed services providers to run their IoT technologies and infrastructure and to help contain the costs and complexities.

On the network edge, a longer-term strategy should include mobile device management (MDM), which can help secure mobile devices deployed across multiple mobile network operators and their various operating systems, bringing device management under one umbrella. In many connected healthcare systems, these mobile devices act as connectivity hubs for medical peripherals and wearables, and they introduce security and data privacy threats if not properly secured and managed. While MDM products weren’t originally designed to support IoT devices, some have been adapted and specifically tailored to IoT healthcare devices to address their particular vulnerabilities, providing secure technologies to lock down devices.

On the network management side, one way to provide added security is through software-defined networking (SDN), which can essentially create a private healthcare network on existing public internet. Applying SDN technology to connected healthcare systems results in more control and flexibility than traditional costly and time-consuming VPN technologies. SDN creates a tunnel that is “invisible,” while dispersive technology encrypts the information traveling through, then reassembles it at the endpoint. In addition to enhanced security, applying SDN to connected healthcare systems provides improved business agility. The ability to segment and separate the network into subgroups allows more control over the network and the data.

Healthcare organizations are increasingly adopting IoT to improve patient experiences and outcomes. However, with IoT comes a new series of challenges, including data management and security. Proper ongoing management and administration of the connected device and network infrastructure allows healthcare organizations to capitalize on the benefits of IoT and provide a successful, secure and seamless experience for patients and their healthcare providers.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


October 10, 2017  1:04 PM

Are you ready to manage connected digital ecosystems?

Craig Bachmann Profile: Craig Bachmann
Digital ecosystem, Enterprise IoT, Internet of Things, ioe, iot, Partnerships, RFP, Service providers, trust

The internet of everything (IoE) connects people, places, things and services. While this offers businesses an abundance of opportunity for innovation, new business and revenue, it also comes with some great challenges.

In this new reality, connecting and curating digital ecosystems are essential to defining, designing and implementing complex, IoE-based services. With market estimates ranging from $15 trillion to $60 trillion over the next 15 years, a business’s ability to integrate and manage the digital ecosystems represented by the term “everything” will quickly sort the winners from the losers.

Learning how to master ecosystem management will allow businesses to grow, while also reducing the risks associated with multiple close, dynamic, inter-reliant partnerships. But how do you go about this?

Ecosystem management is a new concept to the majority of organizations today. It is not part of any MBA program, nor is it discussed at conferences on digital business. The lack of information and expert leadership surrounding ecosystem management means that it remains a largely abstract concept for most businesses, regardless of size and current market position. Only recently has the job title “ecosystem manager” appeared in relation to IoT projects, as stakeholders recognize the imperative of connecting digital ecosystems.

Sharing risk and reward across partnerships

Success comes down to being able to connect digital ecosystems as well as manage risk and reward across this dynamic landscape. Over the past few years, collaborative work among TM Forum members has resulted in breakthrough insights and the development of practical tools for ecosystem management. This has allowed stakeholders to better understand the complex ecosystem partnerships required to deliver end-to-end connected services.

The IoE ecosystem is an intricate value fabric, as opposed to the traditional linear value chain — and integrating this value fabric is vital to commercial success. Digital transformation programs aimed at enabling ecosystem business models and integrating this value fabric have become ubiquitous. Indeed, in order to grasp even part of this multitrillion-dollar market, companies will need to adapt their entire business model — as well as technology, operations, processes, internal culture and attitude towards customers, partners and competition.

Monetize services, manage ecosystems and build trust

Broadly speaking, the main challenges to IoE success can be summarized into these three questions:

  1. How do you monetize IoE and the new digital services?
  2. How do you manage the intricate and broad ecosystem of partners needed to deliver these services?
  3. How do you establish digital trust across these ecosystems, right through to the end user?

No single company can expect to find a miraculous solution to these multifaceted, multitrillion-dollar challenges on its own. Through open collaboration, however, steps are being made. For example, TM Forum’s members are making great strides by collaborating on digital service and platform reference architectures, an ecosystem mapping and management tool, and a suite of open standard APIs.

For different businesses, there are of course different challenges. However, as mentioned above, most if not all could benefit from open collaboration and innovation. Take the example of responding to requests for proposal (RFPs). As RFPs begin to roll in and opportunities arise from enterprise and consumer IoE applications, designing clear risk- and reward-sharing frameworks to manage the flow of the buy/sell process, and managing monetization and trust across ecosystems, will become a real focus. Connected technologies are evolving from M2M to IoT to IoE, so responses require end-to-end consideration of all supplier partners providing the final connected or “smart” offering to the end user. With this in mind, communications service providers, for example, will need to determine their role in the value chain, while bearing in mind that this role may vary from one RFP to another.

[Figure: the evolution from M2M to IoT to IoE. Courtesy of Charles Reed Anderson]

The ability to manage complex digital ecosystems can make or break IoE initiatives

Managing ecosystems is a new professional capability required for the digital marketplace, and it will require new skills, tools, resources and methods of aligning business, technology and processes across stakeholders of any nature. There’s a lot to build and a lot of opportunity out there for those prepared to take on the risks and challenges of building a seamlessly connected economy.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

This article was co-written by Thandi Demanet. Thandi is the business analyst for TM Forum’s IoE & Digital Business program, working closely with members, including some of the world’s leading communications service providers and technology providers. Thandi helps design and support implementation of the Forum’s collaborative work in these areas, from strategic to tactical. Her aim is to optimize effective and efficient open collaboration on the most pressing challenges and opportunities within IoE and digital business for the creation of tangible best practices that all TM Forum members can implement for success in the digital economy. Thandi is passionate about the power and promise of IoT and technology to improve lives, societies and the world we live in.


October 9, 2017  12:11 PM

Speaking of security: Smart speaker risks and rewards

Olivier Legris Profile: Olivier Legris
Amazon, Consumer IoT, Internet of Things, iot, IoT devices, iot security, security in IOT, Smart Device, smart home

The smart speaker market may still be in its relative infancy, but with the rise in connected devices and a growing demand from consumers for more convenience, this is one market that is maturing rapidly. But as with any transformative growth in the digital landscape, cautionary tales have come to light, flagging up concerns around security hacks and privacy invasion. Are these warnings legitimate, or are they just scaremongering tactics built on misinformed opinion? Olivier Legris, lead strategist at Future Platforms, explores the proliferation of this market and considers the real threats posed by smart speakers of the future.

Our homes are supposed to be safe havens, but with modern houses being increasingly connected, a question mark hangs over their inviolability, leading us to wonder if they are in fact becoming hotbeds for cyberattacks.

According to Gartner, the number of internet-enabled devices in use globally is expected to hit 8.4 billion by the close of 2017, a figure hotly tipped to soar to 20.4 billion by 2020. Drilling deeper into these forecasts, Gartner has also flagged that 75% of U.S. households will have smart speakers by 2020.

Global Market Insights (GMI) bolsters this prediction, indicating that the smart speaker market will see a strong surge, fueled by access to higher network connectivity and a push to meet consumers’ increasing demand for convenience. Indeed, GMI forecasts that the market will exceed $13 billion in size, with shipments of over 100 million units, by 2024.

Drivers of change

Armed with this foresight and to stay ahead of the competition curve, device manufacturers are investing in the improvement of smart speaker functionality, pushing them beyond their initial music-playing capacity. While the development of the Bluetooth speaker played a part in driving the growth of this market, it was the Amazon Echo, combining artificial intelligence with Bluetooth technology and a voice-activated interface, that was first to truly disrupt this scene, marking a turning point in terms of driving interest in, and large-scale consumer adoption of, the technology.

Dominant companies such as Amazon have, for a few years, refocused their attention on developing voice technology — perhaps spurred by visions of how voice may become the second generation of search engine. And so, transformational changes in the smart speaker market have inspired others to get on board the innovation train. Therefore, with Google and Apple both hot on Amazon’s heels, we have seen the evolutionary journey of the smart speaker accelerate.

Speaking engagement

As more players enter the market, aggressive pricing from the likes of Amazon may have turned the thumb screws on the competition, but it has also had a favorable impact on adoption rates as smart speakers of today become more affordable. But pricing is not the only allure.

The concept of a smart home is changing the way we live, and as connected devices deliver increasing benefit to the consumer — making life better in some way, or offering more convenience by making it possible to, for instance, program a complete and autonomous home system — engagement with smart speaker technology will rise.

What’s the use?

Since the launch of the first smart speakers onto the market, adoption levels have been on the rise. However, there has been some hesitation to adopt the technology, among consumers and developers alike, because no one has yet demonstrated a use we can’t live without.

From a developer’s perspective, there is still a great deal of constraint imposed by the companies behind the technology. This is particularly pertinent in terms of what can be done at the voice level, which limits the set of functions that could have the power to change consumer behavior.

Currently, the technology is based on simple commands — perhaps requests for music or simple reminders — similar to how it works on the iPhone, where SiriKit is used only for specific use cases such as ordering a taxi or booking a meal.

While aggressive pricing has been a draw for some, the technology still remains out of reach for others, which has also fed into consumer adoption levels. Consumers need to justify spending up to four times more for a product that, at this point in time, is not such a distant relation of the Bluetooth speaker. While the smart speaker is still seen as an entertaining gadget rather than a useful commodity, not all consumers will feel the investment is worth it.

Beyond pricing, limited distribution has also been detrimental to adoption. Both Amazon’s and Google’s assistants have been available only online, but often consumers need to see these products on the shelf. The lack of opportunity to tangibly explore them can turn consumers off. It is no surprise that one of the first things Amazon did when taking over Whole Foods was to sell Amazon Echo speakers in stores.

The last hurdle for adoption is a challenging one. Homes are symbolic. They increasingly represent our last private space. So to bring something into your house that is alleged to “spy” on you is a big step to take. The challenge with any technology is that consumers don’t own it for the sake of owning technology. It has to function in some way and have a positive impact on life — making it better, more convenient, easier.

At the moment, the threat-versus-benefit balance is weighted towards the threat. Once the real usage is cracked and the benefits become clear, the scales will tip in favor of the tech. Facebook offers a strong example of how users will readily make this switch, trading off privacy for the benefits.

And so, once smart speakers become an object of necessity and a means of making life better, we could have a scenario where any residual concerns around trust and security fade in importance when offset by the increased efficiency and convenience that the devices give in return.

Should we be worried?

While convenience and pricing attract some buyers, they do not overshadow all security concerns, especially those associated with adopting a new technology into your home that seemingly listens to your every word. But is it realistic to assume that this will be a long-term hesitation built on worries over hackability and invasion of privacy? Given how readily some generations of consumers trade data for convenience, this scenario seems unlikely.

The scaremongering and media hype surrounding today’s smart speaker does not necessarily paint a realistic picture.

A prime example of scaremongering tactics is evident in the recent news that Chinese researchers — using a technique called the dolphin attack — discovered a “terrifying” vulnerability in some of the leading voice assistants. The vulnerabilities were claimed to affect “every iPhone and MacBook running Siri, any Galaxy phone, and PCs running Windows 10 and even Amazon’s Alexa assistant.”

Any consumer reading about the dolphin attack, at face value, will of course feel threatened.

But looking beyond the surface, it’s not as devastating as first assumed, particularly when it comes to smart speakers.

In reality, these hacks were possible only when the attacker was within close proximity of the device — within inches, in fact. It’s therefore difficult to imagine a home-based Amazon Echo, for example, being hacked in such a way by a stranger, unless that stranger was inside your home and close to your device.

Nevertheless, this kind of remote hacking is a real fear, and is reminiscent of consumers’ concerns over contactless cards when they were first introduced. It was widely thought that if someone was close to you, they could use NFC to copy the card’s ID without anyone noticing. But look at contactless today: with clear information and guidance, consumers have come to understand that, because of the physical distance required, this scenario is unlikely. Usage and trust have increased, and concerns have all but faded away.

The findings from examples like the dolphin attack are important in raising awareness of potential threats — but the underlying takeaway is that physical presence and proximity were needed for it to happen.

From a security perspective, the big guys such as Amazon, Google and Apple have delivered smart speakers to the market that are, in fact, fairly secure. The main security threat will most likely come when cheaper products enter the scene, where you do not know who or what lies behind them. It will be interesting to observe consumer patterns this Christmas and next, when simpler, copycat versions hit the market. This is when the alarm bells will need to ring out.

Conversations being recorded remotely, spying from a distance, invisible attackers … these will be the headlines that warrant serious attention. At that point, home-based smart speaker security will need to come under harder scrutiny.

There is, at least, a greater capacity to control the threat within a private space like your home. The possibility of remote hacking, however, will raise a new wave of concerns when it comes to smart speaker usage in public spaces.

Imagine an airport lounge, a shopping mall or a hotel reception … these could be the real hotbeds for cyberattacks. It could also raise serious ethical questions about ownership and responsibility. Will we start to see stickers on storefront windows, as we do for CCTV, warning customers that a smart speaker is in operation?

Future sense

Hackers will continue to try their hardest to create chaos in our lives, and there will always be manufacturers who try to beat the competition with shortcuts in order to capitalize on consumers’ increasingly squeezed budgets. But there is still good news for consumers: For now, there remains an element of control.

By researching and ensuring they do their due diligence, consumers can — for the moment — safeguard themselves, their smart speakers and ultimately their homes. They should be mindful of copycat versions and invest in known brands. Paying that little bit extra from the outset could save them more than just money in the long term.

The smart speaker is already on a journey of evolution — its future shape and form and function will depend heavily on the development of the wider ecosystem, and on how much more of our homes will become automated. There’s a plethora of possibilities for progress; consider TV viewing and online services such as Netflix — voice control could really create exciting opportunities in this space. And with an increase in connected devices, these opportunities will only grow.

Ultimately, smart speakers of the future could potentially become the brain of your home. If it’s a weak link, then it’s a weak link to everything else.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


October 6, 2017  2:10 PM

How IoT unlocks customer value: Five core truths of customer experience

Brian Hannon Profile: Brian Hannon
consumer experience, Consumer IoT, Enterprise IoT, Internet of Things, iot, Personalization, User experience

Traditional product manufacturing went like this: We built it, we tested it, you bought it. Job done. Product managers never heard from customers — and frankly, they didn’t want to. If the R&D team wanted to tap into any user data, some heavy lifting was required to really put an organization in the customer’s shoes. Despite very clever research methodologies, data required a great deal of interpretation to show insights.

But now everything is shifting dramatically. With the amount of data doubling every two years, customer data is instant and substantial. Customer touchpoints are exploding, and product manufacturers need to grasp the potential of customer experience (CX) sooner rather than later if they don’t want to miss out on revenue opportunities. There is huge potential for many brands, beyond the applications themselves, in how they use data to craft customer experiences.

So, from a CX perspective, how is IoT going to impact the value in the customer relationship? What are the CX mechanics that will impact this relationship and increase advocacy among users?

No CRM or brand advertising will come close to IoT’s capacity to engage customers emotionally and deepen customer value. For many brands, it will change the principles of customer relationship marketing in much the same way that digital changed the advertising industry. To sharpen this point, I’ve gathered five core customer truths that explore the impact of data in engaging customers through IoT:

1. Prediction

You might be wondering how cloud-enabled sensors and actuators can change customer experiences. Put simply, a new customer journey is born: a model in which products predict and manage impacts on performance and delivery. For customers, IoT removes the burdens of ownership and those points of irritation which can, collectively, send a customer relationship over a cliff edge.

One of the great burdens of electric car ownership is range anxiety — in other words, “Am I going to get there?” Of all the great features in a Tesla, its capacity to use predictive analytics to optimize journeys and keep customers on the road is critical. Even before the driverless age takes off, data is looking ahead to ensure we get better experiences behind the wheel.

2. Insight

The new insights loop enables us to see products used and consumed through objective and vivid data. Not only is this a goldmine for R&D, it shifts the customer-product experience, enabling brands to share beautiful visualizations of data, ideas for optimizing usage and evolving technology, and comparative consumption patterns across user groups. And it enables customers to explore their own consumption patterns.

In doing so, data becomes the product. The data generated becomes part of the product experience itself and transforms our expectations of functionality. Take Nest, for example. For decades, the home thermostat was a clunky one-way device to control your home climate. Now it’s a living, breathing, connected dashboard enabling customers to optimize energy usage based on real-time behavior. This not only results in money savings for the user, but also engages customers with an effortless brand experience. Knowledge becomes power for both the brand and customer.

3. Personalization

The lack of personalization is arguably one of the biggest pain points when you call customer service. After running through a lengthy automated call screening process, you have to run through all your PINs and passwords again when you finally get through to a customer support specialist. It’s not just the time investment that bothers us; it’s the fact that so much of it is avoidable.

A smart IoT model changes this. Not only can data instantly identify connecting customers, but customer support can access product inventory, observe usage patterns and assure customers they’re in the hands of a CX support agent who genuinely knows who they are. Customer support, therefore, becomes truly personalized. An IoT + CX ecosystem delivers an almost concierge level of support.

4. Community

Sharing data across user groups and communities creates powerful tribes. Speak to any wearable tech advocate, be they a fun runner or cycling road warrior, and you’ll quickly discover the fiercely competitive world of segments and leaderboards. Big data brands like Nokia Health realize the potential of community through competition, not just connection. This living data era is pairing those trusted running shoes with the cloud and generating astounding content to share with those you love (and love to compete with).

5. Humanization

IoT humanizes products like never before. A great deal of brand strategy is invested in humanizing products so we can unlock emotional connections. Any half-decent brand idea will talk in terms of human personality, to the point that customers could describe the brand as a real person in the room.

Brands can bring genuine personality to product experiences. You can finely tune the personality you project. You can intellectualize through data and build emotion through semantics, content and tone of voice. For a humanized product economy, it is simply revolutionary.

The expectation deficit

Ric Merrifield, an IoT CX consultant, cites the point at which acceptance and expectation converge. For the IoT industry to realize its potential, customers need to accept granting access to their data. This opens an “expectation deficit”: customers understand how valuable that data is to the product owner, but where’s the payback for the customer in this relationship?

The IoT sectors and applications that are seeing early adoption are those that balance the legacy functionality (a watch that tells the time) with the data functionality (a watch that tracks your activity). Customers share in the value of this data, and the expectation deficit is balanced.

The greater challenge is posed to traditional industries migrating to the IoT era. The internet of things is really an “internet of customer experiences.” Data will grow organically and rapidly, so brands will need to harvest it for powerful relationships. It is time for clever product engineers to grow into brilliant CX engineers too.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


October 6, 2017  12:07 PM

Succeed in enterprise IoT: Narrow your focus and focus your vision

Chris Witeck Profile: Chris Witeck
Business strategy, Digital transformation, Enterprise IoT, Internet of Things, iot, IoT platform

According to a recent Cisco survey, 60% of IoT projects stall at the proof-of-concept stage. And of the 40% that make it past proof of concept, only 26% are considered a success. That adds up to a pretty dismal success rate. Is this indicative of IoT being overhyped and oversold? Or, to borrow from Gartner, is this just IoT slipping into the “trough of disillusionment” before finding the “slope of enlightenment”?

Having talked to many organizations evaluating IoT projects, I’ve found that many set the bar too high with their initial IoT projects, then find it difficult to tie together too many disparate systems and apps and to keep track of everything. The same Cisco survey reinforced this, identifying two main failure points for IoT projects: integration complexity and lack of internal expertise. Based on my research and time spent helping customers deploy IoT projects, I’ve seen two crucial elements that set projects up for success in the long run. Enterprises stepping into IoT initiatives need to “narrow your focus” and “focus your vision”: start with more narrowly defined IoT projects and drill in on how IoT can be an enabler for solving specific business problems.

Step one for IoT success: Narrow your focus

Already there is evidence of a shift to a more focused and specialized approach to building IoT technologies, something that may help limit the overall complexity and integration challenges that come from using generic tools to solve a complex problem. One recent example was GE acknowledging that building a horizontal IoT platform stack is perhaps too large a challenge, and that to succeed as an IoT service provider you need to narrow your focus — in GE’s case, by selecting specific industries where it can concentrate its IoT technology efforts. While the big horizontal cloud providers, such as Microsoft and Amazon, may dominate at the compute and IoT platform layer, it is down at the orchestration layer where this specialization will take root: focusing the movement of information between users, IoT devices, enterprise applications and cloud services on a specific and narrowly defined business problem, while using tools and techniques optimized for solving that type of problem.

I witnessed this focus firsthand in conversations with healthcare organizations. After initial, broader and more technology-focused experimentation, these organizations started to invest in a variety of very specific, very focused IoT workflows. Just some of the examples I encountered included automating the movement of patient data from a smart device to a medical record within a patient space, accelerating the delivery of patient data to clinicians based on clinician location, and orchestrating the tracking and locating of medical equipment in use across the organization. These were unique solutions focused on solving specific business problems as part of a broader digital transformation initiative. And rather than attempt to build these IoT workflows in house, the organizations were looking toward service providers and system integrators with expertise in the area, or to healthcare-focused startups, to assist with these efforts. This is not unique to healthcare; similar examples can be found across any industry. It was reinforced in a recent survey from Vanson Bourne in which 74% of enterprises said they planned to work with external partners in building their IoT systems.

Step two for IoT success: Focus your vision

This narrowed focus reflects the evolving maturity of enterprise IoT and will help enterprises reduce complexity in their digital transformation initiatives. Yet it solves only one piece of the puzzle. As organizations invest in IoT workflows tying together devices, things, users, cloud services and on-premises applications, they are effectively stretching their network boundaries to anywhere these connections occur. While Metcalfe’s law speaks to the inherent value of these connections, managing and securing them will require a level of visibility that most organizations are not accustomed to: complete east-west-north-south visibility, plus visibility into the event streams generated by every interconnected device, thing and application tied to any IoT workflow. And this visibility needs to be in real time to allow for rapid responsiveness in optimizing, troubleshooting and securing IoT workflows.

While this sounds like you need to broaden your visibility by casting a wider net, in reality it means you need to focus your vision by putting your network data into the relevant business context. This necessitates a more proactive, analytical and business-focused approach to network visibility. The challenge is that this level of network agility, visualization and analytical capability has traditionally been unavailable to the enterprise. The good news is that there is an evolving category addressing this type of focus, something Gartner refers to as “NetOps 2.0.” It moves NetOps from a tool for optimizing network performance toward a methodology for helping business initiatives succeed.

Summary

The reality is that most organizations are investing in IoT, with efforts increasingly tied to solving problems related to their overall digital transformation strategy. While many initial IoT projects may have floundered at the proof-of-concept phase, organizations have learned from those initial challenges and are starting to narrow their focus, defining concrete business objectives they want to tackle and working with service providers who are also narrowing their focus. Yet enterprises must not forget to also focus their vision, ensuring that as their IoT workflows touch devices, things, apps and users across the globe, they have the visibility required to keep those workflows, and the business objectives they are tied to, optimized for success.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


October 5, 2017  4:08 PM

How IIoT platforms with AR/VR help OEMs reduce operating costs

Rick Harlow Profile: Rick Harlow
AR, augmented reality, IIoT, Industrial IoT, Internet of Things, iot, IoT platform, OEM, Virtual Reality

Kurt from our Houston office recently visited an upstream operation in Eagle Pass, Texas. At this operation, a variety of mission-critical equipment was running and collecting crucial production data points. It took Kurt a good six hours to get to the facility. It was time-consuming, tedious and costly. We asked ourselves a simple question: “How can we reduce Kurt’s visits to Eagle Pass by combining the 3D immersive experience of a virtual reality (VR) tool with the deep advanced analytical capabilities of an IIoT platform?” That question led to the development of augmented reality (AR)/VR apps that gracefully complement an IIoT system.

Take, for example, a pump or a motor that commonly powers upstream operations. Our IIoT platform’s anomaly detection algorithms flag and mark cases of motor overheating. These anomaly markers are laid out on a 3D model of the asset, and reliability engineers, one sitting in Houston and another in Oslo, can experience the unhealthy motor from the comfort of their headquarters. Sensor readings from the motor are streamed from historian tags in real time to the IIoT platform, which is integrated with the AR/VR app, enabling the engineers to perform multiple asset examination operations. They can get “exploded” and “zoomed in” views of the asset and can rotate it across the 3D axis to pinpoint what is going wrong and where.
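As a rough illustration of that kind of anomaly flagging, here is a sketch using a rolling baseline and threshold; the window, threshold and replayed readings are hypothetical, not FogHorn’s actual algorithms:

```python
from collections import deque

def flag_overheat(stream, window=60, sigma=3.0):
    """Yield (timestamp, temperature) anomaly markers when a motor's
    temperature deviates sharply from its rolling baseline."""
    history = deque(maxlen=window)
    for ts, temp_c in stream:
        if len(history) == window:
            mean = sum(history) / window
            var = sum((x - mean) ** 2 for x in history) / window
            std = max(var ** 0.5, 0.1)  # floor to avoid zero-variance blowups
            if abs(temp_c - mean) > sigma * std:
                yield ts, temp_c        # marker to overlay on the 3D model
        history.append(temp_c)

# Hypothetical historian tag replayed as (second, degrees C) pairs:
readings = [(t, 70.0 if t <= 500 else 85.0) for t in range(600)]
for ts, temp in flag_overheat(readings):
    print(f"overheat marker at t={ts}s: {temp} C")
```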

In addition to experiencing the asset, the reliability engineers at headquarters can use voice and hand-based gestures to understand the sequence of events leading up to a high-value failure mode.

These features are extremely useful for optimizing upstream operations, reducing trips and shaving off costs in a hyper-competitive marketplace. As Harvey Firestone said, “Capital isn’t so important in business. Experience isn’t so important. You can get both these things. What is important is ideas. If you have ideas, you have the main asset you need, and there isn’t any limit to what you can do with your business and your life.” These new ideas promise to change the way OEMs and operators run their upstream operations.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


October 5, 2017  1:49 PM

Demystifying IoT and fog computing: Part one

Sastry Malladi Profile: Sastry Malladi
Data Analytics, Edge computing, FOG, fog computing, Internet of Things, iot, IoT analytics, IoT data

Two of the most common buzzwords you hear these days are IoT and fog computing. I’d like to put some perspective on them, drawing on real-world experience from my current role as CTO of FogHorn. I intend to divide this into a series of posts, as there are many topics to cover.

For the first post, I’d like to cover some basics and context setting.

The “T” in IoT refers to the actual devices — whether consumer-oriented, such as a wearable, or industrial, such as a wind turbine. When people talk about IoT, more often than not they are referring to consumer IoT devices. There are plenty of technologies and applications to manage and monitor those devices and systems, and with the ever-increasing compute power of mobile devices and broadband connectivity, there isn’t a whole lot of groundbreaking new technology needed to address the common problems there. But when it comes to industrial IoT, it is a different story. Traditionally, all the heavy and expensive equipment in the industrial sector — be it a jet engine, an oil-drilling machine, a manufacturing plant or a wind turbine — has been equipped with lots of sensors measuring various things: temperature, pressure, humidity, vibration and so forth. Some modern equipment also includes video and audio sensors. That data typically gets collected through SCADA systems and protocol servers (MQTT, OPC-UA or Modbus, for example) and eventually ends up in some storage system. The amount of data produced per day ranges from terabytes to petabytes depending on the type of machine. Much of that data can be noise and repetitive in nature. Until recently, that data was not utilized or analyzed to glean any insights into what might be going wrong, if anything (hence, no predictive analytics).
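For a flavor of that collection step, here is a minimal subscriber sketch using the paho-mqtt Python client (1.x callback style); the broker address and topic layout are hypothetical:

```python
# Minimal MQTT ingest sketch (paho-mqtt 1.x callback API).
# Broker address and topic scheme are illustrative assumptions.
import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    # In a real pipeline this payload would be parsed, timestamped
    # and handed to the analytics layer or a historian.
    print(msg.topic, msg.payload.decode())

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.plant.example", 1883)
client.subscribe("site1/turbine/+/vibration")
client.loop_forever()
```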

The Industry 4.0 initiative and digital twin concepts hover around the idea of digitizing all these assets and transporting all the data to the cloud, where analytics and machine learning can be performed to derive intelligent insights into the operations of these machines. There are several problems with this approach: lack of connectivity from remote locations, huge bandwidth costs and, more importantly, lack of real-time insights when failures are occurring or about to occur. Edge computing, or fog computing, is exactly what is needed to solve this problem, bringing compute and data analysis to where the data is produced (somewhat akin to the Hadoop concept). In this article, I’m using edge and fog interchangeably; some don’t agree with that — some people like to call the fog layer a continuum between edge and cloud — but for the purposes of this article, that difference shouldn’t matter much.

I know some of you may be thinking, “So what’s the big deal? There are mature analytics and machine learning technologies available in the market today that are used in a data center/cloud environment.” Unfortunately, those existing technologies aren’t well-suited to run in a constrained environment — low memory (< 256 MB RAM), limited compute (single- or dual-core low-speed processors) and limited storage. In many cases, the technology may have to run inside a programmable logic controller (PLC) or an existing embedded system. So the need is to be able to do streaming analytics (data from each sensor is a stream, fundamentally time-series data) and machine learning (when the failure conditions can’t be expressed easily or are not known) on the real-time data flowing through the system. A typical machine or piece of equipment can have anywhere from tens to hundreds of sensors producing data at a fast rate — a data packet every few milliseconds, or sometimes microseconds. Besides, data from different types of sensors (video, audio and discrete) may need to be combined (a process typically referred to as sensor fusion) to correlate and find the right events. You also have to take into account that the hardware chipset can be either x86 or ARM based, and typical devices (gateways, PLCs or embedded systems) will be the size of a Raspberry Pi or smaller. Finding a technology that provides edge analytics and machine learning and can run in these constrained environments is critical to enabling real-time intelligence at the source, which results in huge cost savings for the customer.
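To make the memory constraint concrete, a streaming detector can be written to use constant memory per sensor, with no window buffering at all. The sketch below is a generic exponentially weighted moving average (EWMA) detector, illustrative of the pattern only; the alpha, sigma and warm-up values are assumptions, not any product’s actual engine:

```python
class EwmaDetector:
    """O(1)-memory streaming anomaly detector: one running mean and one
    running variance estimate per sensor, so it fits the small-RAM
    gateways and PLCs described above."""
    def __init__(self, alpha=0.05, sigma=4.0, warmup=10):
        self.alpha, self.sigma, self.warmup = alpha, sigma, warmup
        self.mean = None
        self.var = 0.0
        self.n = 0

    def update(self, x):
        self.n += 1
        if self.mean is None:
            self.mean = x
            return False
        dev = x - self.mean
        anomaly = (self.n > self.warmup and self.var > 0
                   and dev * dev > (self.sigma ** 2) * self.var)
        # EWMA updates of mean and squared deviation (constant memory).
        self.mean += self.alpha * dev
        self.var = (1 - self.alpha) * self.var + self.alpha * dev * dev
        return anomaly

# One detector per sensor stream; feed packets as they arrive.
vibration = EwmaDetector()
samples = [0.10, 0.11] * 5 + [0.90, 0.11]   # spike at index 10
for i, s in enumerate(samples):
    if vibration.update(s):
        print(f"sample {i}: anomaly at {s}")  # -> sample 10: anomaly at 0.9
```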

In my next article, I’ll talk about some of the use cases that are taking advantage of this technology and explain how the technology is evolving and rapidly finding its way into many verticals.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


October 5, 2017  10:17 AM

IoT and secure data centers

Rado Danilak Profile: Rado Danilak
ai, Artificial intelligence, chip, Consumer IoT, Data Center, Internet of Things, iot, IoT devices, IoT hardware, iot security, voice

IoT devices and use cases are exploding. Coupled with advances in artificial intelligence (AI), they are poised to transform our lives. Interfaces are moving from touchscreen to intelligent voice control. For IoT devices to become ubiquitous, they must be increasingly intelligent and, at the same time, low cost. That is the conundrum: product attributes in conflict with each other. The only solution is to amortize the cost of high-quality AI across many devices, with centralized processing in hyperscale data centers. Designing every single light switch to have intelligence beyond Siri or Alexa would be prohibitively expensive. However, passing voice commands to data centers, where the cost of AI can be amortized across thousands of intelligent light switches, makes high-quality AI a thousand times cheaper. A light switch’s voice interface is used less than one minute out of the 1,440 minutes in a day, so amortization can deliver $1,000 worth of high-quality AI at a cost of only $1 per light switch — which is a manageable price.
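A quick sanity check of that amortization arithmetic, using the figures from the paragraph above (the strict time-sharing framing is an illustrative simplification):

```python
# Amortizing one centralized voice-AI "seat" across low-duty-cycle devices.
# Dollar figures come from the paragraph above; the time-sharing framing
# is a simplification for illustration.
ai_seat_cost_usd = 1_000      # cost of high-quality, Siri-class AI capacity
minutes_per_day = 1_440
usage_minutes_per_switch = 1  # each switch talks < 1 minute per day

# A switch needs the AI at most 1/1440th of the day, so one seat could in
# principle time-share across up to 1,440 switches; 1,000 leaves headroom.
max_switches_per_seat = minutes_per_day // usage_minutes_per_switch
cost_per_switch = ai_seat_cost_usd / 1_000

print(f"One AI seat can time-share up to {max_switches_per_seat} switches")
print(f"Amortized AI cost per switch: ${cost_per_switch:.2f}")  # -> $1.00
```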

For example, an intelligent light switch can save energy by reducing light output when there is sufficient ambient light or the user is working in a different area. With an echolocator, it can pinpoint a person’s location without the need for a camera, which many people do not like in private environments. If a person suddenly ends up on the floor, an intelligent switch can determine whether the person is doing yoga or is a grandmother who collapsed and needs 911 to be called. Or it can notice that somebody entered the room at 3:00 a.m. not through the door, and decide it is wise to turn the lights on and/or call 911.

Another aspect is sharing knowledge. If we have a toaster with a small camera sensor to make sure our bread is toasted but not burned, that toaster may encounter a new type of bread which behaves differently. Having our toasters networked means they can learn from each other’s experience. Again, hyperscale shared data centers provide this shared knowledge. Such a toaster could even recognize the voice that is talking and make toast exactly the way that person likes it.

However, there is an additional ingredient needed for success: security. It will not be acceptable to use smart devices if a hacker can burn down somebody’s house by attacking a stove or toaster. It is not acceptable that somebody can listen to your private conversations. Thus, security is a must. Security drives further demand for processing performance at the data center — not only for encryption, but for AI software to determine if something is appropriate or dangerous, if something should or should not be done, and even whether a malicious hacker requested the action in question.

From intelligent refrigerators telling us not to drink the out-of-date milk, to stoves making sure food will not be burned, to microwaves which will not overheat food, to smart garage openers, home security — everything in our lives will benefit from AI-powered IoT devices with collective shared knowledge and wisdom — if they also have security. It can’t be accomplished without amortizing the cost of intelligence by concentrating it in hyperscale data centers, where AI cost is spread across billions of “intelligent” IoT devices.

All of this drives the demand for more processing power in these data centers. Today’s data centers already consume 40% more energy than Great Britain — more than the airline industry — and their consumption is growing 15% annually, which compounds to roughly a doubling every five years (1.15^5 ≈ 2): twice as many data centers burning twice as much power. Humanity can’t afford for more than 10% of the planet’s energy to go into data centers in 10 years, and definitely not 40% of it in 20 years.

In the past, Moore’s law gave us power reductions in semiconductor products through lower voltages and aggressive process shrinks. But with processor performance and power now plateauing, we can’t rely on existing technology to offset growing power consumption demands as it did in the past.

However, companies today are working on new solutions to this problem, building chips that not only reduce data center hardware, but also reduce the power consumption of public and private cloud data centers. The processing power problem cannot be ignored if we want to make billions and billions of intelligent IoT devices safe, secure and cost-effective.

Those of you in the San Francisco area are encouraged to watch my invited speaker presentation at the Storage Visions Conference, where I will address the growing data center energy consumption challenges, on Oct. 16 at 2 p.m. at the Embassy Suites in Milpitas, Calif.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


October 4, 2017  3:35 PM

Virtualization and IoT made for one another, but performance monitoring still essential

Dirk Paessler Profile: Dirk Paessler
Internet of Things, iot, IoT applications, Monitoring, Network monitoring, Network virtualization, Virtualization

In many ways, the internet of things offers the perfect use case for virtualization — an elastic and highly adaptable environment and framework capable of adjusting to the dramatic lulls and spikes that characterize the data streams between machines. In much the same way, IoT, with what is shaping up to be an innumerable collection of connected devices, requires the dramatic scale that most traditional networks can’t provide in a cost-effective way.

Of course, IoT isn’t the only driver of virtualization. The ability to dynamically allocate CPU capacity, memory and disk space to applications as needed is a powerful benefit in and of itself. Not surprisingly, for those just beginning their journey with virtualization, it can feel like performance, capacity and scale are unlimited — and in many ways, that’s understandable.

Even so, as IoT continues to add additional devices and data to these same networks, an inescapable reality becomes clear: It’s just as important to monitor the health and performance of your virtualized network and components as other parts of your infrastructure. In fact, reliable network monitoring plays an important and crucial role in dynamically assigning the very capabilities and capacities virtualization makes possible.

It’s also no secret that any network failure in a virtualized environment can have a dramatic impact on the applications in question, and for that reason alone, network monitoring is required to ensure that system administrators, network engineers and IT teams know as soon as any issue arises. That, of course, isn’t all. Following are some of the many reasons it’s important to monitor your virtual assets, as well as specific things you’ll want to keep an eye on to ensure that your IoT initiatives function smoothly with the virtualized network you ultimately put in place.

  • Use monitoring to ensure that you plan for the virtual assets your IoT initiative will demand. Increasingly, smaller organizations are looking to virtualization and the many benefits it offers for the first time as their IoT initiatives grow in scale and scope. Network monitoring technologies and capabilities should be considered a necessary investment for organizations at this important stage for a simple reason: It’s critical to understand what different applications demand of your network resources. Virtualizing systems without knowing the CPU and memory load, disk usage and network usage is very risky. Surprises are never good in any network, and virtualization is not a one-size-fits-all proposition. It’s important to plan your efforts based on the facts, not intuition.
  • Use monitoring to optimally assign resources. With virtualization, you always want to find the right balance. You don’t want to assign too few virtual machines to a host — and in that way waste resources — but you also don’t want to overload any virtual servers — and in that way slow or disrupt the performance of all of the systems and applications running off of it. Network monitoring technology enables you to see in real time how these virtualized resources are being used during periods of peak usage and lulls — information you can use to assign just enough of the resources at your disposal, but not too much.
  • Use monitoring to maintain quality of service. Today’s network monitoring technology not only enables you to see what’s happening in your network right now, but also provides access to a historical record that can be used when troubleshooting problems (you don’t have to replicate or recreate scenarios to see what happened when a real historical record is easily accessed) and when comparing how changes to the network ultimately impacted the quality of service. Virtualization is no exception. Monitoring before and after your virtualization deployment will enable you to demonstrate just how much the virtualization effort improved the service experienced by users.

Fortunately, network monitoring technology has advanced with virtualization, and today there are many sensors and technologies available that provide a real-time view not only of what’s happening in these networks, but also, simultaneously, in the traditional infrastructure and data centers they augment. Some of the many performance metrics and capacities that can be monitored in virtual networks include CPU usage as a percent of guests, CPU usage for each hypervisor, total CPU usage, read and write speeds, the number of packets sent and received by bytes or time period (such as seconds), network usage, disk usage, available disk capacity, active memory, consumed memory, the number of virtual machines running, load average and the health of any host hardware, including temperature, power, fan rotations per minute and battery voltage — and that’s only to name a few.
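As a small taste of how a handful of these host-level metrics can be collected programmatically, here is a sketch using the psutil Python library; it is a generic example of metric polling, not a stand-in for the hypervisor-aware monitoring products discussed above:

```python
# Minimal host-metrics poll with psutil -- a generic sketch of the kinds
# of data points a monitoring system samples, not a full monitoring tool.
import time
import psutil

def snapshot():
    mem = psutil.virtual_memory()
    disk = psutil.disk_usage("/")
    net = psutil.net_io_counters()
    return {
        "cpu_percent": psutil.cpu_percent(interval=1),  # 1-second CPU sample
        "mem_percent": mem.percent,
        "disk_percent": disk.percent,
        "bytes_sent": net.bytes_sent,
        "bytes_recv": net.bytes_recv,
    }

if __name__ == "__main__":
    for _ in range(3):        # poll a few times; a real monitor would ship
        print(snapshot())     # these samples to a time-series database
        time.sleep(5)
```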

A bright future, of course, lies ahead for IoT as we find new ways to make life better with connections that put more information than ever at our fingertips. But just as these connected devices will enable us to act with greater knowledge, the virtualized networks that make them possible will require far greater diligence on the part of networking professionals who must now, more than ever, take steps to ensure that they know exactly what’s happening across their networks at any given point in time.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

