A number of factors influence consumer perspectives on privacy and data, and concerns vary by brand and product category. Moreover, the general backdrop of headline news about data breaches and unauthorized data sharing raises anxiety among some consumers, whether they are affected or not. An increasing number of consumers have actually become victims of identity theft, leaving them with a heightened awareness of the real-world consequences: re-establishing identity and monitoring credit records for unauthorized purchases. Privacy concerns about a given device also vary by whether a consumer owns or intends to purchase it, as opposed to having no actual experience with it. Concerns rise once a consumer owns a device and becomes more familiar with its functions.
Altogether, 47% of broadband households express privacy or security concerns about at least one smart home device. They express the greatest level of concern for the privacy and security of computers and tablets (43%), followed by smartphones (41%). The lengthy history of connectivity, data breaches, and the volume and variety of data stored and transmitted by these devices drives these concerns. Nearly as many respondents express concern about relatively newer smart entry devices (door locks, garage openers) (40%) and home security systems (38%). U.S. broadband households express significantly lower levels of concern for thermostats, lights, and HVAC systems (25%), and connected CE devices (24%). Notably, despite stringent HIPAA requirements in the health sector designed to protect consumer health data, respondents express the lowest level of concern for connected fitness devices and connected health devices, each earning a 23% concern rating.
Device ownership and purchase intentions are the best indicators of a consumer’s level of privacy and security concerns. Privacy and security concerns do not differ drastically by age group. Smart home device owners are more likely than non-owners to have concerns, suggesting that concern rises when ownership creates the real possibility of compromise. Those intending to purchase a smart home device also report significantly higher concerns for privacy and security versus those with no intention to purchase, implying privacy concerns are more top-of-mind for those in the market for devices.
A similar pattern exists for other device categories, with an approximate 50% increase in concern level among those who own connected devices. For instance, 20% of smart TV non-owners express privacy and security concerns versus 30% of smart TV owners. In the health devices category, security and privacy concerns among connected blood pressure cuff non-owners (22%) increase by an even greater margin for owners (43%).
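The "approximate 50% increase" claim can be checked with quick arithmetic on the figures quoted above. A throwaway sketch (the percentages are the survey values from the text):

```python
def relative_increase(non_owner_pct: float, owner_pct: float) -> float:
    """Relative (percentage) increase in concern among owners vs. non-owners."""
    return (owner_pct - non_owner_pct) / non_owner_pct * 100

# Smart TVs: 20% of non-owners vs. 30% of owners, a 50% relative increase.
print(relative_increase(20, 30))

# Blood pressure cuffs: 22% vs. 43%, roughly a 95% relative increase.
print(round(relative_increase(22, 43)))
```

The distinction matters: a 10-point jump from a 20% base is a much larger relative shift than the same jump from a 40% base.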
When specific privacy and security concerns are considered, concern for “identity theft or data hack” ranks as the first or second leading concern in eight of nine connected device product categories. Regulators, advocacy groups and forward-thinking industry players have championed the notion of a consumer bill of privacy rights. However, when a variety of specific privacy rights are presented to consumers, no one privacy right alleviates the concern of more than one quarter of consumers.
Still, combining at least three privacy rights alleviates almost three quarters of consumer concern. Adding consumers who have no privacy or security concerns to those whose concerns are relieved by the right to be invisible, the right to approve who uses the data, and the right to be erased covers 73% of consumers. Further, simply giving consumers the ability to opt in or out of data collection while still using the product or service alleviates most of the concern.
Sales of connected devices are exploding. The massive amount of data available from connected devices creates an unprecedented opportunity for new products, value-added services, new partnerships and new ways to use data. Consumer concern for data privacy and fears of hacked data loom larger than other concerns for connecting devices to the Internet, and high levels of security may become a differentiator for many new connected products.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.
The Internet of Things, food quality and food safety take on another dimension at Freight Farms, a Boston company that provides a farm in a box — or, more specifically, a 40' x 8' x 9.5' shipping container.
The company’s container farms provide climate control, lighting and a hydroponic system, enabling customer-farmers to grow leafy vegetables year round. The containers, according to Freight Farms, are optimized to grow herbs, different varieties of lettuce and other vegetables such as kale and cabbage. IoT wasn’t in the picture when Freight Farms launched in 2010, but the technology has become increasingly central to farm management.
“We didn’t start out there … but [IoT] became a very significant part of the business,” said Kyle Seaman, director of farm technology at Freight Farms. “It is essential and fundamental to our success.”
The era of the “connected refrigerator” may have arrived, but Mark Roemers, co-founder of Netherlands-based AntTail, has simply given up on consumer refrigerators, smart or otherwise. They don’t cool evenly, with a dozen or more degrees Fahrenheit temperature difference between storage in the door and storage in the back of a typical refrigerator — more than enough to allow for degradation of certain medications. And AntTail, which focuses on pharmaceutical logistics, is all about keeping medicines fresh. Part of its partnership with one of the major pharmaceutical suppliers in the Netherlands is to supply small drug storage coolers to individual patients to use in their homes.
When using ordinary refrigerators, more than 90% of patients fail to store their medications within the safety margin; the dedicated cooler turns that statistic on its head. AntTail knows this because it tracks the temperature continuously and monitors when the drugs are accessed for use.
The company has developed a small sensor that fits inside the sealed package in which pharmacies deliver drugs to patients. The sensor device, which looks somewhat like an overgrown SD card, not only tracks temperatures, but also incorporates a light sensor so it knows when the package it resides in has been opened. With a life between battery changes of 18 months, it’s something of a textbook example of these sorts of communications and power-consumption issues that real-world IoT devices must deal with.
The first casualty of power requirements for AntTail, Roemers says, was industry standards. The company really couldn’t use conventional wireless standards such as Wi-Fi, Bluetooth, Zigbee or Z-Wave — it would have managed only six months or so of life from the CR2032 batteries that its sensors use — and thus invented its own proprietary network. “There’s a big need for someone to come up with a standard for very efficient wireless communication,” Roemers said.
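Back-of-the-envelope battery math illustrates why the standard stacks were ruled out. A sketch assuming a nominal 220 mAh CR2032 capacity and illustrative average current draws (these figures are assumptions for the example, not AntTail's measurements):

```python
CR2032_CAPACITY_MAH = 220.0   # typical nominal capacity (assumption)
HOURS_PER_MONTH = 730         # roughly 24 * 365 / 12

def runtime_months(avg_current_ma: float,
                   capacity_mah: float = CR2032_CAPACITY_MAH) -> float:
    """Months of runtime for a given average current draw in milliamps."""
    return capacity_mah / avg_current_ma / HOURS_PER_MONTH

# ~50 uA average draw (plausible for a chattier standard stack): ~6 months.
print(round(runtime_months(0.050)))

# ~17 uA average draw (aggressive duty cycling): ~18 months.
print(round(runtime_months(0.017)))
```

The arithmetic shows why average current, not peak current, dominates: tripling battery life requires cutting the average draw to roughly a third, which usually means keeping the radio asleep almost all of the time.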
The proprietary AntTail protocol is unusual in that it doesn’t use address-based routing the way IP-style networks do. Instead, each device monitors the hop counts of packets traversing other nearby units (this is a mesh architecture) and sends packets to devices that it knows are “upstream” in relation to an aggregator (think, roughly, of a Wi-Fi access point). The aggregator then uses a cellular data connection to send the sensor data to the cloud. If the company had chosen a standard protocol for local connection to the sensors, Roemers said, it would have been Zigbee, but then “all the equipment would be six times larger” because of the extra batteries needed.
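The hop-count idea can be sketched as a toy gradient-routing model: each node forwards packets to a neighbor whose observed hop count to the aggregator is lower than its own. This is a simplification of whatever AntTail actually ships, and all names here are invented:

```python
# Toy gradient routing: each node knows its neighbors' hop counts to the
# aggregator (which sits at hop count 0). Packets always move "upstream."

def next_hop(node, neighbors, hops):
    """Pick a neighbor with a strictly lower hop count, or None if stuck."""
    upstream = [n for n in neighbors[node] if hops[n] < hops[node]]
    return min(upstream, key=lambda n: hops[n]) if upstream else None

def route(src, neighbors, hops):
    """Follow upstream choices until the aggregator (hop count 0) is reached."""
    path = [src]
    while hops[path[-1]] > 0:
        nxt = next_hop(path[-1], neighbors, hops)
        if nxt is None:   # no upstream neighbor: routing fails here
            break
        path.append(nxt)
    return path

neighbors = {"s3": ["s2"], "s2": ["s3", "s1"], "s1": ["s2", "agg"], "agg": ["s1"]}
hops = {"agg": 0, "s1": 1, "s2": 2, "s3": 3}
print(route("s3", neighbors, hops))   # ['s3', 's2', 's1', 'agg']
```

The appeal for low-power devices is that no node needs a routing table of addresses; it only needs to overhear its neighbors' hop counts.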
Data from the sensors is collected in the cloud and, of late, some of it is shared back to patients by way of a smartphone app, which helps with reminders to take medicines on schedule. At present, there are only about 1,000 sensors in the field, but the deployment is doubling monthly. One downside: fully half of the sensors are thrown away by patients who forget that they are inside the foil pack containing the medicine. AntTail notes that the sensors typically make a couple of trips to the end user and back before they go astray. Even so, at thirty euros apiece, IoT medicine delivery remains a pricey proposition.
With this post, I am pleased to join TechTarget IoT Agenda as a contributor; I hope to post to the community monthly. My background is in IT operations as a practitioner for 17 years, then moving to an analyst role at Gartner for three years. Currently, I’m helping to drive vision, strategy and roadmap at AppDynamics, an application intelligence software company focused on application performance monitoring and analytics.
This post analyzes some key data and predictions for the IoT market, and helps frame segmentation and IoT spending in IoT project phases.
The Internet of Things shares some traits with typical applications, but also introduces significant new challenges. With typical applications, users interact directly, and data is collected around the specific actions the user takes while engaging with the app. This means the app emits data only during short, well-defined timeframes. IoT systems, on the other hand, often emit data even when they are at rest — especially in the case of consumer systems.
IoT spending (in US$ millions). Source: Gartner, “Forecast Analysis: Internet of Things — Services, Worldwide, 2015 Update” (4 January 2016); analysts Peter Middleton and Thilo Koslowski.
Gartner segments IoT spending (you must be a subscriber to access) between connectivity of “things” — typically via mobile networks or Wi-Fi — and two major market use cases: consumer and professional. As the data shows, the professional market is over 50 times larger than the consumer market.
Gartner predicts that “manufacturing and natural resources” is the largest segment by total spend, estimated at $95 billion in 2016 and projected to grow to $136 billion by 2020. By comparison, the entire consumer segment is estimated to be $7 billion in 2016 and projected to grow to $39 billion by 2020. In many cases, these industrial systems do not rest often, and hence provide a larger opportunity — and significant challenge as well — due to both the size of the data and the value locked inside. Among these typically complex and often widely dispersed systems could be sensor-connected machines in factories, construction vehicles, mining equipment, oil and gas equipment, and robotics in manufacturing.
Gartner has done additional interesting analysis, looking at how IoT is likely to evolve. As an IoT project gets underway and is in development, Gartner estimates that 20% of the project spend is focused on design and consulting, 35% on implementation and 45% on operations.
As IoT evolves over the next four years, the ratio of that spending will shift more heavily towards operations. Gartner predicts that in 2020, the ratio will move towards 18% spend on design and consulting, 30% on implementation and 52% on operations.
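The predicted shift is easier to see side by side (the figures are the Gartner estimates quoted above):

```python
# Gartner's estimated split of IoT project spend by phase (percent of total).
spend_2016 = {"design & consulting": 20, "implementation": 35, "operations": 45}
spend_2020 = {"design & consulting": 18, "implementation": 30, "operations": 52}

for phase in spend_2016:
    delta = spend_2020[phase] - spend_2016[phase]
    print(f"{phase:20s} {spend_2016[phase]:3d}% -> {spend_2020[phase]:3d}% ({delta:+d} pts)")
```

Operations grows by seven percentage points while both earlier phases shrink, which is the whole argument of the section that follows: the recurring cost of running and analyzing IoT systems, not building them, is where the money goes.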
Operational costs are tied to analytics use cases applied to the operations of IoT assets, including not only hardware or cloud services, but also software and services for infrastructure management, application management, device management, performance monitoring, remote diagnostics, authentication, billing and support. These functional areas collect and analyze the data generated by IoT devices and create insights from it with algorithms and other views of the data.
The reason for this shift is that the high value of IoT is within the analysis and insights derived from the collected data. Newer, more sophisticated algorithms and analysis require more computing and associated resources. The operational cost of increasing the number of devices, customers and ultimately users means more IoT spending on more storage, traffic and raw processing power needed to scale the IoT business.
As IoT projects ramp up, there are economies of scale at play. But the models analysts are predicting still show that the cost of these services will be passed on to end users in the form of premium services and offerings. Although the practice has yet to take hold, this will increasingly become a normal upsell tactic within IoT.
Thanks for reading this post! Next time I’ll offer insights around device and data management, along with emerging IoT platforms.
Please leave comments here or via Twitter: @jkowall.
As a vendor in the Internet of Things, we recently decided to research and evaluate whether to join a standards committee to begin influencing how tomorrow’s IoT standards get defined. What an eye-opener. I found at least 20 such initiatives focusing on connecting the Internet of Things together, and I broke them down into three categories:
- Open connectors
- Standards bodies
- Companies with industry muscle
Examples of the open connectors include IFTTT (If This Then That) and Zapier, both of which offer simple ways to connect disparate apps and services via small scripts or (in the case of IFTTT) “recipes.” The example I always give to explain IFTTT is that the driver of a connected car can easily program an app on the dashboard of the vehicle to automatically take an action, like “If I am driving towards home and I am 300 yards from home, then turn my alarm system off, and if it’s less than 15 degrees Celsius outside, then turn up my heating.” IFTTT and the other open connectors operate at the application and user level, so there’s plenty of opportunity for them to live alongside the next two types: the standards bodies and the consortia of industry heavyweights.
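That "recipe" style of automation boils down to a guarded list of actions. A minimal sketch of the driving-home example (function names, actions and thresholds are all illustrative, not IFTTT's actual API):

```python
def driving_home_recipe(distance_to_home_yards: float,
                        outside_temp_c: float) -> list:
    """Evaluate a simple IFTTT-style recipe and return the actions to fire."""
    actions = []
    if distance_to_home_yards <= 300:
        actions.append("disarm_alarm")        # "turn my alarm system off"
        if outside_temp_c < 15:
            actions.append("turn_up_heating")  # only when it's cold outside
    return actions

print(driving_home_recipe(250, 12))   # ['disarm_alarm', 'turn_up_heating']
print(driving_home_recipe(250, 20))   # ['disarm_alarm']
print(driving_home_recipe(900, 12))   # []
```

The point of services like IFTTT is that end users compose these conditionals from a menu rather than writing code, but the underlying model is the same.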
It’s actually quite hard to figure out exactly which organizations are true IoT standards bodies and which are industry groups. Some are consortia trying to set IoT standards, others claim they are standards bodies, but in effect they are committees made up of industry heavyweights. For each of them below, I looked to see whether Cisco was involved for reasons I will explain afterwards:
- IPSO Alliance: “Defining identity and privacy for the next generation of smart objects.” IPSO is an open, informal and thought-leading association of like-minded organizations that promote the value of using the Internet Protocol for the networking of smart objects. (Cisco is a contributor.)
- AllSeen Alliance: “A cross-industry consortium dedicated to enabling the interoperability of billions of devices, services and apps that comprise the Internet of Things.” Qualcomm kicked this off as AllJoyn and then handed the source code to the Linux Foundation, from where the AllSeen Alliance was born. (Cisco is a community member.)
- The Industrial Internet Consortium: “The open membership, international not-for-profit consortium that is setting the architectural framework and direction for the Industrial Internet.” It was founded by AT&T, Cisco, GE, IBM and Intel in March 2014, with a mission to coordinate vast ecosystem initiatives to connect and integrate objects with people, processes and data using common architectures, interoperability and open standards. (Cisco is a founding member.)
- oneM2M: “The global standards initiative for Machine to Machine Communications and the Internet of Things.” Its purpose and goal is to develop technical specifications which address the need for a common M2M Service Layer that can be readily embedded within various hardware and software, and relied upon to connect the myriad of devices in the field with M2M application servers worldwide. (Cisco is a member.)
- FiWare: “An independent open community whose members are committed to building an open sustainable ecosystem around public, royalty-free and implementation-driven software platform standards that will ease the development of new smart applications in multiple sectors.” (Cisco does NOT appear to be a member.)
- OpenDaylight IoTDM: OpenDaylight (ODL) is a modular open software defined networking (SDN) platform for networks large and small, and ODL has an initiative called IoTDM (where DM stands for data management) which aims to deliver open source IoT middleware solutions based on oneM2M over ODL. (Cisco appears to be driving IoTDM.)
- OpenFog: Yes, you read it correctly, we have OpenDaylight and now OpenFog. “A public-private ecosystem formed to accelerate the adoption of fog computing in order to solve the bandwidth, latency and communications challenges associated with IoT.” Its work is centered around creating a framework for efficient and reliable networks combined with identifiable, secure and privacy-friendly information flows between clouds, endpoints and services based on open standard technologies. (Cisco is a founding member.)
- Open Connectivity Foundation: “The Open Connectivity Foundation (OCF) is creating a specification and sponsoring an open source project to unlock the massive opportunity in the IoT market, accelerate industry innovation, and help developers and companies create solutions that map to a single open specification.” OCF will help ensure secure interoperability for consumers, businesses and industries. (Cisco is a member and is on the board.)
- Thread Group: “It’s hard to get devices to talk to one another. And once they do, the connection is often spotty and power hungry. Thread changes all that. It’s a mesh network designed to securely and reliably connect hundreds of products around the home — without blowing through battery life.” Thread has a couple of hundred members in the connected home space. (Cisco is NOT affiliated with Thread.)
- Hypercat: “A consortium and standard to drive secure and interoperable IoT for industry and cities.” The Hypercat specification allows Internet of Things clients to discover information about IoT assets over the Web. With Hypercat, developers can write applications that will work across many servers, breaking down the walls between vertical silos. (Cisco is on the advisory board.)
As is typical with a new and hot sector like IoT, there’s an incredible amount of jostling for position among giants and start-ups alike. There are many competing “standards” (and therefore, by definition, none of them is actually an IoT standard). Most importantly, the fact that Cisco is involved in most of them tells us that the big players are hedging their bets because they aren’t sure which initiatives will emerge as winners and which will die on the vine.
My next article will look at whether these IoT standards bodies will even be relevant in a world where Google (Brillo and Weave), Apple (HomeKit), Samsung (SmartThings), Amazon (Alexa) and Microsoft (Windows 10 IoT editions) are all bringing out their own IoT solutions. Will one of them become the de-facto IoT standard just as Google’s Android became the de-facto open mobile OS?
“Prediction is very difficult, especially if it’s about the future.” – Niels Bohr
While it may be difficult to envision the exact details of our world in the next decade, current trends and developments in technology can inform our vision of the future of IoT. Barring a technological upheaval, how companies engage with IoT-driven solutions to derive just the right insights and manage workflows and automation will shape business practices, industries and human interactions in radical new ways.
Data sharing transforms industries
Today, companies and organizations hoard their data, but the real intrinsic and extrinsic value derived from data emerges when firm‐specific data is combined with exogenous information. This can be seen in the use of weather data to better predict ice cream sales. Cooperation and data sharing within and across industries will revolutionize the next stage of data analytics insights. Traditional industries, such as manufacturing and heavy industries, will evolve into the product + platform as a service. Some enterprises will transform into network ecosystems to leverage the network effect.
Currently, standards are a snarl of proprietary polemics, but the future of IoT is open source — speeding the implementation and cross‐functionality of networks requires a universal set of guidelines. The newly formed Open Connectivity Foundation, as well as AllSeen Alliance and Industrial Internet Consortium, is trying to reduce fragmentation in the industry and bring interoperability to devices and networks. The Open Compute Project’s Telco Project envisions a new 5G IoT platform: scalable, low‐latency, error correcting IoT networks that will cater to billions of devices.
The 500-pound gorilla
Everyone knows that the 500-pound gorilla in the room is security. Consumers are (rightly) concerned about becoming a target through stolen information, while companies struggle with a broad range of troubles, from lost market share from stolen R&D to public backlash against perceived inadequacies in protections. There are a number of solutions already enjoying some success in providing peace of mind and privacy for data, and they will only improve as analytics informs more secure processes and complex algorithms. For example, ownCloud offers customers a private, open source cloud on any server and currently boasts over eight million users. Private data vaults allow users to control the use of their own data: rather than sharing information, companies query the encrypted data vault that returns the data without compromising the consumer’s privacy. The ubiquity of IoT will give rise to the increased use of decentralized, anonymous systems. While cryptocurrency Bitcoin was the first to make the idea of a Blockchain significant, look for others to move towards a decentralized ledger system.
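The data-vault model is essentially "send the question to the data, not the data to the questioner." A toy sketch (the class and its API are invented for illustration; real products layer encryption, auditing and access control on top of this idea):

```python
class DataVault:
    """Toy private data vault: answers queries without releasing raw records."""

    def __init__(self, records):
        self._records = records   # held inside the vault, never exposed

    def query(self, predicate):
        """Return only an aggregate answer, not the underlying data."""
        return sum(1 for r in self._records if predicate(r))

vault = DataVault([
    {"age": 34, "owns_smart_lock": True},
    {"age": 52, "owns_smart_lock": False},
    {"age": 29, "owns_smart_lock": True},
])

# A company learns how many users match its criteria, but sees no record.
print(vault.query(lambda r: r["owns_smart_lock"] and r["age"] < 40))  # 2
```

The consumer's raw records never leave the vault; the querying party gets only the count it asked for, which is the privacy property the paragraph describes.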
Data matters in the future of IoT
Data insights will begin with the strategic outcomes of the analysis instead of simply aggregating data. Companies will invest significant resources in creating an approach to getting the types of insights they need to better inform their decisions. More companies will be deploying parts of their systems to the cloud, and there will be an increasing momentum towards a hybrid approach of using the cloud and computing on the edge.
Man and machine
Science fiction stories are replete with human/machine interactions, and the next 10 years will begin to make some of those visions a reality (of course, all AI should adhere to Asimov’s Three Laws of Robotics). As AI and machine learning become more pervasive, the need to interface with numerous apps and devices will disappear. Eventually, there will be a single AI assistant that navigates the virtual world for us; a virtual Jeeves (or Alfred, for the more adventurous) will identify and interact with the appropriate network or app to deliver the desired information or outcome.
Data-driven decision-making is particularly useful for individual health decisions, an area currently confined to self-monitoring with wearables like wristbands or compression shirts. However, the relationship between individuals and their doctors will be altered forever as people use virtual assistants to make better health decisions. Hidden information that was never within reach of the average person will be served up regularly, allowing personalized control over an individual’s lifestyle.
With the power of a virtual assistant increasingly assuming a variety of roles, individuals will have the opportunity to turn towards self-improvement and the construction of stronger relationships. A personally empowered society will begin to evolve in response to the technology. In Flourish, Martin Seligman draws on his years of research to promote the idea of “positive psychology.” The goal of the positive psychology movement is to transform lives for the better, exactly the direction we are moving in with data and analytics. Seligman’s ideas about what makes a good life — for individuals, for businesses and for nations — will become increasingly important. Improved access to enhanced insights turns society from merely reacting to problems (dealing with negatives) to addressing them before they get out of hand. Individuals will be better equipped to control their own outcomes and use information to make the world around them a better place.
Current trends are driving the future of IoT. Wherever that journey leads will be one more step on the road to creating superior companies, healthier individuals, a stronger society and a more beautiful world.
One of the many shock waves that the Internet of Things has set in motion across different technology markets can be found in telecommunications. Previously, the direction of the telecoms industry seemed very unambiguous: to keep increasing the data rates that can be delivered over the air. This strategic homogeneity made the industry predictable and even stale, with all suppliers pursuing more or less the same goals.
The status quo is now being shaken up by IoT, which is shifting attention from data rates to power efficiency. The latter priority is largely at odds with the former, so the new technologies that this shift is leading to are decidedly low in terms of bandwidth, in order to minimize the power consumption of end devices.
There obviously is still a time and a place for using high-bandwidth communications in IoT applications that require higher data rates and lower latency — such as connected car, smart grid or industrial automation — but all in all, the really transformative stuff is happening at the opposite end of the connectivity space. As an innovation enabler, the emergence of low-power IoT networks could be compared to how the rise of mobile unleashed personal communications and computing from the world of wires.
The key difference, though, is that this time around the Things are being unleashed from electricity wiring. Advances in low-power, low-bandwidth telecoms are allowing enterprises and makers to develop “connected” and “smart” products that rely neither on access to a power supply nor on frequent recharge cycles. They can be built to run for several years on batteries and, furthermore, those batteries can be small enough to permit significantly more discreet and flexible form factors for the end device than could have been envisioned only a few years ago.
Let’s take a look at the most important developments that the quest for low-power telecoms has recently produced:
- Thread is a new 802.15.4-based protocol that deserves to be highlighted for the mere fact that it has triggered a whole new sense of urgency among the vendors that drive the following three technologies on the list. Designed for home and building automation, Thread has generally gotten off to a good start, and its certification program is currently vetting the first cohort of compliant devices. Once the commercial versions of such products start hitting the market — probably in the second half of the year — we can start assessing how well Thread’s implementations actually match the admittedly lofty expectations.
- Bluetooth has had its IoT-friendly incarnation, Bluetooth Low Energy, available already for half a decade, but its relevance to developers has been quite seriously inhibited by its short range and point-to-point nature. This year, Bluetooth as a standard should finally get its eagerly awaited capability for running mesh networks. That is an addition that could, especially, divert the (hitherto over-hyped) beacon space into an interesting direction.
- Wi-Fi HaLow is the framework for the low-power, longer-range implementations of Wi-Fi, run in the increasingly sought-after 900-MHz frequency bands. Given the relative ubiquity of Wi-Fi in homes, buildings and urban spaces, HaLow could certainly reshuffle the connectivity market for a variety of IoT applications. On the downside, the progress towards it has been undeniably slow; for example, the certification program is scheduled for not earlier than 2018. That is starting to be rather late, given everything else that is going on.
- ZigBee 3.0 is the latest version of the ZigBee standards, whose infamous fragmentation has so far made the technology as a whole amount to considerably less than the sum of its parts — or application profiles, to be more specific. Ratified at the end of 2015, the third generation aims to harmonize the profiles substantially and thus improve device interoperability. The case for developing Thread from scratch had a lot to do with ZigBee’s shortcomings, so there is something of a last-ditch feel to the attempt.
- Low Power Wide Area (LPWA) is an umbrella concept for a group of new technologies that have been built to combine power efficiency and inexpensive hardware components with the operational benefits of wide area networks. Machina Research further splits the concept into two subcategories: with “dedicated” LPWA consisting of the purposely designed technologies such as LoRa, RPMA and Sigfox, and “evolutionary” LPWA covering the alternatives that have been developed as upgrades to existing ones like LTE. During 2015, LPWA became one of the hottest topics on the supply side of IoT. Collectively, these networks could well prove a real game changer, although there are still also certain question marks related to their technical, commercial and even regulatory feasibility.
- Other major developments in IoT connectivity can be found mostly in low-power mesh networking. Wi-SUN is an 802.15.4-based standard engineered especially for utility networks, and it has shown a lot of early promise in that sector. Its most high-profile implementation, Silver Spring Networks’ Starfish, has just been opened up to serve developers outside the company’s own customer base. At the same time, Wirepas is a vendor that has come up with a software solution that seems able to dramatically boost the scalability of mesh networks, which has always been their weakest spot from the enterprise perspective.
By far the best part about such multipolar innovation is that the involved technology camps are keeping each other on their toes, and progress by one pushes the others to up their game. It should also keep the bargaining power of key vendors or service providers in check, preventing rent seeking and other practices that would ultimately stifle innovation on the enterprise level.
The objective of the connected home experience is to enhance the lifestyle of the consumer, delivering an anytime, anywhere, borderless lifestyle where all devices work together whether the applications are entertainment, home control or energy management. The connected home makes it easier and simpler to accomplish what people are already doing today. Many daily activities like eating, sleeping, driving to work, listening to music, watching television, reading a book or going shopping can be enhanced by connected technology.
For the promise of this truly connected home to be realized, it is imperative that connected devices can exchange data among themselves as well as with third-party applications and services. Common barriers to smart home market growth result from interoperability challenges, including inconsistent and limited connectivity capabilities, lack of contextual richness in data, expensive devices with long lifespans, and point-to-point integration strategies that quickly become unmanageable.
Parks Associates has identified the following key challenges that the industry must overcome to accomplish a true smart home experience.
Bridging different home area networks
The past few years have seen a lot of activities by players in the Internet of Things (IoT) space to address the issue of interoperability. A number of industry alliances have formed to work towards creating common standards and communication protocols to allow devices, cloud services and applications to communicate and exchange data.
The smart home landscape is currently littered with a plethora of proprietary and open-source communication protocols, each attempting to connect devices with one another.
The leading communication protocols in use in the smart home space include Z-Wave, ZigBee, IEEE 802.15.4, 6LoWPAN, DECT ULE and Low Power Wi-Fi. Each of these protocols has its own strengths and weaknesses, which make them suitable for specific use cases. Parks Associates believes that there will not be a single winner of this war of protocols; all will continue to co-exist for the foreseeable future.
Integrating platforms and connected products
In addition to the communication protocols, a plethora of industry standards attempt to create a framework for integrating devices, services and applications. These standards are either backed by a group of technology companies or come from influential platform players such as Apple or Google. Small device makers and app developers, which have limited resources, must weigh the pros and cons from both technology and business perspectives. In most cases, picking a side is an overwhelming task, and supporting multiple standards in order to hedge their bets is a costly one.
Popular industry standards include:
- Thread, backed by the Thread Group
- Brillo and Weave, from Google
Finally, there is an additional challenge of bridging apps used for different connected devices. As consumer adoption of smart home devices accelerates, the number of connected devices in a household is also expected to grow. Mobile apps are the primary interface for these connected devices in a smart home. Parks Associates data indicates that more than 80% of smartphone/tablet users who use at least one smart home device have downloaded mobile apps for these devices.
The frequency of smart home app use is on the rise too. Parks Associates research shows that nearly half of broadband households with a smart motorized garage door opener use a smartphone, tablet or computer to control the opener daily or almost daily.
In this context, the need to navigate multiple apps to control connected devices within a smart home acts as a deterrent to adoption. A number of technology solution providers have developed hubs or gateways with a corresponding mobile application that serves as a dashboard for the connected home. These solution providers come from a multitude of backgrounds, including service providers, home improvement retailers and security companies, as well as startups.
As the smart home market increasingly moves toward a battle among multiple ecosystems led by influential companies from the technology sector and the service provider industry, it is urgent that the industry achieve interoperability at all three levels: device-to-device, device-to-platform and app-to-app. Close collaboration among smart home ecosystems could minimize the danger of a fragmented user experience and bolster the healthy growth of this exciting market.
Larry Prusak noted that “the only thing that gives an organization a competitive edge, the only thing that is sustainable, is what it knows, how it uses what it knows, and how fast it can know something.”
Information is foundational to business success, but understanding that data, and acting on it, is not always a simple matter. Using IoT allows your enterprise to access real-time data, and analytics lets you extract insights from it. But then what? At IBM's InterConnect 2016, Chunka Mui, managing director of the consulting team Devil's Advocate Group, laid out five key lessons to "stress test" your enterprise innovation approach. How can companies drive innovation, create value and win loyal "fans"? Mui's key lessons were to embrace the gap, think big, start small, learn fast and cultivate a sense of "patient" urgency.
Time and technology wait for no man
Mui emphasized the need to embrace the gap: this chasm is the difference between the rapid development of technology and the incremental change desired by companies. He stressed that embracing the gap is not simply deliberating about ways you can serve customers’ needs or wants. It’s about approaching that chasm and then seizing the opportunity that lies there. Mui noted that a failure to embrace technology quickly limited enterprise ability to thrive in the marketplace. He noted that IoT is augmenting a range of disruptive “gap” technologies, such as social media, mobile devices, cloud computing, and analytics and artificial intelligence.
It’s bigger than you
Uber. Lyft. They've had an astounding effect on the taxi industry (and are being banned in certain areas because of it), but they are also revolutionizing cars and travel in ways never thought possible. Mui noted that in order to "think big," enterprises must also be willing to move in directions that were never considered before. Businesses are being disrupted by technology (such as the shift to mobile) but are not investing enough time and consideration in new directions, or even in completely novel problems. Startups have the advantage of a clean slate; they are small and agile, able to take advantage of the technology to serve a particular customer demand. Larger, more established companies may need to approach the marketplace with a clean slate in order to use innovation to drive growth. "Reimagining the customer experience," Mui added, is key to developing "big" ideas.
The journey of a thousand miles…
After considering ideas about serious market disruption, Mui's suggestion to take small steps seems incongruous. However, his reasoning is based on a solid approach to development: break the idea into smaller pieces and test them. Each small test provides a mountain of data that can be used to gauge whether the product or service is going to be successful. Many companies do not learn enough from experimenting with their prototypes (or from identifying the best possible locations to test them). Starting small can mean your company invests R&D resources in your app or pilots a new connected device on a small scale. But none of this will be successful without his fourth step, which is…
Learn fast
It’s a given that enterprises need to be in a constant state of innovation: they need to be relentlessly adding to their knowledge base and pursuing new avenues for meeting customer needs. Mui pointed to Bob Lutz’s 931 approach to car design as one method for moving swiftly through the research and development phase. The most important part of the process is that the information derived from each step is understood and analyzed before moving into the next phase of development. IoT sensors can provide significant amounts of data, and companies can realize value by implementing data-driven tweaks throughout the development cycle. (Currently, only about 1% of IoT data is actually used.)
Hurry up. Now wait.
Perhaps the most difficult lesson to implement is Mui’s final one: knowing when to move an innovation into the market is a highly sought after but sometimes elusive talent. The perfect timing to release your innovation is an amalgamation of well-informed analyses, a deep understanding of market trends, a thorough comprehension of your competition, rich, rounded insights about your customers — and yes, even pure instinct. History is littered with visionaries who were well ahead of their time — the Da Vincis and Teslas who imagined truly innovative creations … that the world was not ready for. Analytics can offload some of this problem by pinpointing what your customers desire now and predicting patterns of behavior that can inform your launch strategy.
“Innovate or die” is a business maxim that pushes some companies into the red, but moves many others to develop the profitable, useful creations that disrupt their industries. Mui’s lessons underscore a key aspect of modern enterprise innovation: it is completely data driven.
The first thing you need to do when designing an IoT system is to get everyone onboard with the fact that it won’t work — at least every once in a while. Even if you’ve spent the time and expense to achieve several 9s of reliability for the stuff you build, there will likely be a dependency — somewhere in the chain — that is simply out of your control and causes some sort of IoT downtime.
IoT systems are often enabled by several supporting services on the back-end. Third-party services such as network providers, messaging platforms and even the core infrastructure of the Internet add dependencies between the sensors in the field and the dashboards they feed. While these supporting services are convenient and often necessary, they are managed by someone else and are consequently a risk. Maybe an IT person unknowingly changes the Wi-Fi configuration. Or maybe an ISP blacklists traffic from your devices, or there’s a widespread cellular outage. It’s important to identify dependencies, minimize the risk where possible and set expectations accordingly. The larger the system scales, the more distributed it becomes and the more likely it is that the spaces between “your stuff” will hiccup and you’ll have to deal with it. This means you have to plan for failure and handle it well.
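One practical way to "plan for failure" around dependencies you don't control is to never let a third-party call hang your own pipeline. The sketch below is illustrative, not a prescribed implementation from the article: it wraps an arbitrary dependency call with a timeout and a fallback value (the function name and parameters are hypothetical), so the rest of the system degrades gracefully instead of stalling.

```python
import concurrent.futures

def call_with_timeout(dependency, timeout_s=2.0, fallback=None):
    """Call a third-party dependency, but never let it hang the pipeline.

    If the dependency is slow or unreachable, return a fallback value so
    downstream processing can degrade gracefully instead of stalling.
    """
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    future = pool.submit(dependency)
    try:
        return future.result(timeout=timeout_s)
    except (concurrent.futures.TimeoutError, ConnectionError):
        return fallback  # dependency is down or slow; report a degraded value
    finally:
        # don't block waiting on a hung dependency thread
        pool.shutdown(wait=False)
```

A timeout is only one piece: a production system would also log the failure and surface it in the health dashboard, so the team can tell "our stuff" from "their stuff" quickly.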
“Handle it” doesn’t stop with technology. It’s important — arguably more important — to set the business expectation that IoT downtime could be a natural occurrence. It should spark conversation around expectations and the cost/benefit of making something truly highly available. You have to weigh the cost of support issues against the ROI of the opportunity at large. When margins are thin, even a single truck roll could kill the ROI for several units. It’s definitely not easy to quantify, but having those discussions early on will go a long way toward minimizing the overall impact of issues.
Here’s a real scenario: An IoT system reported that several devices were not communicating after a routine firmware upgrade. The team determined that most of the devices were working as expected, but several devices were indeed offline. After a long troubleshooting session, they found the outage was coincidentally timed with a widespread DNS issue caused by a totally unrelated event which prevented the devices from communicating for several hours. Of course, the customer wanted to fix it immediately, but they had no way to communicate with the devices until the public DNS service was restored; the customer simply had to wait it out. Ironically, not long before, the same DNS service enabled them to avoid a previous outage by quickly pointing all of the devices to a failover system. Point being, dependencies aren’t inherently bad, but you should understand the tradeoffs.
Identify types of IoT downtime outages: Blips, blocks, bombs
A good practice is to walk through the entire system and identify potential types of outages. This could be as simple as walking through a block architecture and asking, “What happens if this block stops working?”
For each block, ask yourself questions in this format: if this block fails …
- Can units still be shipped/produced?
- Would it prevent someone from doing their job in the field?
- Can units still send data?
- Can we still communicate with the devices?
- Are we dropping data?
- Will it impact accuracy or quality of data?
- How does this outage affect billing?
- Will we have to visit the site?
Asking these types of questions will likely identify situations that fall into the category of blip, block or bomb.
Blips are the most common errors associated with cloud computing, typically called “transient errors.” Blips are typically short, on the order of seconds or minutes. You should plan for blips as if they are a common occurrence; implementing retry logic is typically all that is required. Blips often happen when a service is busy and access to it is temporarily throttled; regional network congestion is another example. From a UI perspective, a little feedback can go a long way: let users know the system hit a blip and is still trying.
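The retry logic for blips is usually a small amount of code. Here is a minimal sketch (function names and defaults are illustrative, not from the article): retry the call a few times with exponential backoff, adding jitter so many clients hitting the same blip don't all retry in lockstep.

```python
import random
import time

def with_retries(operation, max_attempts=4, base_delay=0.5):
    """Run an operation that may hit a transient error ("blip"), retrying
    with exponential backoff and jitter before giving up."""
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except ConnectionError:
            if attempt == max_attempts:
                raise  # the blip outlasted our retry budget; escalate
            # back off exponentially, jittered so clients don't synchronize
            time.sleep(base_delay * (2 ** (attempt - 1)) * random.uniform(0.5, 1.5))
```

If the final attempt also fails, the error propagates, which is the point at which the UI feedback ("we hit a blip, still trying") should hand off to block handling.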
A block is typically infrequent, but a significant step up in severity. Blocks are typically longer, on the order of minutes or hours. Think cellular outage or the DNS issue explained earlier. In these cases, retries fail and you’ll need to implement something like a pipe and filter pattern to queue up work that is waiting for the block to clear. Another common way to handle a block is to store and forward. These methods allow the stuff on either side of the block to continue normally and minimize any data loss or downtime. Blocks are extremely disruptive and potentially costly if left unhandled. For example, a service technician might not be able to do his job because the onsite workflow takes a dependency on a system that is not available.
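The store-and-forward approach mentioned above can be sketched in a few lines. This is a simplified illustration (class and parameter names are hypothetical): readings are buffered in a bounded local queue while the uplink is blocked, and each one is discarded only after a confirmed send, so a block costs latency rather than data.

```python
import collections

class StoreAndForward:
    """Buffer readings locally while the uplink is blocked; flush when it clears."""

    def __init__(self, send, max_buffered=10_000):
        self.send = send  # callable that transmits one reading; raises on failure
        # bounded buffer: if the block lasts too long, the oldest readings drop
        self.queue = collections.deque(maxlen=max_buffered)

    def submit(self, reading):
        self.queue.append(reading)
        self.flush()

    def flush(self):
        """Try to drain the queue. Returns True if fully drained."""
        while self.queue:
            try:
                self.send(self.queue[0])
            except ConnectionError:
                return False  # still blocked; keep everything queued
            self.queue.popleft()  # discard only after a confirmed send
        return True
```

The bounded buffer is a deliberate trade-off: on a constrained device you must eventually choose between dropping the oldest data and running out of storage.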
Bombs can occur when the previous situations persist to the point where they result in a cascade of failures. A bomb is as bad as it gets and usually something that requires manual intervention to recover. A cascade might look something like this: A cellular outage causes all of the devices in a huge region to reconnect at the same time, creating a denial of service on the servers which, in turn, causes the servers to restart … which starts the cycle all over again. Your goal here is to identify potential bombs and put barriers in place to convert them into blocks or blips where possible.
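One common barrier against the reconnect cascade described above is randomized backoff on the device side. The sketch below is an illustration (the function name and constants are assumptions): each device waits a random, exponentially growing delay before reconnecting, so a fleet that dropped offline together does not storm the servers together.

```python
import random

def reconnect_delay(attempt, base=5.0, cap=300.0):
    """Seconds a device should wait before reconnect attempt `attempt`.

    Exponential backoff spreads load over time; full jitter (a uniform draw
    from zero up to the backoff ceiling) spreads it across devices, so a
    regional outage doesn't turn into a self-inflicted denial of service.
    """
    return random.uniform(0, min(cap, base * (2 ** attempt)))
```

Pairing this client-side barrier with server-side throttling converts a potential bomb back into a block: reconnections trickle in over minutes instead of arriving as one spike.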
In any case, feedback to end users is a huge help and will go a long way towards mitigating the frustration associated with outages and IoT downtime. Don’t leave a user in the dark if you know something isn’t working. Let them know so they can do something else instead of hitting a button over and over.
Taking the time to think through these scenarios will help everyone appreciate how outages impact their part of the business. This is as much an educational and discovery process as anything. Probably the most important suggestion is to give yourself health and diagnostics visibility focused on dependencies. That visibility can be a huge time saver; without it, you’ll end up spending time hunting down something you can’t fix anyway.