There’s a subtle reality that often gets overlooked during the ideation phase of an Internet of Things solution. The “Things” are real, physical products that have to be manufactured, shipped, installed, connected, serviced, billed, etc. Enterprise workflow supporting all of those tasks is critical to commercializing IoT products. These workflow and logistics issues are as important as the product itself and can become showstoppers if they are an afterthought.
It seems the mobile app industry has set an expectation that scale can happen practically overnight. While there is a great deal of innovation and technology that IoT is leveraging from the app industry, there’s one fundamental difference. Software-only apps (like your favorite puzzle game) leverage a hardware platform that is already deployed. Mobile app-only developers are fundamentally decoupled from the manufacturing, distribution and lifecycle of the supporting hardware platform. However, in the case of an IoT product, the hardware is often the manufacturer’s physical product, and deploying it is a huge part of commercialization. The point here is that the simple physicality of IoT requires careful consideration from the very beginning of the design process. One could argue it warrants as much time and consideration as the core features of the product itself.
Think through what’s required to ship a physical product. Your “Thing” starts as raw materials which must be manufactured, assembled, provisioned, tested, boxed, shipped, unboxed, installed and activated. Most of those steps require human interaction of some sort, a far cry from clicking “download” in an app store. Not to mention, this is just initially deploying the product. If the product has an issue once in the field, you might have to update it, replace it or remove it — which may require additional human intervention. And sending a service technician isn’t always a practical option once the device is deployed remotely or at high scale.
Time is one of the key considerations. On the front side, you have to consider the time required to simply procure parts. Availability can be volatile, especially for the latest and greatest tech. You may have designed the most innovative and cost effective system, but it doesn’t matter if you can’t fill your bill of materials because the key parts are in high demand.
Cost is probably an obvious consideration, but its sources are not always as obvious. Pennies matter at scale. Minor decisions in one area can have a major impact in other areas. For example, you may save both time and labor costs by purchasing components that are pre-certified or pre-tested, allowing you to omit tasks during manufacturing. That can add up to a major impact downstream when you multiply small time and cost savings a few hundred thousand times.
One of the most elusive but, in my opinion, most important considerations is the actual installation experience. Simply getting the system up and running out in the field is often more complicated in practice than it seems on paper. Your goal should be to make the field installation as simple and smooth as possible. You can’t take a simple process, make it more complicated and error prone and expect to see widespread adoption. Let’s say the existing process is to mount the product and connect three wires. If that process takes the installer 30 minutes, that’s your baseline. The “new and improved connected product” will likely involve some workflow changes, but you should be ultra-sensitive to how you impact the current process. If that same installer now needs to bring a laptop and a USB cable and find a 25-character Wi-Fi password which must be typed in by hand … you see where I’m going. You’ve introduced additional variables into the process that are prone to fail. At some point, the laptop battery is going to die or the Wi-Fi password changes and this installer’s productivity level tanks. You end up spending all of your time on the phone providing support rather than creating the next great feature.
Each step of the process will also include some sort of B2B data integration. Whether you’re procuring parts, loading firmware, matching device identifiers to shipping records or activating device connectivity, you’ll need to communicate with a backend process. Enterprise integration for supporting systems monitoring inventory, scheduling service visits and billing will likely make up a significant portion of the overall professional services required when commercializing IoT products.
Consider the task list required to meet the analyst predictions for 2020. If you expect, on the low side, 25 billion connected things by 2020, we have an amazing amount of work to do. I’m talking about all of these logistics tasks required to simply deploy billions of things. Even accounting for the 4.5 billion already connected, if we started immediately, we’d have to connect a little over 11 million things every day for the next five years to meet those expectations. The logistics alone is a big mountain to move.
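The back-of-the-envelope math behind that daily figure is worth making explicit. A quick sketch, using the analyst estimates cited above (25 billion by 2020, 4.5 billion already connected):

```python
# Sanity check: how many things per day must come online to hit
# the 25-billion-by-2020 prediction, starting from ~4.5 billion
# already connected, over a five-year run-up.
target = 25_000_000_000   # low-side analyst prediction for 2020
already = 4_500_000_000   # things already connected
days = 5 * 365            # five years, ignoring leap days

per_day = (target - already) / days
print(f"{per_day:,.0f} things per day")  # roughly 11.2 million
```

Every one of those 11-million-plus daily activations drags the full logistics chain along with it: parts, assembly, shipping, installation and activation.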
I believe the mountain is already moving, but as more people jump at IoT opportunities, it’s important to remind the community that we’re not shipping ones and zeros across a wire. IoT is about shipping physical products that require the coordination of people across several organizations and disciplines. It’s imperative that we consider commercializing IoT and the workflow beyond the basic technical use case and realize that small decisions made in one area can have a huge impact on the process elsewhere. Keeping that in mind will help ensure everyone who has to interact with the product — across the entire spectrum — has a good experience.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.
In the entertainment world, companies first focused on personalization to provide the best customized content search and discovery experiences, as well as to provide a competitive differentiator. Personalization and customization are becoming increasingly important elements of success, and companies are using these features and technologies to define services, derive new revenues and build a loyal base of customers.
Personalization has the potential to positively impact monetization. Theoretically, consumers will purchase, rent and subscribe to more content when that content is recommended to them based on their own preferences and search history, as well as the preferences and history of friends or other people with similar interests and demographics.
Controlling the smart home
The growth of the smart home has been, in many ways, slowed due to the difficulty of producing compelling mass market value propositions for thousands of personal and unique use cases. There are use cases that appeal to a mass of people, such as home security devices and smart thermostats, but where one household might want security and not energy, another household might want energy but not GPS. The package of solutions that appeals to one household is likely to be different from the package of solutions that works for another household.
A wide range of devices are vying to become the go-to device in the home, including streaming media devices (e.g., Apple TV), personal assistants (e.g., Amazon Echo) and home robots (e.g., Robotbase’s in-home assistant). TV makers are also interested in positioning their devices as the main control hub in the home.
The ultimate smart home controller will be multi-purpose with an open platform supportive of third-party apps and will be able to control multiple use cases. These elements are necessary to support the trend towards the personalization of almost every connected experience.
One example is the Amazon Echo, a personal assistant controlled by voice. The device provides weather information, sports scores, shopping and to-do lists, streams music and reads recipes. The device also supports other smart home devices, such as Philips Hue lightbulbs and Belkin’s WeMo product line. With a growing list of smart home partners, Amazon is developing the Echo into a smart home controller.
A shift in relationships
As devices are added to the home, CE makers are trying to own the relationship with the consumer instead of working through service providers, who have long owned this relationship because they aggregate and bill for content and related services.
Traditional pay-TV providers aggregate local channels and offer bundled payments for voice, Internet, video and smart home services. Consumers generally believe that bundling provides lower costs for such services; cable companies and telcos reinforce that notion through their pricing structures. While device makers generally do not sell services, the Roku Account is a one-stop payment service that consumers can use to pay for content services provided by content partners that have elected to process transactions through Roku. Consumers can pay content providers through the Roku account for content subscriptions, rentals and purchases. One-stop pay on a streaming device simplifies the process, provides a better user experience and provides a sense of security, as consumers do not need to share their personal credit information with multiple companies. It is expected that similar conveniences will become available on personal assistance devices.
It wasn’t an April Fools’ prank; Alibaba did overtake Wal-Mart as the world’s largest e-commerce retailer this April. What does this have to do with IoT or logistics? Jack Ma, the founder of Alibaba, credits the Internet and the smartphone as the keys to this success. A majority of the transactions came from mobile phones. How else can you take orders worth $1.5B in 12 minutes from anywhere? Would Uber be possible without a smartphone? New business models built on the Internet and smartphones are changing how we order, receive and pay for things. Apple and Steve Jobs deserve credit for making the smartphone such a capable platform.
There are many similarities between the evolution of the smartphone from the personal digital assistant Apple envisioned and what is happening in IoT. Some lessons learned from developing the smartphone also apply to IoT development and its continuing success:
- The thing has to be easy to use.
- It takes an ecosystem for a thing to succeed. In the case of a smartphone, it was the Internet, telecom networks, software services and millions of applications developed by others and zettabytes of other people’s data stored in data warehouses scattered across the globe.
- IoT is permissionless innovation. IoT is not controlled by any single company.
Unicorns and rabbits running over gorillas
The Alibaba-Wal-Mart example just makes the point that no matter how big you are, you cannot rest on your laurels. There are unicorns, rabbits and all kinds of things disrupting existing markets or creating new ones. Nothing is sacred.
We are now moving into the next phase in IoT with a major shift. Billions of devices and sensors will be deployed in systems of ecosystems. Billions of devices and sensors will have to be managed securely. We will be moving from things with human interfaces to devices and sensors without human interfaces. This requires a different thought process. Devices and sensors embedded in driverless vehicles, machines, drones, ocean containers/reefers, farms and cargo moving all over the world create more opportunities for change and employment than anything we have seen yet.
The potential of IoT is much bigger than what the Internet and the smartphone enabled. However, to have a meaningful conversation about a domain as broad as IoT, I have chosen IoT in Logistics (IoTiL) as the scope in which readers of this community may want to engage and help grow the ecosystem.
Logistics is more permanent than death and taxes. As long as there are people on this planet, logistics is essential for our survival and happiness. Everything physical that is made, grown or sold involves logistics. Logistics is a cost sensitive business. The logistics ecosystem is a complex web of multi-level outsourced services dispersed globally. The opportunities for IoT in logistics are infinite.
A single company selling $476B worth of goods on the Internet is only half the effort. Delivering the ordered goods to the satisfaction of the customers in the shortest time is the other half.
3rd party information logistics (3PIL)
A new industry is in the making. What e-commerce did to brick and mortar businesses, 3rd party information logistics (3PIL) will do to 3rd party logistics (3PL). Digital networks of supply chains will be created by devices and sensors automating data collection outside the enterprise. Devices communicating at digital speeds 24/7 without human touch will use new IoT dependent ecosystems and applications with less complexity for humans to deliver and unlock value not possible before.
Amazon’s recent investment in its own logistics emphasizes the growing importance of logistics to support its global growth, address new delivery requirements and reduce logistics costs. While Amazon invests in hyperscaling its logistics, how can smaller businesses and vendors with fewer financial resources step up their logistics? All e-commerce businesses, big or small, including Wal-Mart, Alibaba and e-commerce unicorns such as Flipkart and Snapdeal in India, have to step up their logistics to remain competitive. IoT in logistics is the new digital foundation that future digital supply chain networks will be built on.
Given the hype around IoT, it is difficult for logistics decision makers to know what works. The devices and applications are evolving rapidly. As funding in the industry slows down, the emphasis will shift to monetization of recent innovations in IoT.
How can we collaboratively accelerate successful IoT implementations?
How can we use IoT for good and also be profitable? The food industry, for example, struggles with waste and spoilage as high as 40% in many countries.
How can IoT benefit 99% of the people? National interests must be addressed for IoT to be successful.
If you want to engage in building an ecosystem using IoT in Logistics (IoTiL) please respond with your comments.
When a concept is as far-reaching as the Internet of Things (IoT) — involving literally billions of elements — we need principles for organizing and making sense of the data it communicates.
That’s where an emerging IoT subcategory known as the “location of things” comes into play. Location is a vital dimension of the IoT concept that encompasses the ability of “things” to sense and communicate their geographic position. In this context, location acts as an organizing principle for anything connected to the Internet.
With more and more “things” connecting to the Internet, the amount of data coming in is overwhelming. We need filters to pull out the data that is valuable for us. We are all interested in the things that relate most closely to our context — whether personally or for our work. Location is the key to context.
The birth of what would become the “location of things” dates back more than 20 years, with the introduction of global positioning systems. GPS technology became fully operational in 1995, blazing the way for a new paradigm in positioning. In its earliest uses, GPS helped the U.S. military navigate across the globe with unprecedented precision. In May 2000, the GPS system was opened up to the general public. It then took another decade for GPS receivers to become small enough and affordable enough to find their way into our smartphones.
The breakthroughs afforded by GPS paved the way for the location-based services we enjoy each day: Google Maps, Uber, Waze, Foursquare and others. These multi-billion-dollar services were all enabled by the ability of a smartphone to locate itself with fair accuracy and precision using GPS technology.
Most importantly, this new standard made it possible to move beyond the concept that only people could know where things were located. The world opened up when “things” had the capacity to know where other “things” are located.
In all of this, location simply acts as a search engine for geographic data. Before we had Internet search engines, users had to know what they were looking for — perhaps down to the precise URL to the webpage they needed. Then along came Google and other engines to do the work for us. And just as Google and the others help pinpoint data, location data helps organize the billions of internet-connected devices by location based on the sensors and other location-centric elements in them.
What this all means is GPS is just one part of “location-based services” or LBS. Lots of our devices and sensors — along with other assets, people and content — are inside buildings, where GPS has no real reach. That’s where indoor positioning systems (IPS) are creating the next big buzz within the location of things. As IPS technology continues to be enhanced and as more apps that harness its power become available, we’ll see a slew of new data becoming part of the location of things.
With IPS, the position data gathered can help with everything from finding devices and equipment to navigating indoor spaces such as shopping malls step by step, managing warehouse logistics, geofencing sensitive data, assisting in social interactions and more.
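To illustrate how position data drives a feature like geofencing, here is a minimal sketch. The coordinates and radius are invented for the example, and a real IPS would typically report local indoor coordinates rather than latitude/longitude, but the shape of the check is the same:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/long points, in meters."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(device, center, radius_m):
    """True if the device's reported position falls within the fence."""
    return haversine_m(*device, *center) <= radius_m

# Hypothetical 100 m fence around a warehouse entrance
fence_center = (40.7128, -74.0060)
print(inside_geofence((40.7130, -74.0058), fence_center, 100))  # True
```

A fence crossing would then trigger whatever policy applies: unlocking access to data, logging an asset movement or alerting a logistics system.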
What will happen when everything knows its location? Watch for future columns from me, where I’ll explore how the location of things will impact and change our lives in a near future. Here’s a hint: finding “things” is just the tip of the iceberg.
In the tech community, you’ll be hard pressed to go a week without hearing the term “Internet of Things.” Advancements in this area have already altered our lives and will continue to do so as more and more devices become connected. IoT has had huge implications on organizations and the general population, but what about the security for connected devices? Security is of the utmost importance, but there are many myths surrounding IoT security. Below you’ll find a roundup of some of the top IoT security myths I hear regularly. And I’m here to help put a rest to these myths.
10. “Tiny IoT devices don’t have power to do really powerful security.”
Even 1980s-grade 8-bit, 8 MHz chips with only 2 KB of RAM can do elliptic curve cryptography with a 256-bit key length and are effectively as strong as RSA crypto with a 2,048-bit key length, which is strong enough for U.S. “Secret” level national security information. That crypto uses so little battery power that signing or verifying data on the hour, every hour, for twenty years would consume only a slice of an AA battery.
9. “Security is too complicated, especially in IoT. You can never win.”
It’s true that effective security never stems from any single silver bullet. Instead, just as most good houses need a few walls, a roof and a floor, effective IoT security can be composed from a short list of crucial ingredients:
- Good crypto to protect the authentication and potentially protect the confidentiality of data
- Cryptographic verification of any and all code and configuration before permitting them to run
- Third-party runtime security by security professionals to mitigate any vulnerabilities in the code
- Over-the-air management capabilities, including update and software inventory management, telemetry and policy management for security agility
- Security analytics to find and fight sophisticated adversaries who don’t trip any alarms
These ingredients are simple and strong enough to protect top brands against the best attackers.
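The second ingredient, verifying code before it runs, can be sketched in a few lines. Production devices verify a digital signature chained to a hardware root of trust; this simplified example uses a bare SHA-256 digest, and the firmware bytes are made up, but it shows the shape of the check:

```python
import hashlib

def verify_firmware(image: bytes, expected_digest: str) -> bool:
    """Refuse to boot an image whose digest doesn't match the trusted value.

    Real devices verify a digital signature rooted in hardware; a bare
    digest comparison is shown here only to illustrate the flow.
    """
    return hashlib.sha256(image).hexdigest() == expected_digest

firmware = b"hypothetical firmware image v1.2"
trusted = hashlib.sha256(firmware).hexdigest()  # provisioned at manufacture

print(verify_firmware(firmware, trusted))         # True: boot proceeds
print(verify_firmware(firmware + b"!", trusted))  # False: tampered, refuse
```

The same gate applies to configuration blobs: anything unverified simply never executes.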
8. “Can’t update these devices.”
Many devices are difficult to update, but almost none are impossible. Industrial systems are deployed for 19 years on average. Cars and medical equipment are similarly designed to last decades. Now, we see industrial equipment vendors issuing updates for multi-decade-old equipment as businesses bank on the integrity of those devices. We see the same for medical equipment, ATMs, point-of-sale devices, retail kiosks and now even cars.
7. “Security is too expensive for the billions of devices we deploy.”
At scale, security often costs only dimes per connected device. For any connected device north of $20, that is entirely affordable, and it would be reckless to jeopardize your brand by skipping or skimping on security. Some consequences are too expensive to risk when prevention is pocket change.
6. “We have air gaps, gateways & network segregation protecting us.”
Nearly all systems are connected in ways that their creators might not know, but that attackers quite creatively find. This has been demonstrated repeatedly on military, intelligence and critical infrastructure systems, including, but not limited to, Stuxnet. Last year, an attack that damaged a steel mill blast furnace in Germany went straight through a gateway designed to protect the operational network from exactly such attacks. Gateways help reduce risk, but are not enough to provide adequate protection alone. Air gaps are rarely as isolated as believed, and VLANs and other logical separation are even less effective. For high-value systems, harden them from the inside and don’t gamble on gateways, air gaps and network segregation alone.
5. “Blockchain vs. PKI”
Blockchain is a great ledger system for recording transactions and for digital (and physical) objects to carry such ledgers as they go. Unfortunately, most people forget that the ledger-level core of a blockchain rests on a lower-level foundation of traditional cryptography: every transaction is signed using conventional crypto operations, libraries, keys and credentials. Bitcoin, for instance, uses elliptic curve crypto with a 256-bit key strength, the same as often advocated for IoT systems with or without blockchain-style ledger needs. Key management is the Achilles’ heel of most cryptosystems. That’s why more than a billion IoT devices already use the world’s most proven key management system: a certificate authority offering managed public key infrastructure (PKI). Good PKI in the lower-level foundation makes the ledger-level core of a blockchain stronger. In other words, blockchain is at its best when it leverages good PKI.
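To see why the ledger layer rests on ordinary crypto primitives, consider a stripped-down hash chain. This toy deliberately omits the signatures, keys and consensus a real blockchain needs; it only shows that every link in the ledger is a plain cryptographic hash operation:

```python
import hashlib
import json

def append_block(chain, transaction):
    """Link a new entry to the chain by hashing it with its predecessor."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"tx": transaction, "prev": prev_hash}, sort_keys=True)
    chain.append({"tx": transaction, "prev": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})
    return chain

def chain_is_valid(chain):
    """Recompute every link; tampering with any entry breaks the chain."""
    prev = "0" * 64
    for block in chain:
        payload = json.dumps({"tx": block["tx"], "prev": prev}, sort_keys=True)
        if block["prev"] != prev or \
           hashlib.sha256(payload.encode()).hexdigest() != block["hash"]:
            return False
        prev = block["hash"]
    return True

ledger = []
append_block(ledger, "device A -> device B: reading 42")
append_block(ledger, "device B -> cloud: batch 7")
print(chain_is_valid(ledger))  # True
ledger[0]["tx"] = "tampered"
print(chain_is_valid(ledger))  # False
```

Everything above the hash, including who may append a block in the first place, comes down to keys and credentials, which is exactly where managed PKI earns its keep.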
4. “We just need vendors and standards groups to solve this faster.”
Vendors and standards groups are making progress, but that process takes time. Unless customers start asking for the types of security they need, such as the “ingredients” mentioned above, equipment vendors will continue selling equipment without security or, more dangerously, with “security” as a marketing adjective that doesn’t really stand up to adversaries.
3. “Ops teams running operational tech just need to learn from IT.”
IT vendors and staff have historically not been welcome in operational discussions and for good reason. Operational constraints are far different than IT environments and consequences far higher, often with radically different timescales. For better or worse, many technologies needed on the OT side have been used for years on the IT side. However, until IT vendors and staff learn to speak and appreciate OT language and culture, OT teams won’t have any confidence that the technologies have been selected and adapted appropriately for their environments. IT security has far too many tools in the tool chest for OT ops teams to manage. Picking the right tools and adapting them appropriately requires collaboration between IT and OT.
2. “Our systems are so obscure nobody can figure them out enough to do damage.”
Steel mills, water treatment plants, power grids, factories, power generation plants and countless other systems have been hacked as a result of that naïve belief.
1. “I can do it alone.”
History and recent headlines are littered with the shame of companies who attempted to manage security single-handedly. No one company — and no single vendor — can beat all the attackers by themselves. Defenders need to stick together. Hire professionals and be sure they have good partners in hardware, software and cloud computing, as well as what is relevant for your particular vertical.
Google Nest’s shut-off of the Revolv product it acquired in 2014 is just one example of what may become a common occurrence. E-consumers expect everything to be warrantied forever and always connected. When those devices are on subscriptions or contracts, as they often are, they can be kept functioning, supported and moving forward. But, as demonstrated by the Revolv bombshell — and that’s not too strong a word, if you are, I mean were, a Revolv customer — that’s not always the case.
Even in the industrial world, embedded systems often fail to be updated, and oftentimes become old and less possible to secure. This can be seen in old SCADA systems that manage large multisite industrial complexes, providing data collection and aggregation. SCADA systems were most often air gapped — not connected to the outside world — making security far easier and more realistic. These systems in many ways are the predecessor to IoT. By connecting these systems to the Internet — creating IoT — the security issues are endless, and now they must be updated, patched and protected.
When we as consumers buy something — whether it be a computer, phone or other device — we expect it to run forever. Or do we? Tech-savvy consumers recognize that most devices have a finite lifespan before the hardware or software is outdated or inoperable. We’ve all seen the phone that gets too slow and the computer that crashes too often. These are not just due to the age, but the complexity of the software as you upgrade to new versions with additional features that require more horsepower and elbow room to function well.
This changes considerably with IoT, as the device has a very focused function. In the case of the Revolv hub, it was a simple gateway controller, meaning it helped with home automation tasks. In order to function, there were costs paid by Google (Nest) for securing and upgrading the gateway, hosting the Web services, and maintaining the application and infrastructure. When this becomes less strategic, companies will force an update to a newer platform and deprecate old services or devices. For example, if you have an iPhone 4, you cannot run the newest operating systems from Apple; you are forever stuck on iOS 7.1.2 with its security issues. Remember, the iPhone 4 came out in 2010, less than six years ago. Similarly, if you have a PC with less than 1 GB of memory, you cannot run Windows 10. These are both good reminders that older devices and hardware are often not supported forever. The same is likely true for IoT, especially consumer-facing devices with shorter lifespans.
End of life occurs based on the strategy of the company, the cost and complexity of supporting the technology, and the need to eliminate technical debt in order to innovate. With vehicles, for example, you pay a subscription to get the newest software, navigational maps and other services. Alternately, you can decide not to subscribe, and your car still functions with less accurate, less up-to-date software and systems. As these cars become increasingly connected, we will see the same types of issues we saw when SCADA systems became connected — namely, increased attacks and issues — which may impact safety. In order to mitigate this, manufacturers of vehicles may either offer subscription services to keep the vehicle updated and secured, or risk having upset buyers who own vehicles that are unsafe or less functional.
In enterprise software and hardware, you see various strategies employed by vendors. Many will offer specialized support for end-of-life products, where the user of the software or device pays an incredibly high yearly service contract to keep it functional. Otherwise, you see end of life occurring regularly. Most vendors publish end-of-life timelines, allowing customers to assess the risk.
This is the IoT buyer’s dilemma we find ourselves in today. The questions remain: What model will dominate in the future? Will it be subscription-based? Will users expect software and functionality for life? Which model will present the most compelling business case and ultimately win out? We need a connected crystal ball to see the answer!
Connected audio devices include a variety of consumer electronics connected to the Internet for the primary or secondary purpose of streaming Internet-delivered audio content such as music, Internet radio or podcasts. These devices include smartphones, tablets, PCs, digital media servers and players, audio-visual receivers (A/V), networked audio players, home theater systems, soundbars, multi-room audio systems, shelf audio systems, streaming media devices, wireless speakers, speaker docks and Internet/clock radios.
Connectivity enables some devices — or device apps — to act as controllers for content delivery to other connected audio devices, such as a Philips soundbar that delivers audio content to other wireless speakers in the home. Connected audio devices may also enable additional wireless interactions and possible integration and control over other smart home devices through a smart home platform. The connectivity of the Internet coupled with audio as a medium of interaction — not merely playback — is opening up a range of new possibilities for voice-controlled audio devices.
The economic landscape is marked by the disruptive effects of smartphones, digital music players and streaming media devices, which have increasingly challenged sales of traditional, non-connected home audio devices. Sony predicts that as sales for home video (Blu-ray, digital video recorders) and traditional audio components (stereo systems, amplifiers, Walkman devices) decline, the growing audio category — comprised of soundbars, wireless speakers and headphones — will account for 50% of its sales of products for the home video and sound market by 2017. These products provide compelling value propositions for consumers who increasingly want to enjoy the connected, wireless experience of mobile devices in the home.
Connected home audio device sales have benefitted from the following:
- High-penetration of home Wi-Fi networks
- Pervasiveness of Bluetooth- and Wi-Fi-enabled devices
- Convenience and portability of wireless networking
- Rapid expansion of streaming content services
Connected audio product innovations are supplanting traditional audio products, such as the once-popular home theater in a box, the traditional rack stereo system of components, wired speakers and speaker docks.
The technological landscape is marked by recent introductions of several new home wireless streaming technologies, including Google Cast, DTS Play-Fi and Qualcomm AllPlay. As of mid-year 2015, two-thirds of U.S. broadband households used a streaming audio service; 40% used exclusively free streaming audio services, while 26% subscribed to a paid streaming audio service. These new technologies offer device makers greater flexibility for whole-home audio, higher-resolution wireless audio than previously available and broader compatibility with other devices. The variety of in-home wireless technologies creates a highly competitive environment as device makers align their products with one or more technologies and test what combinations will prove most appealing to consumers. Major streaming audio device makers such as Sonos are also seeking more integrations with home automation platforms. Where home control developers such as Control4 and Crestron previously had to reverse engineer Sonos integrations for limited functionality, Sonos began opening up its API to a limited number of developers in late 2015. At CES 2016, Sonos announced a new integration with the Insteon home control platform.
Finally, the design landscape for connected audio devices has been largely influenced by the “Apple-ification” and “app-ification” of the consumer electronics space. Concern for product design in the audio space is not particularly new; however, Apple changed the game on making product design an integral part of a brand — and a key differentiator among competitors. In the current environment, breakout brands in the connected audio space are those that combine advanced technology with fresh designs. The expanded visibility of connected audio devices throughout the home also creates an opportunity for product design to harmonize with home décor preferences as an integral part of the product offering.
Likewise, the increasing integration of mobile apps with connected audio devices extends product design into the app space, where the user experience of the app becomes just as important as, or perhaps more important than, the design of the device itself, by virtue of the app becoming the principal point of interaction with the device.
Wave interference is the technical term for what happens when two waves meet: the resulting displacement, or superposition, is the combined net effect of the individual waves. IoT data and analytics in many ways reflects the superposition of IoT and big data.
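The physics behind the metaphor can be shown in a few lines of Python; the wave functions here are purely illustrative:

```python
import math

def superpose(wave_a, wave_b, t):
    """Net displacement where two waves meet is the sum of each
    wave's displacement at that instant."""
    return wave_a(t) + wave_b(t)

# Two in-phase sine waves: constructive interference, so the
# combined peak is larger than either wave alone.
w1 = lambda t: math.sin(t)
w2 = lambda t: math.sin(t)

peak = superpose(w1, w2, math.pi / 2)  # each wave peaks near 1.0, sum near 2.0
```

When the waves reinforce each other, the combined displacement exceeds what either wave produces on its own, which is the sense in which IoT and big data together amount to more than their parts.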
IoT is a continuously evolving concept, and some definitions include IoT data and analytics as part of it. Fundamentally, however, the Internet of Things is the network of physical objects, or things, that digitize information about their environment and exchange that data across the existing Internet infrastructure. Big data, too, has attracted various definitions; one of the more commonly applied is McKinsey’s, which describes big data as “datasets whose size is beyond the ability of typical database software tools to capture, store, manage, and analyse.”1
These two waves, IoT and big data, have started to meet, and their superposition has had a significant multiplier effect. The number of connected devices continues to grow and accelerate alongside the demand for ever more data, and the value of that data is moving in a positive direction as real-time data sources and data aggregation deliver more and more insights.
The combination of IoT and big data has placed its own demands on enabling technologies. Fast data has generated new requirements for data ingestion and in-stream processing, while big data has placed new requirements on data storage and on how schemas and queries are managed. Let’s examine these in slightly more detail.
Fast data becomes an important game changer
Big data is an important factor in IoT data and analytics; however, the more fundamental change in data management and analytics has been driven by the speed at which data is now processed and fed back into action in near real-time. Where traditional batch processing delivered historical analytics and insight over periods of days and weeks, fast data is about real-time ingestion and in-stream processing, with actionable feedback in seconds or milliseconds. Examples of database providers able to meet these requirements are Exasol, SAP HANA, SQream and VoltDB. Fast data does away with the fairly traditional extract-transform-load (ETL) approach and has pushed the analysis of data from a back-end business intelligence activity to a critical front-end “application plus” feature, where “application plus” refers to the expected outcomes of such applications as predictive maintenance or prescriptive decisions for medication routines, both involving a degree of machine learning and advanced analytics.
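As a minimal sketch of the in-stream pattern (the class, thresholds and readings below are illustrative, not any particular vendor’s API), each reading is analyzed the moment it arrives, so actionable feedback is produced within the stream rather than by a later batch job:

```python
from collections import deque

class StreamingWindow:
    """Keep a rolling window of recent readings and flag anomalies
    as each new value arrives, instead of waiting for a batch job."""

    def __init__(self, size=5, threshold=10.0):
        self.window = deque(maxlen=size)
        self.threshold = threshold

    def ingest(self, value):
        self.window.append(value)
        avg = sum(self.window) / len(self.window)
        # Actionable feedback in-stream: alert if the reading deviates
        # sharply from the recent rolling average.
        return "ALERT" if abs(value - avg) > self.threshold else "OK"

monitor = StreamingWindow(size=5, threshold=10.0)
statuses = [monitor.ingest(r) for r in [20.1, 20.3, 19.8, 20.0, 55.0]]
# The final, anomalous reading is flagged immediately on ingestion.
```

The key contrast with batch ETL is that the decision is made per event, so the feedback latency is bounded by the cost of one update rather than by a job schedule.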
Big data is the challenge on the other side of the coin
Big data is not a new phenomenon, but it has become an increasing challenge for many enterprises, and enabling technologies such as Hadoop are what have opened up this new opportunity space. With Hadoop, or more specifically HDFS for distributed file storage and MapReduce for distributed processing, enterprises were finally able to scale their data storage in and out in a more flexible and cost-efficient manner than the traditional “more data, one more server” approach. Examples of database providers here include Cloudera, Hortonworks and MapR.
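The MapReduce programming model itself can be sketched in-process in Python; this illustrates the map, shuffle and reduce phases over made-up sensor logs, not the Hadoop API:

```python
from itertools import groupby
from operator import itemgetter

# Map phase: emit (device_id, reading) pairs from raw log lines.
def map_phase(lines):
    for line in lines:
        device, value = line.split(",")
        yield device, float(value)

# Shuffle + reduce phase: sort/group by key, then aggregate each group.
def reduce_phase(pairs):
    pairs = sorted(pairs, key=itemgetter(0))  # the "shuffle" step
    return {
        device: max(v for _, v in group)      # per-device maximum
        for device, group in groupby(pairs, key=itemgetter(0))
    }

logs = ["sensor-a,21.5", "sensor-b,19.0", "sensor-a,23.1"]
result = reduce_phase(map_phase(logs))
# result == {"sensor-a": 23.1, "sensor-b": 19.0}
```

Hadoop’s contribution is running exactly this pattern across many machines, with HDFS holding the input splits and the framework handling the shuffle between mappers and reducers.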
Big data has also been about the variety of data, not just its volume. Here, NoSQL databases2 and new hybrid databases have pushed boundaries, creating schemas on the fly or on read and dispensing with the more cumbersome and limiting fixed-schema approach of traditional relational databases. As the number of connected devices continues to grow, the richness and variety of data sources will continue to expand; in addition to highly structured data, enterprises will need to work with semi-structured and completely unstructured data to gain additional value from data aggregation.
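A short Python sketch of the schema-on-read idea (the document fields below are hypothetical): documents of varying shape are accepted as-is on write, and a schema is imposed only at query time.

```python
import json

# Documents arrive with different shapes; nothing is rejected on write.
store = [
    '{"device": "thermostat-1", "temp_c": 21.5}',
    '{"device": "lock-7", "state": "locked", "battery": 0.82}',
    '{"device": "thermostat-2", "temp_c": 19.0, "humidity": 0.4}',
]

# Schema on read: this "temperature query" decides what a record must
# look like only when queried, skipping documents that lack the field.
def read_temperatures(raw_docs):
    for raw in raw_docs:
        doc = json.loads(raw)
        if "temp_c" in doc:
            yield doc["device"], doc["temp_c"]

temps = list(read_temperatures(store))
# temps == [("thermostat-1", 21.5), ("thermostat-2", 19.0)]
```

New device types can therefore start reporting new fields without a schema migration; only the queries that care about those fields need to change.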
The value engine in IoT data and analytics
The creation of value comes from all the components of an end-to-end IoT application: devices, connectivity and applications each contribute, as do the data and the analytics. What is interesting to consider, as illustrated by the two-wave model at the start of the article, is the net effect of combining IoT and big data: a superposition, or multiplier effect, that is greater than the sum of its parts.
Data is a reusable commodity. Where value may initially be unlocked from a single data point in real-time, the aggregation of single data points, both real-time and historical, will also yield additional, previously unidentified insights.
1McKinsey Global Institute, “Big data: The next frontier for innovation, competition, and productivity,” May 2011
2 For some more information about NoSQL databases, read the Machina Research Research Note, “Why NoSQL are needed for the Internet of Things,” April 2014
The Internet of Things has become a catchphrase for everything from self-driving cars to tiny sensors we swallow like pills. Regardless of the hype, the next wave of innovation will clearly leverage connectivity and the Internet deeply and in ways never before possible. From a practical standpoint, this means millions of new “Internet of Health” devices will be added to health systems over the next five years. According to a report from MarketResearch.com, the healthcare IoT market segment is poised to hit $117 billion by 2020.
With this growth comes greater complexity. Regardless of the specific new “Thing,” what are the key attributes to keep in mind in order to grow new solutions safely into large-scale deployments? First is security, which cannot be tacked on later. New ways of securing tiny devices are available, but the products must be carefully designed to be secure: it is more than just encryption; it is how trust and privacy are maintained. Trust means knowing the identity behind the data so you can act on it with confidence.
Second is manageability. Operating costs can dominate the cost of the Internet of Health if systems are not designed, like the best enterprise tools, for easy commissioning, updating and operation. With IoT in particular, you have to be smart about managing large numbers of devices, possibly distributed across broad geographies.
And finally, just like the best cloud-based technology, IoT systems should interoperate with your existing technology through clear, standard APIs. Information should flow easily across a well-designed IoT system and into your favorite connected applications, from report generators to machine learning data analysis tools.
Another dimension of IoT is its effect on existing devices and practices, not just on new products. By using new sensors, for example, we can now measure and learn more deeply and more completely than was possible before, even with existing equipment. Ease of deployment is key.
By using software-driven techniques, we can synthesize results from multiple sources (sometimes called “sensor fusion”) to yield new insights faster than ever. The results range from better operations, such as using predictive analytics to avoid equipment failure, to new ways of improving the patient experience: sensing traffic flow, finding the right equipment faster, or lowering the cost of care through better environmental management.
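One common sensor-fusion technique, shown here as a sketch with made-up readings, is inverse-variance weighting: measurements of the same quantity from independent sensors are combined so that the less noisy sensor counts for more, and the fused estimate is more reliable than either source alone.

```python
def fuse(readings):
    """Inverse-variance weighted fusion of (value, variance) pairs
    from independent sensors measuring the same quantity."""
    weights = [1.0 / var for _, var in readings]
    value = sum(w * v for w, (v, _) in zip(weights, readings)) / sum(weights)
    variance = 1.0 / sum(weights)  # fused variance is below each input's
    return value, variance

# Two sensors measuring the same room temperature with different noise:
# the precise sensor (variance 0.5) dominates the noisy one (variance 2.0).
fused_value, fused_var = fuse([(21.0, 0.5), (22.0, 2.0)])
```

Here the fused estimate lands closer to the precise sensor’s reading, and its variance is lower than that of either input, which is exactly why fusing multiple sources yields insight faster than waiting on any single sensor to improve.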
Cars are now some of the largest and most expensive mobile computers on the planet, each manufactured with several hundred million lines of code integrated into its systems. The code within these machines can do everything from connecting our smartphones to the infotainment system to analyzing the proximity of other cars on the road. Yet while IoT software simplifies our lives, from recording our steps and heart rate in the cloud to remotely adjusting our thermostats, developing IoT software is far more complex than traditional software development.
Imagine the following scenario: Your car recognizes you as you enter, and after consulting your smartphone calendar, it knows you’re off to your weekly poker night with your friends downtown. It checks real-time traffic information and recommends the fastest route to your destination. After you lose at poker (again), you head home. Your smart refrigerator has already notified your smartphone that you need to pick up milk and other groceries on the way, and your car routes you to your favorite local store. The next day, you’re driving to a meeting when the car notifies you that the oil pump is about to fail and suggests that you visit a dealer along the way, one that it knows is open at this specific time. Before making that recommendation, the car has already checked that the dealer has the part and scheduled the appointment for you. After a quick service, you’re back on the road.
This scenario is no longer science fiction, and the technology is already here.
What happens under the hood (literally)
To support these advancements, the car manufacturer has three software teams:
- A team focusing on developing the software that’s embedded in the car. This software is responsible for the interaction with the driver, providing health data, phone connectivity, etc.
- A team focusing on big data. This is the software that aggregates and analyzes data in real-time from the millions of cars on the road and all third-party connected services. This component is the one that receives the S.O.S. signal from your car about the impending oil pump failure, finds the dealer and directs the car to the shop.
- A team focusing on building the mobile app for seamless integration with the car’s infotainment system.
Requirements for managing IoT software complexity
Coordinating these three software teams is a challenge without the proper DevOps platform. Any software upgrade must be coordinated in such a way that it will not break the functionality between the different software components installed on different devices. Such a complex software design, with such high stakes — from missed appointments to driver safety — requires shared visibility, shared reporting and an integrated dashboard for centrally managing the software delivery. This allows the project team leaders to see the progress of any change requests or software updates on three different software tracks and ensures that each software release is going smoothly, with no quality issues or possible integration failures that could disrupt the service.
The three software teams will need a single integrated DevOps platform that can handle three different deployment-targets, each with its own specific deployment process, stack, etc.:
- The embedded software in the car itself, where software upgrades are usually deployed over-the-air.
- The data center for the big data storage and computation, where software updates are done over the Internet.
- The mobile app, which is upgraded via the app store mechanism.
In addition to the standard continuous delivery and DevOps platform requirements, a multi-target solution for IoT companies must meet several further requirements in order to accelerate software delivery securely and reliably while improving quality of service:
- The ability to handle different deployment paths (e.g., embedded device via over-the-air update, data center via Internet and mobile app via app store) from a single integrated solution.
- The ability to enable teams to own the pieces of orchestration pertaining to their applications while enforcing a separation of duties.
- The ability to orchestrate the delivery pipelines for each team and manage the dependencies between these pipelines.
- The ability to provide an artifact repository to store and trace the life of each artifact.
- The ability to provide centralized dashboards and processes to facilitate the monitoring and management of delivery pipelines and releases.
- The ability to enable zero downtime upgrades and automatic rollbacks for full-stack or partial IoT service updates.
- The ability to provide complete traceability with automated compliance reports that are available on-demand.
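The pipeline-dependency requirement above can be sketched with Python’s standard-library graphlib; the pipeline names and the dependency graph are hypothetical, chosen to mirror the three teams described earlier:

```python
from graphlib import TopologicalSorter

# Hypothetical dependency graph: the embedded over-the-air update and
# the mobile-app release both depend on the data-center backend being
# deployed first, so the APIs they call are already in place.
pipelines = {
    "backend": set(),
    "embedded-ota": {"backend"},
    "mobile-app": {"backend"},
}

# A release orchestrator can derive a safe deployment order from the
# declared dependencies instead of relying on manual coordination.
order = list(TopologicalSorter(pipelines).static_order())
```

Declaring dependencies this way is what lets an orchestration layer block a mobile-app or firmware release until the backend it depends on has shipped, rather than leaving that coordination to meeting schedules.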
Through a single, integrated DevOps platform, the project team leaders have one dashboard to track the progress of each software team, and the management of artifacts from the three project teams can be centralized to accelerate deployment and eliminate mistakes.
Furthermore, with a smart-deploy feature, the integrated DevOps platform controls the upgrade processes across the three different environments, simplifying dependency detection and reducing the risk of undetected bugs.
Getting IoT right
IoT has ushered in a plethora of new and useful services that enrich our lives, simplify them or save us time and money. To provide these kinds of connected, and complex, services, software companies need at least three different software teams, which must deliver the different, integrated service components across different platforms and devices. In addition, software upgrades must be coordinated across all environments to ensure service continuity. Only an integrated DevOps platform can provide the traceability, visibility, shared control and rapid reaction these complex software development, test and deployment processes require.