The internet of things continues to prove its relevance in organizations’ digital transformation strategies. Companies big and small invested in it in 2016 and are realizing opportunities to extend IoT out into the field, enhancing customer relationships and driving business growth. Throughout 2016, technology providers of all sizes came to realize that alignment is vital to relieving fragmentation, and that they don’t need to “own” every piece of an IoT solution to add significant value and succeed.
2017 will continue to bring more complexity to the market, but companies will start to find their uniqueness, allowing us to focus on the real problems at hand together.
Below are five predictions for what 2017 holds for IoT.
1. Measuring business impact and security fears will be the greatest inhibitors of IoT projects and solutions
There’s no doubt that IoT has huge potential for business impact, but end users need to get comfortable with the anticipated ROI in order to move beyond maker projects and proofs-of-concept to real investment. Without understanding business value and potential ROI, IoT adoption will stall. There’s a bit of a catch-22 in these early days because companies that have successfully deployed an IoT solution typically don’t want to share metrics of their success with competitors. We’re seeing progress on this front with more and more companies willing to document measurable benefits in case studies, but it will take time for this to become the norm.
Once customers are confident enough in the business value and how to calculate it, the second biggest inhibitor becomes security fears. 2016 brought the largest DDoS attack ever delivered by a botnet made up of IoT devices, which shut down 1,600 websites in the U.S. A month later, a major attack on Dyn led to a massive internet outage across the U.S. and parts of Western Europe. According to analysts and industry experts, this is just the beginning. Most of the hacks to date have been conducted through consumer devices that had limited to no security measures applied — products that promoted ease of use and instant gratification over security. That said, as the business value and associated attack surface of IoT grow, so will the interest of attackers.
Throughout 2017, hackers will continue to exploit IoT device vulnerabilities to launch broad-scale attacks. To help fight this battle, the fractured IoT market needs to come together to develop security measures that render devices less vulnerable by default and promote best practices in deployment while also recognizing the importance of retaining usability. After all, for IoT solutions to be successful the business value needs to be far greater than the complexities involved with deploying and maintaining them.
2. Consolidation of IoT platforms through broad-scale collaboration
It’s no secret that the IoT market is fragmented. Currently, there are over 400 platforms, which is confusing to customers and slows down the process of research and development. The market needs to consolidate so that it’s easier for customers to leverage preferred technology and they don’t feel like the rug might get pulled out from under them by a failed platform that built its entire data integration foundation in a silo. A distinct trend in 2016 was that companies moved past the notion that they must “own” everything to be successful. In fact, many have realized that it’s simply not possible to cover all facets of an IoT solution well. In order for the industry to scale, we need to come together on a more common foundation so we can focus our differentiation where it matters, in areas such as analytics, advanced security, vertical expertise and services.
A key factor that will accelerate interoperability efforts is open source collaboration. Open source platforms are increasingly being used in industrial, smart cities and utility industry projects. I think historically conservative industries will continue to get more comfortable with open source in 2017. We will see a leading open source platform project emerge that provides a center of gravity for data integration. This will help mitigate the fragmentation in connectivity standards. As more open source tools are developed and mature, they’ll become a vital part of the research and development process, too.
3. Increased focus on use cases within verticals
A lot of the IoT hype comes from the consumer segments — connected baby monitors, refrigerators and toilets. We have been focused on the commercial and industrial sectors, and the broader market has started to realize that’s where the real ROI is. In 2017, the market will have deep conversations on use case development within industrial verticals. By creating and sharing blueprints and solution architectures for these use cases, we’ll learn from each other and make progress more quickly. The consolidation I mention above will get us closer to the utopia of an open and flexible horizontal IoT platform to which specialized tools and vertical domain knowledge can be applied to address targeted use cases. Meanwhile, proprietary platform and service providers focused on highly specific use cases will see traction, and jacks-of-all-trades will be masters of none.
4. Vendors will focus on certifications to help advance IoT growth
Becoming a certified expert in one of the countless IoT platforms doesn’t mean a whole lot. However, as the industry works towards a de facto standard for data integration (and leaders that apply this foundation with their own differentiation emerge), the stage will be set for industry-specific certifications to take hold in an effort to keep the bar high and ensure that certifications hold weight. These certifications will help prepare top talent with the required IoT skill sets and will cover industry-standard tools, specific platforms that apply them and domain expertise in security, analytics or specific industries and use cases.
As a result, large companies and innovative start-ups will begin to invest heavily in low- or no-cost training certifications. This trend is already being seen in IBM’s Watson IoT Academy and PTC University’s ThingWorx Certification — and enterprises should be ready to keep track of what their IoT vendors are doing in the area of certifications.
5. Artificial intelligence will increasingly be used to mine the data coming from IoT devices
As IoT gets distributed across the edge and cloud, the insights will be boosted by the use of AI deployed via containers. AI has already been making a mark through aiding real-time decision-making. In AI’s future, developing more natural language capabilities will help to further realize the potential of a connected IoT world, as natural language-based data descriptions will provide a universal way to understand data among various types of devices. This approach will not only break down silos, but also allow people to communicate with IoT directly through voice or text.
This will come with challenges, especially culturally in the workforce. In countries where traditional industries dominate, such as manufacturing in Brazil, our Future Workforce Study found that 41% of workers said they worried a robot might take their job. Some companies are using the opportunity to retrain employees and teach them new skill sets. For example, data scientists will start training machines to go beyond reviewing large data pools for insights and answers. This will help machines to develop the knowledge to read between layers of data. AI will be able to interpret data differently, break it down more succinctly and identify and share nuances otherwise overlooked.
Do you agree with these? What other predictions do you have for IoT in the New Year?
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.
As we prepare to say goodbye to another calendar year, I am often asked about my thoughts on the previous year and predictions for the upcoming one. While I don’t fancy myself a fortune teller, I am always happy to have the discussion. Taking a look back at 2016, I thought I would shed some light on the really great milestones we as an industry had this year. So IoT — this was your life in 2016:
Standards are becoming more important than ever
2016 was a huge year for IoT, but many industry folks thought that we’d see much more standardization than we did. As we close out this year, however, we are leaving it with a sense that standards are going to become more important than many of us imagined. If you think about it, the real promise of IoT is that everything can and will be connected. For example, your smart door lock, once engaged, can tell the lights to shut off and the thermostat to set itself to the preprogrammed temperature. With this in mind, having a bevy of standalone products that won’t work across platforms is going to make it hard for IoT to realize its true potential. We saw encouraging movement with Thread, Zigbee and other standard organizations in 2016, and I expect to see even more in 2017.
A clearer IoT landscape is beginning to form
Entering 2016, most IoT vendors were lumped into one broad category, but one thing I can say about this year is that we did see a bit more clarity in the landscape with some specialties starting to emerge. For example, GE is staking its claim as a leader in industrial IoT, AWS and Microsoft Azure are creating IoT tool kits that can be used to develop custom solutions, and with Xively, LogMeIn is focused on IoT connected product management. While I don’t believe the market is as clear as it needs to be, I have hope that 2017 will continue the momentum.
IoT moved from exploration to execution
A year ago we dubbed 2016 the year IoT would go from exploration to execution — and it largely did, moving beyond the hype to become a true reality. Major companies are doubling down on their IoT efforts in hopes of taking advantage of everything IoT promises. The 2016 Consumer Electronics Show was a really good example of that coming true. Thousands of product companies brought new connected products to market, proving that many companies are fulfilling their initial IoT plans. At Xively, we saw many of our customers’ products hit store shelves this year and we are seeing many more customers looking for guidance on how to get their connected businesses up and running.
In 2016, IoT security got everyone’s attention
It’s no secret that security breaches this year dominated many IoT conversations. While the most recent security stories focused on older IoT devices like routers and DVRs, it was definitely enough to get the whole industry talking. And though I don’t believe these conversations were enough to shake consumer confidence or affect business decisions to offer connected products, it certainly pushed security up to the top of the priority list (as it should be).
IoT data identified as the real value driver
When many start out on an IoT journey, the focus is simply about getting a product connected. 2016 was the first year that the value of IoT data really became the star of the show. Being able to collect, manage and make data actionable became a core requirement for anyone building a connected business. Knowing who the customers are and how they are using the products is paramount to creating positive customer experiences that will make customers want to return, keep revenue streams up and keep the connected product marketplace from being a bunch of short-lived gadgets. While that statement has always been true, this year was the year it truly became common knowledge.
As I look back on this year, I am very proud and encouraged by the milestones that occurred. 2016 was a year of growth and great maturity for IoT. It was the year it really started making a play for the mainstream. And I, for one, can’t wait to see what 2017 will bring.
In October 2016, my column discussed the impact of Moore’s Law and Metcalfe’s Law on the internet of things, basically stating that IoT is in your future because the Law says so. For decades now, Moore’s Law has accurately predicted the steadily rising roadmap of new and more powerful devices, which has essentially been borne out by all the new sensors, devices and things that power IoT. Metcalfe’s Law predicted that the value of a network can be linked to the number of connections within that network — it directly relates to the value of context and data generated by things, devices, data and apps within an IoT-powered solution. However, are there negatives associated with the impact of Moore’s Law and Metcalfe’s Law on IoT, and do those negatives outweigh the positives? First, the positives: more connections represent the potential to collect new types of data, correlate data sets that were difficult to compare before and optimize user and device experiences with a myriad of contextual information.
Conversely, do these new connections, devices and things also represent a broader and more attractive attack surface that is harder to contain and secure? Unfortunately, we have already seen the potential for this broader attack surface unleashed in a barrage of unprecedented distributed denial-of-service (DDoS) attacks where the devices used in the attack were categorized as compromised IoT devices. These attacks served notice that unprotected IoT devices can cause major disruptions to the underlying infrastructure of the internet, and have many questioning if the security risks for the internet of things offset the benefits.
There is no question in my mind that a larger number of smaller, portable and more powerful devices combined with making it easier for things, apps, data and services to communicate with each other increases the security risk potential. This trend is similar in many ways to how the BYOD explosion increased the number and type of security risks an organization faced. With BYOD, more devices connecting from beyond the traditional network perimeter required an evolving approach to security, and with the internet of things, the security approach must also evolve. But how? This evolving IoT security paradigm is composed of many elements, some of which we have not even thought of, and the approach will need to vary depending on the IoT use case. Protecting IoT devices in use on a corporate network will be a very different scenario than protecting your public website. To capture some general elements of what an IoT security strategy should evolve to look like, I sat down with one of my colleagues — Jeffrey Sanderson, senior director, delivery networks at Citrix — and asked him his opinion on how IoT security will evolve in the enterprise. Here’s a look at our Q&A:
Securing IoT: The evolving security perimeter
Chris: Jeff, as the recent DDoS attacks leveraging compromised IoT devices have shown, the internet of things has the potential to increase the attack surface for businesses. From this perspective, how should an organization evolve or change its security strategy with IoT in mind?
Jeff: Firstly, defining an effective security perimeter greatly minimizes the risks of a malicious attack. We all should be very clear on that. If we consider the evolution of web-based services, where originally secure SSL sessions would natively terminate directly on the web applications — meaning the applications themselves would handle authentication, authorization and session decryption — this effectively shunted that perimeter directly into the application domain. We saw the role of traditional application delivery controllers (ADC) and firewalls adapt to bring greater efficiencies and enhanced protection. Effectively, it provided a security perimeter one step away from the “sensitive” application layers and enabled the ADC/firewall to become the ultimate arbiter for what sessions make it through to the application. This configuration allowed for finer granularity in application access as well as detecting and preventing malicious attacks.
Chris: This makes sense. I remember when organizations really started to open their network for remote workers this idea of shrinking your perimeter to the application itself really started to take off. Basically, a “trust no one, secure your communications and de-perimeterize your network” approach. With IoT, how does this approach change?
Jeff: The same evolution is in play. The exception here is that IoT has introduced some new components to contend with, and these new components offer new roles to play in your overall security strategy. One example that I want to focus on is the IoT message broker. The IoT message broker is important to IoT because much of what makes up an IoT solution is the communication and chatter that exists between all of the sensors, devices, things and applications, and this broker is what helps to manage the sending and receiving of messages to all of these different IoT solution components. Generally, the IoT message broker sits in a layer in front of the applications/cloud services and, because it sits here in front of the applications, many organizations are choosing to also use these brokers to provide the security perimeter (terminating SSL, handling authentication, etc.). Let’s be perfectly clear here though — these broker solutions were not designed to be a security perimeter. Pushing such capabilities onto them is simply a “convenience of architecture” decision in the mad rush to get early IoT services to market. An IoT message broker has enough on its hands doing its day job, and to suggest it is the best place to implement the security perimeter is not practical.
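The broker’s core job Jeff describes — routing messages by topic between sensors, devices and applications — can be sketched with a minimal in-process pub/sub model. This is purely illustrative; real deployments use dedicated brokers (MQTT servers, for instance), and the topic names here are invented:

```python
from collections import defaultdict

class MessageBroker:
    """Minimal topic-based pub/sub broker (illustrative sketch only)."""

    def __init__(self):
        # topic -> list of subscriber callbacks
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        # Deliver the payload to every subscriber of this topic.
        for callback in self.subscribers[topic]:
            callback(payload)

broker = MessageBroker()
received = []
broker.subscribe("factory/line1/temperature", received.append)
broker.publish("factory/line1/temperature", {"celsius": 71.5})
print(received)  # [{'celsius': 71.5}]
```

Note that nothing in this routing logic authenticates publishers or decrypts sessions — which is exactly Jeff’s point about brokers not being designed as a security perimeter.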
Chris: Are you suggesting something entirely new then to provide this layer of perimeter security in an IoT world?
Jeff: Not necessarily. I believe that the role of the application delivery controller will evolve to provide a security layer that is capable of managing and securing IoT event traffic. This is important because IoT traffic is not like conventional application traffic (web/HTTP, for instance). First, we are likely talking a lot more sessions — a whole lot more. Second, IoT sessions are not like traditional web/app sessions. For example, when downloading a webpage there are multiple TCP sessions opened to render that webpage content (text, images, videos, etc.); these last only as long as they are required, opening and closing very quickly. Not so with IoT sessions. IoT sessions can last much longer — hours, days, weeks. The nature of IoT transactions is also quite different. Rather than the traditional web application model where a client or device initiates the dialogue (let’s call this “client-pull” mode), IoT is very much both “push” and “pull” in nature. So it is not just things or devices “pulling” content when they need it; it may be the applications themselves “pushing” content to things and devices. All this means you can’t simply repurpose an application delivery controller into something that can control and manage IoT events. Enabling your existing ADC with a smattering of new IoT protocols is also not enough. Optimizing the underlying platform architecture to handle IoT workloads is a must, and then optimizing how the platform handles security functions — like authenticating a much wider range of devices and things so they can exchange information with your applications — is also a must.
Securing IoT: Secure the devices? Or secure the applications?
Chris: But what about securing the IoT devices themselves? Should organizations focus more attention on securing the applications and traffic? Or more attention on securing the devices?
Jeff: It is not either/or. Both aspects are incredibly important, but both present different challenges. Securing the thing can be as much about physical security as it is network security. What is perfectly clear, however, is that the thing must be authenticated to connect to the IoT service. As stated above, authenticating devices is the critical first step in gating the thing’s ability to interact with the service. Given the spectrum of types of things, this changes the somewhat more controlled scenarios of today’s device connectivity into a much more complex environment and will require a change in thinking for many current approaches.
Chris: So, it sounds like the foundation is there for what you are suggesting, but handling the different types of devices and protocols will require an evolution in current approaches and solutions. For this evolution, how important will analytics be as part of an IoT security strategy?
Jeff: While the security perimeter itself should prevent any malicious attempts to “break in,” if weaknesses are exposed then machine learning should kick in to perform real-time corrective measures to eliminate this exposure. This may be as delicate as changing configurations on-the-fly or as brutal as temporarily decommissioning the service. But let’s be perfectly clear here, initially there will not be a one-size-fits-all solution. Going back to the theme of device versus traffic security we discussed above, there will be limits on the ability to protect the things from direct attacks from the internet. What can be done however is the correlation of analytics from the devices with the analytics from the event delivery control concept we mentioned above. These correlations would allow an analytics/machine learning function to help establish whether there is anomalous behavior at the thing itself, and then prompt corrective measures against compromised things.
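The anomaly detection Jeff describes — flagging a thing whose behavior deviates from its baseline — can be sketched with a simple z-score test over recent telemetry. Production systems would use richer models; the metric (messages per minute) and the threshold below are illustrative assumptions:

```python
import statistics

def is_anomalous(history, reading, threshold=3.0):
    """Flag a reading more than `threshold` standard deviations
    from the device's historical mean (simple z-score test)."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return reading != mean
    return abs(reading - mean) / stdev > threshold

# Hypothetical message rate (msgs/min) from one sensor over the last hour:
baseline = [12, 11, 13, 12, 14, 11, 13, 12]
print(is_anomalous(baseline, 12))   # False: normal chatter
print(is_anomalous(baseline, 900))  # True: possible compromise, e.g., a DDoS bot
```

Correlating a flag like this with perimeter-side analytics is what would then prompt the corrective measures Jeff mentions, such as reconfiguring or decommissioning the compromised thing.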
Securing IoT: Is enterprise IoT at risk to a DDoS attack?
Chris: OK, let’s bring this back to where we started. The recent DDoS attacks in the news were devices facing the public internet attacking public sites. How much does that type of attack extend to enterprise IoT where many of the devices will be on private networks?
Jeff: The recent attacks should be a wakeup call for anyone considering the deployment of an IoT service. DDoS attacks will remain a chief tool of the bad guys. The only way to lessen the impact potential of this happening to your service is to take the concept of establishing the type of security perimeter discussed above seriously, and I mean very seriously. At the end of the day, nothing is infallible. You can, however, significantly reduce the risk of your service being a target by making it much, much harder to penetrate. If someone manages to expose some flaw, ensure that you can signature that behavior before it becomes a real threat and do something about it fast — ideally without human intervention. Policy creation to automate and manage identity and access, instead of using simple passwords as authentication, will be critical in reducing risk.
Securing IoT: Summary
To wrap up our discussion, it is important to take away that there is not a one-size-fits-all solution, and today many are trying to push too much onto tools not designed for security, like IoT message brokers. And while firewalls and application delivery controllers are tools that can help at the application perimeter, these tools need to evolve to be more focused on IoT-specific events. This evolution includes a greater emphasis on machine learning and analytics to allow for organizations to learn and respond quickly to IoT security threats. The evolution of technology doesn’t have to be a scary thing. We believe that in 2017 we will see an increase in focus on securing IoT, and many new-and-improved solutions for organizations.
The industrial internet of things is rapidly becoming a reality as the spaces where we live and work become cyber-aware and react to events happening in their environment. In these spaces are “things” of varying complexity, which include sensors to measure values like temperature, CPUs capable of sending things like text messages, and antennas to communicate over protocols like Bluetooth — all at the physical edge of our networks.
Many vendors take an elementary approach to building the internet of things by simply connecting everything to a single server in the cloud. This approach certainly simplifies the solution but fails to realize the ultimate vision of a connected world.
The following examples illustrate the gaps in a solely cloud-based IoT solution:
- A train shipping cargo across the country does not have connectivity 100% of the time. What if the brakes must be engaged quickly?
- A welding machine must be deactivated immediately when a shield is broken, not continue to run for 30 seconds while cloud servers lazily transmit data.
- Safety alerts must be issued in real time and cannot wait out a communication delay of 15 seconds or longer.
Additional examples include:
- Insurance companies cannot afford to pay for terabytes of redundant data to flow into their network daily without passing along the additional costs
- Modern and efficient LED light bulbs that communicate over ZigBee are not able to communicate directly with a cloud server
To solve the real challenges of IoT, we need edge-based computing. With IoT edge computing we benefit from five immediate and measurable results. Products like AWS Greengrass for consumers and ClearBlade EDGE for enterprises aim to deliver these results.
100% availability and uptime
One of the most important things for any IoT solution of real-world consequence is to have 100% availability and uptime. For end users this means a positive, reliable experience. For industries and companies building IoT solutions it means that factories stay operational, workers stay in safe zones and processes run at optimal efficiency. In order to achieve 100% uptime in an IoT solution, a device must be located near or within the actual location where the solution is running. It must be capable of performing all the fundamental tasks of alerting users, rerouting packages and tracking orders without the need for a large cloud server infrastructure. Edge-based processing allows for running the IoT solution in the physical location of the devices. The edge processing can keep the rules running that optimize fuel consumption during low-usage periods, cache data to map RFID tag IDs to work order systems, and allow messages from switches to go directly to the lights they control. Without the edge, all of these valuable, lifesaving capabilities can be lost the moment a router needs to be reset, the internet gets overloaded with traffic, an ISP has a server upgrade problem or a cloud vendor gets hit with an outage. To achieve the real vision and value of our IoT solutions, we must guarantee 100% uptime by leveraging the power of the edge.
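The pattern described above — safety-critical rules evaluated locally, telemetry queued until the cloud is reachable — can be sketched as follows. All names and the temperature threshold are invented for illustration:

```python
class EdgeController:
    """Evaluates critical rules locally; the cloud uplink is optional."""

    def __init__(self, cloud_online=False):
        self.cloud_online = cloud_online
        self.pending = []   # telemetry queued while offline
        self.actions = []   # local actuations taken

    def on_reading(self, sensor, value):
        # Safety-critical rule runs locally, with zero cloud dependency.
        if sensor == "kiln_temp_c" and value > 1300:
            self.actions.append("reduce_fuel")
        if self.cloud_online:
            self.upload({sensor: value})
        else:
            self.pending.append({sensor: value})

    def upload(self, record):
        pass  # placeholder for a real cloud client

controller = EdgeController(cloud_online=False)
controller.on_reading("kiln_temp_c", 1350)
print(controller.actions)       # ['reduce_fuel'] despite no connectivity
print(len(controller.pending))  # 1 reading queued for later upload
```

The point is not the rule itself but where it runs: the actuation happens even when the router is down or the cloud vendor is having an outage.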
Ability to communicate with everything
The next challenge of an internet-only IoT solution is that it fails to include so many of the devices we would want included. It takes a tremendous amount of expensive compute power to communicate with the internet. Smartphones today send a constant stream of information back and forth to the internet, but come with an expensive upfront cost of hundreds of dollars for the device as well as hundreds more in annual connection and service fees.
It is impractical to empower every device to have the communication capability of our smartphones. Alternatively, we should leverage reliable M2M solutions of the past that use protocols like OPC or CANBUS. Newer devices are using lightweight protocols like ZigBee and BLE. For all of these devices to come together we must adapt them to an edge-based device that can speak all protocols.
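Bringing those protocols together usually means normalizing each one’s payload into a common internal representation at the edge. A sketch of that adapter pattern follows; the field layouts are invented for illustration and are not real OPC, CANBUS or ZigBee frames:

```python
def from_zigbee(frame):
    # Hypothetical ZigBee-style frame: (cluster_name, value)
    cluster, value = frame
    return {"source": "zigbee", "metric": cluster, "value": value}

def from_canbus(frame):
    # Hypothetical CAN-style frame: (arbitration_id, data_byte)
    can_id, data = frame
    return {"source": "canbus", "metric": hex(can_id), "value": data}

ADAPTERS = {"zigbee": from_zigbee, "canbus": from_canbus}

def normalize(protocol, frame):
    """Convert any supported protocol frame into one common message shape."""
    return ADAPTERS[protocol](frame)

print(normalize("zigbee", ("temperature", 22)))
print(normalize("canbus", (0x18F, 74)))
```

Once everything speaks the same internal shape, the rest of the solution (rules, routing, uplink) no longer cares which wire protocol a device arrived on.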
Securing the vulnerable
As more devices communicate with IoT solutions, there is more and more opportunity for security vulnerabilities to appear. Those low-level devices don’t have the power to encrypt, don’t have the infrastructure to update and can easily be replaced. The edge device can solve these challenges not only by communicating across protocols, but also by building security on top of simple protocols. This means that while a lightbulb may transfer information a short distance over an insecure protocol, the transmission to the internet is protected both at an access layer and an encryption layer. An edge device that brings true value to an IoT ecosystem makes it possible for all of these devices to behave in their normal usage patterns while securely sharing data and interacting with one another.
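One concrete way an edge device can layer security onto a simple protocol is to authenticate payloads before forwarding them upstream, for example by attaching an HMAC so the cloud side can reject tampered messages. A minimal sketch using Python’s standard library; in practice the shared key would come from secure provisioning, not a literal in source:

```python
import hashlib
import hmac
import json

SHARED_KEY = b"provisioned-device-key"  # placeholder; provision securely

def sign_at_edge(payload: dict) -> dict:
    """Wrap an insecure local reading with an HMAC before uplink."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"body": payload, "mac": tag}

def verify_in_cloud(message: dict) -> bool:
    """Recompute the HMAC and compare in constant time."""
    body = json.dumps(message["body"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["mac"])

msg = sign_at_edge({"bulb": "kitchen", "lumens": 800})
print(verify_in_cloud(msg))   # True
msg["body"]["lumens"] = 0     # tampering in transit
print(verify_in_cloud(msg))   # False
```

The lightbulb never needs the compute power for this; the edge gateway does the signing on its behalf, which is exactly the "secure the vulnerable" role described above.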
A solution without latency
Expectations for speed and performance constantly escalate. Today, if sports scores take longer than five seconds to appear, if papers aren’t printed in 10 seconds and if stock trades don’t happen within two seconds, the solutions are inadequate. The demand for real time from users and machines in an IoT ecosystem will only escalate. In our culture, 250 ms is too long to wait for a garage to open or a light to come on. These activities should be as instant as the hardwired switches we rely on today. There is no way to achieve these expectations with a cloud server driving these interactions. Today’s network equipment is barely able to move data from your homes and businesses to a cloud, execute the required processing on multiple servers and send back a response in an acceptable amount of time. Instead, IoT solutions that include edge-based elements can run rules in real time and handle the routing of requests between a smartphone and a door lock, a ZigBee switch and a CANBUS HVAC system, or a business rule that reduces the fuel sent to a kiln when the temperature reaches a maximum.
The real-time experience is necessary for optimizing the performance, safety and experience of IoT solutions.
Affordable support model
The number one success criterion for every IoT solution is the affordability of the solution. This not only means initial development costs, but also the future operational cost as each message travels across expensive infrastructure for years to come. IoT solutions that communicate incessantly with the internet push operational costs much higher than solutions that communicate only information of significance. This means a solution that only sends a status update when the battery level is at or below 5%, rather than repeating unlimited updates, will have a much lower long-term cost of ownership. IoT solutions that include an edge allow for achieving this smarter, more optimized form of communication. An edge solution should go even further and allow an operator to change the communication rules years after the initial rollout. The ability for an edge-based IoT solution to execute filter rules, scrape data and curate device noise allows providers to engineer IoT solutions optimized for affordability.
IoT technology is quickly arriving to make our worlds more connected and seamless. The first steps we have seen from cloud-based platforms are just the beginning of achieving these goals. To achieve the real vision for IoT we must embrace both the servers and edge requirements. We must build always-on, integrating, fast and cost-effective solutions.
Companies across industries are sizing up opportunities in the internet of things and how they might seize them. Some see a chance to create premium products that could boost their profit margins; others hope IoT will help them boost productivity. Whatever the goal, there’s a huge and growing opportunity for IoT vendors to help their corporate customers achieve their goals. In fact, we expect that by 2020, the vendors producing and selling the hardware, software and other solutions that power IoT will pull in upwards of $450 billion in combined sales.
Who will be the leaders in this nearly half a trillion-dollar market? While disruptive start-ups snag a lot of headlines, the incumbents — analytics companies, cloud service providers and network equipment vendors — have an enviable head start in this race. With their long track records of innovation and their ability to integrate their products with existing systems, customers will look to these trusted providers to help them solve the many challenges of IoT adoption, from security concerns to simply getting products to work smoothly across different platforms.
But having a head start doesn’t guarantee victory. We surveyed more than 170 executives of IoT providers (the vendors) and more than 500 executives of companies looking to employ IoT solutions (the customers). We identified five possible pitfalls that vendors would be wise to avoid.
Pitfall #1: Spreading investment too thin
The majority of the vendors we spoke to said they’re investing in IoT solutions across four or more industries. It’s no wonder nearly half of them view prioritizing investments as one of their biggest challenges. While it’s tempting to cast a wide net when it’s unclear which area will pay off, these vendors risk developing solutions that are not tailored enough for early adoption — making implementation more difficult or return on investment unclear. Instead, they should hedge their short- and long-term bets by focusing on three to five use cases and developing a handful of targeted solutions within them.
Pitfall #2: Not understanding your true profit sources
There has been much buzz and excitement in the press around new consumer devices, such as wearables or smart home devices. However, we believe that the biggest long-term profits are more likely to be found in enterprise and industrial solutions. There may be revenue generated from devices themselves, but the best sources of profit are often less visible, such as the software and services required to manage, analyze and create services from the data generated by these devices.
Pitfall #3: Delivering standalone products
Vendors often sell a great product or service and then expect customers to integrate it into their own operations. But when some of the biggest challenges facing IoT adopters are related to interoperability and a lack of internal expertise, customers are wary of components that they'll struggle to use. Leading IoT vendors work with partners to deliver end-to-end solutions that customers can easily integrate.
Pitfall #4: Underestimating the funding required to achieve scale
IoT has already drawn massive investments — more than $30 billion in venture funding and over $75 billion in M&A activity by major vendors. IBM has announced plans to spend $3 billion to bring IoT solutions to market in the next few years. And well-known telcos and device manufacturers are investing billions as well.
Companies often underestimate the cost of constant innovation, launching products and encouraging adoption. To compete with established players, vendors may need to team up with partners or pour in more money than they expected. What’s more, many IoT investments are at risk of being cut before they ever have a chance of gaining traction in the marketplace. Vendors should approach these early IoT and analytics investments as they would venture funding — protecting them from aggressive cuts, giving them time to develop and tracking their success with clear milestones.
Pitfall #5: Reassigning current employees instead of hiring specialized talent
When you’re blazing forward in a disruptive landscape as an incumbent vendor, you very likely won’t have the right team in place. Only 10% of our survey respondents said they had the right people in the right roles to capitalize on the IoT. To capture this opportunity, most companies will need to hire talent with new capabilities, industry expertise and entrepreneurial experience.
The internet of things is ushering in a new industrial revolution that will forever alter our relationships with the objects we use on a daily basis. We are now entering the era of “anything as a service.”
Manufactured goods used to be built for a particular use case, to perform a particular function. When needs changed, the item became obsolete and a replacement was required. With "intelligent devices" constantly gathering data, however, there is a new paradigm in this connected world. Manufacturers can now offer customizable, upgradable products and services flowing from a single device, and customers can alter the function and the value delivered by IoT devices.
Take Tesla for example. The auto manufacturer announced that for $3,500 it was allowing customers to download a software update that would upgrade their existing car with an autopilot feature. Tesla owners were no longer constrained to wait for a new automobile to take advantage of new features, and Tesla proved to the modern world that a car is no longer a fixed object but an ever-changing, ever-customizable service that can be tailored on an ongoing basis with new capabilities of value to customers. Want added convenience when driving? Pay for a software upgrade and now your automobile operates the way you would like it to.
Tesla has innovated a new mechanism for differentiating its products. By leveraging the power of software to deliver new products, features and enhancements to existing customers, Tesla:
- Differentiated its product from every other automobile on the market by offering a way to automatically upgrade existing models with new, in-demand functionality
- Created new revenue streams by monetizing its autonomous driving software upgrade, adding to the revenue potential of both new and existing customers
- Minimized manufacturing costs by offering this functionality via software rather than requiring the development of a new physical hardware model
- Established a direct, ongoing relationship with the end-customer and deepened that relationship
As manufacturers seek new ways to build customer relationships and grow profits, they will need to become more strategic. Like Tesla, manufacturers of internet-connected devices are poised to reap the reward of recurring revenues from sales of hardware, upgrades, apps and services. The recipe driving this transformation consists of platform + apps + service. This is delivered through a combination of:
- Hardware platform (the actual hardware device and its component parts)
- Software applications that control the features and functionality of the hardware and govern the delivery of software and services
- Software monetization (licensing and entitlement management, which sorts out which device features, functions and services a customer has paid for and can therefore access)
With services becoming critical to selling solutions, monetizing software will be essential to profitability. By adopting software monetization processes, building automation manufacturers, for instance, can equip their control panels with every feature and upgrade available, then simply turn specific features on or off via software and licensing based on what the customer has purchased. Such an approach allows IoT device makers to monetize every feature in their product at no additional manufacturing cost.
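The entitlement idea behind this can be sketched in a few lines of Python. This is a hedged illustration, not any real licensing product's API; the feature names and `ControlPanel` class are invented. The device ships with every capability built in, and software simply checks the customer's entitlements before enabling each one.

```python
# All capabilities physically present in the shipped device.
ALL_FEATURES = {"hvac_control", "energy_analytics", "remote_diagnostics"}

class ControlPanel:
    def __init__(self, entitlements):
        # In practice, entitlements would come from a licensing and
        # entitlement management service; here they are passed in directly.
        self.entitlements = set(entitlements) & ALL_FEATURES

    def is_enabled(self, feature):
        return feature in self.entitlements

    def upgrade(self, feature):
        # "Shipping" a new feature is just a license change, no new hardware.
        if feature in ALL_FEATURES:
            self.entitlements.add(feature)

panel = ControlPanel(["hvac_control"])
assert panel.is_enabled("hvac_control")
assert not panel.is_enabled("energy_analytics")
panel.upgrade("energy_analytics")   # customer buys the upgrade
assert panel.is_enabled("energy_analytics")
```

The manufacturing cost of the panel is identical whether the customer licenses one feature or all three; only the software entitlement differs.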
We have seen this trend already reshaping the manufacturing space. The key challenge for traditional device manufacturers will be to stop viewing their products as fixed objects with fixed-in-time features and functions. They must start thinking and acting like service providers, who can deliver new value to existing customers to accommodate their changing needs. This will be done by understanding how software monetization plays a role in transforming new value propositions into revenues.
So you’ve generated a tsunami of data from your consumer-grade internet of things devices — now what?
One of the odder aspects of IoT is how the "things" get all the attention in conversations about it. In reality, for IoT to deliver real value, the things have to coordinate their activities over distances far greater than Wi-Fi or ZigBee can span. This means IoT will involve millions of devices utterly dependent on some form of controlling intelligence reached via the internet. So while the cute gadgetry gets the attention, the real value will reside in the back-end systems that conjure value out of bit streams in real time. This, in turn, raises the question of latency.
Poor latency at mass scale will doom many IoT projects to failure. A consumer who turns on a lightbulb expects it to turn on when the switch is flipped — not a second later. Anyone who has tried to hold a conference call in which one of the participants is 1500 ms behind everyone else will be viscerally aware of how badly humans cope with latency issues, and the gadgetry in IoT will be no different.
To make matters worse, many IoT applications will involve optimizing the behavior of swarms of devices over very short timescales using real-world wide area networks, which are far more idiosyncratic in their behavior than a tame development environment.
Traditionally in IT we can make accurate decisions on small volumes of data quickly, or on very large volumes of data several minutes after the fact. Online transaction processing systems think in terms of thousands of transactions per second. Conventional analytic technologies can handle much higher volumes, but with much longer latencies. For IoT to actually work, we need to make millions of detailed, accurate decisions over very short (under 5 ms) timescales, which requires real-time data management.
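To make the latency budget concrete, here is a hedged Python sketch of what enforcing a per-decision window might look like. The function and event names are invented for illustration; the point is that a real-time pipeline that cannot decide within its window must degrade predictably (here, fall back to a safe default) rather than queue work and fall ever further behind.

```python
import time

BUDGET_S = 0.005  # the sub-5 ms window discussed above

def decide_within_budget(decide, event, default):
    """Run a decision function, but report the safe default if it blew its budget."""
    start = time.perf_counter()
    result = decide(event)
    elapsed = time.perf_counter() - start
    if elapsed > BUDGET_S:
        return default, False   # budget blown: caller sees the default
    return result, True

# A trivial decision: flag a sensor reading above a limit.
result, on_time = decide_within_budget(
    lambda e: e["reading"] > 40, {"reading": 55}, default=False)
```

A production system would of course measure and act on latency at far finer granularity, but even this toy shows the shift in mindset: the deadline is part of the contract, not an afterthought.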
Companies such as Push Technology are focusing on improving connectivity by providing reliable "reality-aware" network links, but mass-scale, accurate decision-making represents a challenge to the IoT industry. Over the last decade we've seen Hadoop and related technologies revolutionize the economics of slower batch systems, but there hasn't been a similar, widespread change in our ability to make high-volume, accurate, compromise-free decisions in real time, at the scales at which we expect IoT to perform.
When the first data warehouses were deployed by supermarket chains, there was a period of joy as the grocery industry realized it could measure, say, exactly how many cans of baked beans were sold in Scotland the day before. That was followed by a period of frustration when grocers realized they had no complete ideas for turning this information into more money. We are now seeing something similar with IoT, with a new twist: the commercial usefulness of IoT sensor data arguably has a half-life of about 500 ms. Being able to act in real time, within milliseconds, will be key to making money in many IoT use cases.
A further wrinkle is that the actual logic that will be implemented will be a far cry from the “Dog & Pony Show” demos now being used to promote IoT technology. With a product development lifecycle of 12-18 months and consumer expectations that physical devices like fridges will work for decades, the average IoT play will end up supporting at least 7-10 different generations of the same product — and that’s before first movers acquire their competitors.
So what can we conclude from all this? Much of the noise and hype in IoT focuses on the part of the process the industry is comfortable with — high volumes and high latencies, with any decisions being either obvious (such as switching lights off) or made by humans after the fact. But these systems are generally useful for optimizing existing processes. The problem with such optimizations is that the benefits are small, finite and decrease over time due to competition and technological change.
The more radical kind of change, which involves totally new use cases that involve automated systems making millions of accurate and precise decisions per second, is only becoming possible now. We’re moving away from an era where maybe 50% of all IP addresses were ultimately connected to a human, to one in which people will be outnumbered 20:1 by devices.
The real money is going to be in emergent use cases that combine high volumes, single-millisecond latency and the ability to make millions of commercially useful decisions in the trillions of tiny windows of opportunity each day.
VoltDB is ready for the real-time data management challenges of IoT. Are you?
In the digital age, even Santa Claus is under pressure to deliver an improved customer experience. So he will be grateful this year for a new tool to help him plan his deliveries across the globe. Santa’s 2016 world tour will be faster, more agile and more productive than ever thanks to the internet of things. IoT is helping supply chain managers across the globe to develop more efficient manufacturing and distribution processes than ever before.
An IoT-enabled supply chain provides critical information such as location, speed, temperature, vibration and product condition. This information can be used to calculate ETA and to take action if products in transit aren't maintained in the contracted manner. It can also be used to synchronize multiple assets to remove wait times and congestion; in short, to eliminate a class of supply chain waste that practitioners spent years building workarounds for, because the granular visibility needed to eliminate it wasn't available. An IoT-enabled supply chain can decrease costs, but it is only one aspect of a resilient supply chain.
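The two calculations mentioned above, ETA from location and speed, and condition alerts against contracted limits, can be sketched as follows. This is an illustrative Python sketch with invented field names and thresholds, not any particular supply chain platform's logic.

```python
# Hypothetical ETA-and-condition logic for an in-transit shipment.
def eta_hours(distance_km, speed_kmh):
    """Estimated hours to arrival from remaining distance and current speed."""
    if speed_kmh <= 0:
        return float("inf")   # stopped: no meaningful ETA
    return distance_km / speed_kmh

def condition_alerts(readings, max_temp_c=8.0, max_vibration_g=2.0):
    """Flag sensor readings that violate the contracted transport conditions."""
    alerts = []
    for r in readings:
        if r["temp_c"] > max_temp_c:
            alerts.append(("temperature", r))
        if r["vibration_g"] > max_vibration_g:
            alerts.append(("vibration", r))
    return alerts

assert eta_hours(450, 90) == 5.0
alerts = condition_alerts([
    {"temp_c": 6.2, "vibration_g": 0.4},
    {"temp_c": 9.1, "vibration_g": 0.3},   # temperature excursion
])
```

An alert raised while the goods are still moving gives the shipper a chance to intervene, rather than discovering spoiled product at the dock.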
A specter of the Christmas-that-almost-wasn’t arose recently when a large shipping company, Hanjin Shipping Co., suddenly went bankrupt. Around 90 huge container ships were stranded at sea, during the peak pre-Christmas shipping period. These ships were carrying half a million containers of electronics, clothing and furniture worth about $14 billion. Few ports would let them in, creating logistical problems around the world and giving many retailers a good reason to worry in the run-up to the holidays.
While many may think of supply chains as linear processes — where Christmas goods come directly from the supplier (the North Pole) to sit under their tree — these are actually complex systems that create, receive and distribute products to customers with many checkpoints along the way. In the modern supply network, IoT is the link that allows us to intelligently connect all the people, processes, data and things at each point. This collection of technologies — sensors, servers, analytics engines, etc. — is able to make sense of real-time data coming from each step in the supply chain, allowing managers to evaluate and predict events more accurately than ever before.
A disruptive event such as the Hanjin bankruptcy may be manageable after the fact, but the damage is already largely done — late shipments, lost revenue and unhappy customers. This is why it is important to prepare in advance and deploy technology that makes it possible to identify possible disruptive events before they happen.
In the case of Hanjin, rumors about its financial instability circulated before the bankruptcy. The trouble is that it's difficult for supply chain managers to act on rumors, and even harder to separate fact from rumor in discussions with the vendor. If a shipper or customer relying on Hanjin had instead received an actionable alert about Hanjin's financial problems, it could have taken steps to mitigate the risk, such as finding an alternate shipping company, or ensuring that critical or time-sensitive inventory such as Christmas stock always traveled with a transportation company with a low-risk profile.
The holiday season is usually a merry time for manufacturers, retailers and consumers, and with increasing IoT adoption, happiness and peace of mind will abound. By connecting all the elements in their operating infrastructure, supply chain managers will enjoy lower costs and greater insight, with the ability to respond in real time to changing conditions and needs. Ensuring your supply chain has visibility into risks, regardless of their origin, and the ability to manage or mitigate them will help ensure your company has a merry holiday season as well.
Venture firms throw billions of dollars at blockchain startups, while both startups and enterprises prepare to capitalize on the potentially massive opportunities for blockchain beyond finance. Developers are applying the approach to solve problems in power, supply chain, health care, media and many other use cases. The technology is evolving rapidly and a rich ecosystem of players is assembling. Based on interviews with industry stakeholders and extensive secondary research, we find that the power sector has emerged as an early leader for blockchain development outside of finance, with the potential to cut costs and streamline processes in several types of power industry transactions.
Blockchain is an innovative database and transaction technology with broad potential
Outlining the basics of blockchain and dissecting its value proposition, we conclude that blockchain can be a great enabler but has its fair share of hurdles and limitations:
- Blockchain is a distributed database technology that securely maintains a growing ledger of data transactions and other information among participants in the network. Developers are attracted to blockchain for its inherent security, data integrity and decentralized nature. The concept of “smart contracts” is a new innovation that furthers blockchain’s value.
- Bitcoin was the first implementation of blockchain technology. Two other major blockchain infrastructure platforms have emerged: Ethereum and Hyperledger. A wide range of products and projects are being developed on top of bitcoin, Ethereum and Hyperledger, while some choose to build their own blockchain solutions from scratch.
- Financial services is the most developed application of blockchain technology. Use cases in power, supply chain, agriculture and health care are beginning to take form, while the applicability in other sectors is less clear.
- Blockchain still faces several obstacles on its path to broad adoption, including a shortage of talented programmers, the fallibility of smart contracts, aversion to using cryptocurrencies and an evolving regulatory environment.
The power industry will find blockchain to be a useful tool in the right applications
In the power industry, we examine the potential for blockchain in three key power applications, finding differences in value proposition, challenges to implementation and adoption timeline:
- Blockchain has a lot of potential to modernize wholesale electricity transaction processes as well as the financial markets that support them. However, it faces an uphill battle against influential, conservative stakeholders, who will take a wait-and-see approach to the technology.
- Peer-to-peer electricity trading is emerging as a high-profile application of blockchain, allowing neighbors to sell their excess power to each other. Blockchain can provide a high-fidelity, middleman-free settlement system for these trades, but poor economics and regulatory barriers will keep peer-to-peer networks niche in the near term. The application of blockchain to “transactive energy” — in which participants buy and sell power in small and frequent transactions in reaction to fluctuating electricity prices — closely matches peer-to-peer transactions in its value and challenges.
- Meanwhile, environmental attribute markets, such as those that exchange renewable energy credits (RECs), are already implementing blockchain, but only in the least-regulated applications, where both the barriers and benefits of implementing blockchain are relatively low.
What is blockchain?
Blockchain is a distributed database technology that securely maintains a growing ledger of transactions (defined as transfers of data natively tracked by the system) and other information among participants in the network. Developers are attracted to blockchain for its inherent security, data integrity, decentralized nature, and its ability to simultaneously provide both public openness and effective anonymity. Bitcoin was the first implementation of blockchain, which proved its robustness and overall value.
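The core mechanism behind that data integrity can be shown in a few lines. The sketch below is a deliberately minimal illustration, not a real blockchain implementation (it has no consensus, mining or networking): each block stores the hash of its predecessor, so altering any past transaction invalidates every later block in the chain.

```python
import hashlib
import json

def make_block(transactions, prev_hash):
    """Create a block whose hash covers its transactions and its predecessor's hash."""
    payload = json.dumps({"tx": transactions, "prev": prev_hash}, sort_keys=True)
    return {"tx": transactions, "prev": prev_hash,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

def valid_chain(chain):
    """Verify every block points at the actual hash of the block before it."""
    return all(later["prev"] == earlier["hash"]
               for earlier, later in zip(chain, chain[1:]))

genesis = make_block(["genesis"], prev_hash="0" * 64)
block1 = make_block(["alice pays bob 5 kWh"], prev_hash=genesis["hash"])
assert valid_chain([genesis, block1])

# Rewriting history changes the genesis hash...
genesis["tx"] = ["genesis (tampered)"]
genesis["hash"] = make_block(genesis["tx"], genesis["prev"])["hash"]
assert not valid_chain([genesis, block1])   # ...and the link to block1 breaks
```

Real blockchains add distributed consensus on top of this hash-chaining so that no single participant can rewrite the ledger, which is what gives the technology its decentralized character.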
Bitcoin, Ethereum and Hyperledger are the three main games in town
Some blockchain developers choose to build custom blockchain solutions with their own toolsets, but few true experts are capable of building blockchain solutions from scratch. Most developers therefore leverage the available tools from bitcoin, Ethereum and Hyperledger. Each of the three platforms has strengths and weaknesses, depending on the specific requirements of a given project. Bitcoin does not offer native smart contract functionality, although companies like Counterparty and RSK are trying to extend that functionality into bitcoin; Ethereum and Hyperledger both offer native smart contract utility. Hyperledger is the only one of the three to provide permissioned blockchains, which offer advantages for projects that admit only known participants. Bitcoin is widely regarded as incredibly secure, although concern is beginning to emerge over the consolidation of mining pools; Ethereum's reputation is in question after the DAO incident and an ongoing series of denial-of-service attacks; and the jury is still out on Hyperledger, as there have not been enough at-scale deployments.
Beyond these three dominant core infrastructure providers, the blockchain value chain is rapidly becoming rich with a wide range of entities, many of whom build on the core three (bitcoin, Ethereum and Hyperledger), as well as many who build their own solutions from scratch. This ecosystem includes horizontal platform solution developers, vertical solution providers and enterprises that are developing products or contributing code to other projects.
Developers are targeting multiple enterprise use cases beyond finance
While many startups and major ICT players are developing horizontal blockchain platforms, plenty of startups are developing industry-specific enterprise blockchain solutions; meanwhile major enterprises from specific verticals are exploring the impact blockchain can have on their industries. Blockchain will likely have an impact on every industry, but there are several industries where the specific relevance is beginning to take shape.
We analyze three potential use cases for blockchain in power transactions
Power transactions come in many different flavors, as electricity changes hands several times between generation and consumption. Here, we use case studies to analyze three types of energy transactions — wholesale, peer-to-peer and environmental attributes — exploring the value of blockchain, the solution architecture, the barriers and problems to be solved, and the outlook. As an example, a blockchain solution for peer-to-peer transactions is outlined below:
Stakeholders should investigate blockchain, but be realistic about fit and readiness
Blockchain is an excellent tool for the power space, streamlining transactions that have long required manual work and enabling new types of transactions that would be impractical otherwise. Much of the compatibility between blockchain and power transactions is related to the ease of measuring electricity with metering infrastructure, which allows data and markets to coordinate in a robust way. However, power incumbents are generally conservative players that are difficult to woo or uproot.
Overall, peer-to-peer applications provide the greatest value over alternatives, but are held back by regulatory barriers and a limited market size for the time being. Blockchain has a lot of potential to modernize wholesale transaction systems, but it faces an uphill battle from influential, conservative stakeholders. Meanwhile, with low barriers to blockchain adoption but also relatively low value that can be gained, REC markets are already seeing some adoption, but are unlikely to feel much impact from the technology. Other applications like EV charging and retail have little to gain from blockchain. These findings are summarized below:
Lux Research analyst Katrina Westerhof contributed to this article.
Industry 4.0 is the wave of the future for the manufacturing of things.
Originally conceived as a vision of the German government as part of its High-Tech Strategy 2020 Action Plan, Industry 4.0 is frequently lauded as the fourth Industrial Revolution. Its impact could well surpass that of the first Industrial Revolution, which began in Britain in the late 1700s and took us from an agrarian economy onto a path toward mass-producing affordable goods using steam power, electricity and, eventually, computers and automation. In other words, we've progressed from the horse and buggy to the Model T, and now we're on to self-driving cars!
What will power the smart factories of the Industry 4.0 era? The internet of things, cloud computing and cyber-physical systems (CPS) technologies. Cyber-physical systems are powered by enabling cloud technologies that allow intelligent objects and cloud-based programmatic modules to communicate and interact with each other. These new cyber-physical manufacturing facilities use robotics, sensors, big data, automation, artificial intelligence, virtual reality, augmented reality, additive manufacturing, cybersecurity systems and other cutting-edge technologies to deliver unprecedented flexibility, precision and efficiency to the manufacturing process.
Yet while the Industry 4.0 revolution is forming, it’s important for companies aiming to be at its forefront to carefully consider which platforms are best positioned to deliver the promise of this exciting future, and what capabilities those platforms should possess.
Developing products, business processes and apps within an Industry 4.0 framework requires thinking beyond what any single product or system can be expected to do. In fact, the most exciting aspect of the Industry 4.0 vision is open and evolving industrial systems that can rapidly take advantage of the latest technological innovations. Imagine what the future could hold for an IoT product that has a complementary ecosystem built around it.
Intelligence sharing for smart factories
Until now, the manufacturing automation landscape has consisted of technology and data silos organized around hardware vendors. Companies with a global footprint of factories often end up with a heterogeneous, incompatible mix of automation technologies. And while these individual systems may each collect and transmit data, they are not designed to make that valuable data easily available to other manufacturing systems, whether within the same factory or in another state or country.
IoT cloud platforms provide a powerful solution for harmonizing incompatible connected devices. On the factory floor, IoT compatible gateways provide a mediation layer between the proprietary protocols used by many vendors’ automation systems and the open internet-based protocols that are the foundation of IoT. Data from disparate manufacturers can be normalized in the gateways before transmission to an ingestion queue in the IoT cloud, while edge logic can be pushed to the gateways for local control of connected devices.
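The normalization step can be pictured with a small Python sketch. The vendor names, payload fields and units below are invented for illustration: the gateway's job is to map each proprietary format onto one canonical record before forwarding it to the cloud ingestion queue.

```python
# Hypothetical gateway-side normalization of two vendors' payloads.
def normalize(vendor, payload):
    if vendor == "vendor_a":
        # e.g. {"tmp": 215, "dev": "press-01"}; temperature in tenths of a degree C
        return {"device_id": payload["dev"], "temp_c": payload["tmp"] / 10.0}
    if vendor == "vendor_b":
        # e.g. {"temperature_f": 70.7, "id": "press-02"}; temperature in Fahrenheit
        return {"device_id": payload["id"],
                "temp_c": round((payload["temperature_f"] - 32) * 5 / 9, 1)}
    raise ValueError(f"unknown vendor: {vendor}")

a = normalize("vendor_a", {"tmp": 215, "dev": "press-01"})
b = normalize("vendor_b", {"temperature_f": 70.7, "id": "press-02"})
# Both records now share one schema and one unit, ready for the ingestion queue.
```

Once every payload arrives in the cloud in the same shape, analytics and machine learning can treat a press in Germany and a press in Ohio as comparable data sources.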
Mining potential of the cloud for healthy complex systems
But the real potential lies in the cloud. When cloud-based cybernetic intelligence is linked to global manufacturing operations, machine learning algorithms can identify patterns and extract insight that can optimize operations. As a factory in one region creates more optimized workflows that improve efficiency, those benefits can be rapidly exposed and propagated throughout the global operation. As predictive algorithms identify signs of potential system or subsystem failure in one factory, other factories can act quickly to avoid catastrophic incidents that can ripple through the entire business.
It’s useful to think of modern manufacturing environments as complex and interconnected living organisms, similar to the human body. Ensuring optimal health depends on the ability to:
- Rapidly identify exposure to pathogens
- Efficiently analyze root causes and potential secondary effects
- Develop effective remediation strategies
The first step requires diagnostic tools to visualize data across these interconnected systems; without them, steps two and three are very difficult to accomplish. And if it is too costly or time-consuming to employ these tools, the patient is likely to get worse instead of better. So, what tools are necessary to realize an Industry 4.0 vision that will take smart factories to the next level?
The missing link: Programmer-less visual design
One of the most significant impediments to realizing the Industry 4.0 vision is implementing the necessary tools and applications to holistically visualize operations, identify opportunities for improvement and implement changes. Even in an IoT environment, applications must be created to take advantage of all of the data available in the cloud.
The traditional approach calls for hiring an army of programmers to build a “solution.” Not only is this approach costly and time-consuming, but it’s highly inflexible. Living organisms are constantly adapting to their environment; businesses and manufacturing operations are no different. As the environment changes, they must react rapidly. Business optimization is not a once-and-done event, but a constant battle to bring all the forces within a business into equilibrium. If every change to a cyber-physical system costs a million dollars and takes a year to implement, the promise of Industry 4.0 will never be fully realized.
But what if programmers can be removed from the equation? Okay, an entirely programmer-free approach to automation is still some distance off. But an approach that minimizes the use of programmers and maximizes the use of business analysts and subject matter experts is starting to emerge. The key is visual modeling, analytics and orchestration.
Visual modeling tools allow the elements of the system (and the data associated with those elements) to be modeled by the people who understand them best. Models can be refined and expanded without programming effort as the complexity of the system grows. Modeling the interrelationships and dependencies among the component parts of a system is as important as modeling the components themselves, so any visual modeling tool should make it easy to define these relationships.
Once a system has been modeled and its data ingested into the cloud, that data should be exposed for decision-making. Depending on their roles, different stakeholders will see different slices of the data to inform their understanding. An Industry 4.0 development platform will allow easy extraction and normalization of data from a wide variety of sources (both real-time and non-real-time), with useful dashboard views and alerts built using drag-and-drop technology. Platforms will also allow data to be fed into machine-learning pipelines and emergent patterns to be easily viewed.
Orchestrations that influence the behavior of the entire system can then be created. Visual orchestration tools will allow drag-and-drop workflow design, built from a palette of programmatic components, and the resulting workflows can be tested and deployed as microservices using a DevOps methodology.
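Behind such a drag-and-drop designer, a workflow is ultimately a composition of small programmatic components. The sketch below is a hedged illustration of that idea in Python; the step names (`parse`, `flag_anomaly`, `route`) and thresholds are invented, standing in for the palette components a designer might wire together.

```python
# Compose a palette of small components into one workflow.
def compose(*steps):
    def pipeline(record):
        for step in steps:
            record = step(record)
        return record
    return pipeline

def parse(record):
    return {**record, "temp_c": float(record["raw"])}

def flag_anomaly(record):
    return {**record, "alert": record["temp_c"] > 90.0}

def route(record):
    return {**record, "queue": "maintenance" if record["alert"] else "metrics"}

workflow = compose(parse, flag_anomaly, route)
out = workflow({"raw": "97.4"})   # hot reading routed to the maintenance queue
```

Because each component is self-contained, a business analyst could reorder, replace or retune the steps in a visual editor without rewriting the others, which is exactly the flexibility the programmer-less vision calls for.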
Finally, with cloud platforms of the future, visual design services will be consumed as a service, and applications can be deployed on the public or private cloud of choice.
With tools such as these, the work of defining and evolving the efficiency of the system will be controlled by the people who know it best.
From IoT clouds to Industry 4.0 clouds
The vision of Industry 4.0 is a long way from the days of producing cloth by laboring over spinning wheels by the hearth. Modern IoT clouds are now beginning to deliver the tools needed to make the Industry 4.0 concept a reality. Platforms running on popular IoT clouds are already delivering on this vision of visually designed factory automation to catalyze the transformation of the manufacturing landscape. These new technologies promise to unleash a virtual tidal wave of change throughout the manufacturing industry. And that rising tide promises to lift all boats, as businesses become increasingly capable of rapidly adapting their manufacturing to meet the needs of an ever-changing competitive landscape.