If you’re like me, you’ve been a believer in the potential of the internet of things since the beginning. It’s hard to deny the potential positive societal impacts a connected world can drive, but does that make it a vital investment for companies? Like most emerging technologies, there are always questions about whether you can make a true business case for it. While some reports forecast that nearly $6 trillion will be spent on IoT solutions over the next five years, for this to truly occur IoT needs buy-in at the enterprise level. Spending on IoT projects will only continue to climb if it helps companies realize true ROI and ultimately drives positive business outcomes.
Since IoT is seeing a strong adoption rate around the world, it’s clear to me that there are significant benefits and ROI opportunities that prove that IoT technology can have true business value. We’re seeing companies move from simply adopting internet of things technologies to actually putting them in position to drive their business. IoT is producing measurable results, and the top performers who are seeing the greatest results treat their IoT initiatives as business projects, not IT purchases.
Let’s take a look at some of the key proof points from Vodafone’s 2016 IoT Barometer that outlines how IoT is becoming a vital investment for companies:
- Recognizing IoT ROI: Yes — a lot of acronyms, but IoT cannot move forward if there aren’t true return on investment (ROI) opportunities for organizations. Luckily, 63% of IoT adopters are seeing “significant” ROI in IoT projects, up from 59% in 2015. Whether it’s connected supply chains for manufacturers or smart office capabilities for employees, businesses are seeing significant results from their IoT deployments, changing the way they do business. In fact, adopters are seeing a 20% improvement in key business indicators like revenue growth and cost reduction as a result of investing in their IoT programs.
- Driving future success: Successful businesses always have an eye toward the future. For businesses in the Americas — e.g., U.S., Canada and Brazil — IoT was reported to be a top business focus, as 74% of companies view IoT as critical for the future success of their organization. Additionally, 48% of companies globally are using IoT to support large scale business transformations such as helping to change a manufacturing business into a service company.
- Uncovering new opportunities: IoT is driving ROI for companies by facilitating new partnerships to serve customers in new ways. In fact, 61% of businesses say they “consistently” see IoT as an integral part of wider business initiatives. To meet the rapidly changing demands of today’s customers, companies are continually forced to redefine their business strategies in order to meet these needs, stay relevant and continue to see profitable growth. IoT data is informing business strategy in new ways, as 64% of businesses consistently use big data and analytics platforms to support decision-making.
These three key points illustrate how companies that are investing in IoT are seeing real results on their bottom line. For businesses, the internet of things is no longer just an IT project to help internal functionality. Today, investing in IoT is a business development strategy that drives growth, profitability and depth into what companies can offer their clients. The proof is in the ROI — the internet of things is a vital investment for the modern company.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.
Often, technology companies focus on innovators who started with a clean slate. Unfortunately, this is far from reality for most businesses: most organizations have critical initiatives to become digitally native and agile enough to meet customer expectations. My employer helps companies with both legacy and modern applications, which is the reality in most enterprises today. New initiatives to modernize still rely on legacy systems which contain critical data and business logic. Although some organizations are proactive in attempting to understand their highly complex systems, most are struggling. The technology to help with this situation is not as widely deployed as it needs to be. Some application performance monitoring (APM) solutions provide technology to build maps and paths of transactions via tracing, but few work at scale in production. This technology not only provides visibility into interactions at system boundaries, but also measurement, allowing performance and scalability bottlenecks to be corrected. Adding performance testing (load testing) products and services coupled with APM helps immensely.
Many organizations decide to build these new systems and apps using elastic cloud capabilities or services, thinking this will protect them from flash traffic or the need to scale. The oversight is that these elastically scalable applications rely on legacy technology housed in traditional data centers without elastic scalability. As these new apps launch, they may cause cascading failures. Digital transformation, unfortunately, typically requires interfacing legacy systems with new systems of engagement.
An example of this: in reality, many enterprises are adding capabilities to take advantage of IoT requirements. Just last week I had one such discussion; the enterprise was adding location awareness in a mobile app. It wanted to present context-relevant app functionality and offers depending on the user’s profile, preferences and history. To meet this new requirement a lot of additional data was being collected and fed into legacy systems to allow these new capabilities to function. In this case, the legacy system was brittle and couldn’t handle the additional transactions and data. The enterprise realized this too late to make the required changes, and the net result was a rather embarrassing launch, which had to be rolled back. The overloaded system affected day-to-day business operations, not just the mobile app users.
The root cause was that proper end-to-end performance testing and scalability analysis were overlooked, and the business suffered. The organization reactively implemented visibility and did additional testing, but this could have been avoided. Poor planning and the need to quickly address system failures are a challenge for most. Typically, our organization gets the frantic phone calls to license software to analyze the scalability bottlenecks in highly complex production systems. The question for me is: when will organizations stop being reactive? Will it require a mindset change, a software change or infrastructure changes?
Feel free to comment here or via Twitter @jkowall, and thank you for reading!
Sorry for the clickbait opening! What I mean is that IoT is just a tiny fraction, and possibly the most obvious evolution, of the tools we need to remake humanity’s interaction with our world: how we consume, preserve and sustain our societies.
Computation and communications are cheap. Our things measure the world around us and tell us about their “lives.” They act as an extension of ourselves, feeling, tasting, sensing pressure, vibration, color, heat and a lot more. Modern internet and other communications technology allow us to transmit this data across the street or across the galaxy! Data science and applied mathematics techniques give us a way of learning about the relationships in this data. Big data technologies let us store and perform computations on this raw data, turning it from data to information and finally to knowledge. The internet of things is allowing us to discover the Universe through these things we’ve built.
Indulge me a bit.
Thousands of years ago, we solved everything empirically, but we had precious little data and only rudimentary interpretation techniques. The fire burned our hand, we stopped putting our hand in the fire … a Homo erectus genius is born.
A few centuries ago, we’d developed complex symbolic manipulation (algebra, the Calculus, etc.) that allowed us to develop analytical solutions; formulas that described often relatively simple phenomena … we plug in the inputs and we get the result. Sometimes innovation itself was the process of deriving formulas from trivial observations (e.g., coming up with Pythagoras’ Theorem relating the lengths of sides in a triangle). And before anyone defends the poor and maligned Pythagoras, by “trivial” I meant “simple” observations, not “unimportant” ones.
But we were still dealing with small data and phenomena that were relatively simple. Not so in today’s world. Between McKinsey, Gartner and others, you can get a sense for just how real the big data deluge is. IoT has created oceans of data (forgive the relapse into the water metaphor, I’ll spare you the data lake even), and there’s been a positive feedback cycle whereby our species has developed incredible numerical techniques (as opposed to the symbolic ones of the previous paragraph) for describing the world around us, further justifying capturing more data, accelerating the advancement of the applied mathematics and … well, you get the point.
When I say numerical techniques, I mean that we aren’t using formulas to derive new formulas or algebra to derive new laws. Now we use the relationships hidden in IoT data to work backwards, in a sense. We start with the big data, and we may not even use that to develop a theoretical understanding of the world … we may simply assume a “black box;” the world turns, the sensors of our IoT capture the details, and we derive what’s going to happen next without understanding why (e.g., artificial neural networks in the field of AI). Lacking the fundamental why isn’t acceptable for a PhD dissertation — it’s no way to do pure research, but in the commercial world we can get by without knowing the why. In fact, freeing ourselves from the why question can permit all manner of business innovation even if it won’t win anyone a Nobel Prize. Think about this as being able to say when a particular component in a factory is going to fail, but not having a complete understanding of how this comes about.
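To make that black-box idea concrete, here is a minimal Python sketch: a nearest-neighbor vote over hypothetical sensor history predicts whether a component is about to fail, with no model of why it fails. Every reading and label below is invented for illustration.

```python
import math

# Hypothetical historical sensor readings: (vibration mm/s, temp C) -> failed within 30 days?
history = [
    ((2.1, 60.0), False), ((2.3, 62.0), False), ((2.0, 58.0), False),
    ((7.8, 91.0), True),  ((8.2, 95.0), True),  ((7.5, 88.0), True),
]

def predict_failure(reading, k=3):
    """Vote among the k nearest historical readings -- no physical model of *why*."""
    ranked = sorted(history, key=lambda item: math.dist(reading, item[0]))
    votes = [failed for _, failed in ranked[:k]]
    return votes.count(True) > k // 2

print(predict_failure((8.0, 93.0)))  # a new reading close to past failures
print(predict_failure((2.2, 61.0)))  # a new reading close to healthy history
```

The model says nothing about bearings, lubrication or metallurgy; it simply exploits the relationships hidden in the data, which is exactly the trade described above.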
For those of you interested in the philosophy, you might even say that the symbolic techniques of academia are simply inadequate to the tasks of understanding the complex world, regardless of how far we advance (see Gödel’s Incompleteness Theorems).
So why is this important? Why is it exciting? Why does it matter? For the first time in the history of our species, we have a cost-effective way of learning about the “private lives” of our things, picking apart not how they’re supposed to work, but how they actually work. Not how they’re intended to be used, but how they’re actually used. Our things can learn and improve. They can work more effectively, efficiently, live longer and at a lower cost. Society gets more for less, and we all live better for it. Seen this way, IoT is transformative … now that’s exciting!
Data has value. Combined data has more value. The data pent up in the “things” of the internet of things will promote bringing more data together from more sources than has ever been achieved in human history. If your company doesn’t have an IoT strategy, consider getting one. For a good brief on IoT and its value, take a look at my other blogs.
A key moment in time
Boards of directors and CEOs that were questioning executives about PCs in the 1980s, the internet and e-commerce in the 1990s, mobility and cell phones in the 2000s, and data centers and cloud computing in the 2010s are today asking about the internet of things. Ignoring their interest isn’t likely to end well, considering how complex IoT can be — and how essential it will be for competing in the digital economy.
IoT: Everything you want it to be
Definitions of IoT vary for the same reason everyone’s experience with IoT varies: it depends on the types of devices they connect to and the data they extract as a result. Pulling temperature data from a thermostat and activity logs from a furnace isn’t the same as tracking the flow of oil on an offshore drilling platform.
And those are just two examples. Thousands more are out there, adding to the rich diversity of IoT. This year alone, Gartner says that we can expect 6.4 billion things to be actively connected — a 30% increase over 2015’s total. Moving fast to glean value from the resulting streams of data should be a top priority for most global businesses. The big questions are how to do it and where to leverage it.
The first of these two questions can be tough and requires some detailed explanation concerning device connectivity, the type and variety of data, the networks, and the process of digitizing what is primarily analog information for the physical world.
Reasons your business should leverage IoT
Experienced enterprises are building platforms to handle the complexity of it all. Not surprisingly, we’re serving a diverse set of use cases and benefits such as predictive maintenance.
Consider offshore oil platforms receiving regular upgrades and repair whether they need it or not. Why? Safety requirements and cost avoidance demand that companies take necessary precautions with industrial equipment. So, in the absence of data to the contrary, energy companies order maintenance to keep crews and the environment safe. Now, if that same equipment were under constant sensor surveillance with data collected and analyzed regularly, more efficient and precise repair suddenly becomes possible. This can avoid both types of errors — withholding maintenance when it’s really needed, and performing maintenance when it’s not needed.
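As a toy illustration of that condition-based approach, maintenance can be triggered only when recent sensor data drifts outside the equipment’s normal operating band. All readings and the three-sigma threshold below are hypothetical, not drawn from any real platform:

```python
from statistics import mean, stdev

def needs_maintenance(baseline, recent, sigmas=3.0):
    """Flag maintenance only when recent readings drift beyond the baseline's normal band."""
    mu, sd = mean(baseline), stdev(baseline)
    return abs(mean(recent) - mu) > sigmas * sd

# Hypothetical pump-bearing vibration readings (mm/s)
baseline = [2.0, 2.1, 1.9, 2.0, 2.2, 2.1, 1.9, 2.0]
healthy_week = [2.1, 2.0, 2.2, 1.9]
worn_week = [3.4, 3.6, 3.9, 4.1]

print(needs_maintenance(baseline, healthy_week))  # skip the scheduled teardown
print(needs_maintenance(baseline, worn_week))     # schedule a precise repair
```

The point is the policy, not the statistics: repairs follow the data instead of the calendar, which is how both error types get avoided.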
Farming is a tough business. We rely on a medical doctor, a lawyer or a repair person only every once in a while, but we rely on farmers every day. Successful farming can be largely based on gut feel and experience. What if it didn’t have to be that way? Consider what I like to call “Agriculture 3.0”: what if farm tractors and equipment imbued with instrumentation and computer systems that place fertilizer with the precision of a lab pharmacologist could also read data from moisture sensors in the fields, adjusting doses as they travel? Mother Nature would still have her say during the planting season, of course, but the odds of an optimal crop yield at the time of reaping would also increase dramatically.
There are many other internet of things benefits, and many more will emerge as time goes on. These will be seen as more business leaders answer the question raised by the “perpetual connectivity” of IoT. What could your business do if your product, location, assets, people and customers (things) that embody value were perpetually connected to each other and to a central system? What would you do first?
That we can even ask those questions today speaks volumes for how far we’ve come since the 1980s. From the PC revolution to the cloud to smartphones, data centers and finally IoT, there’s a never-ending sea of valuable information that is becoming available to smart business leaders.
Industrial IoT is ushering in the era of IT and OT convergence. However, making traditional operational technology (OT) assets smarter with IT technologies also means a larger cyber threat surface and, hence, more exposure to cyberattacks, which are growing smarter as well.
Unlike in IT, a breach in OT security means not just data and identity loss, but possible breakdown of critical infrastructure and even human safety.
Industrial equipment like turbine engines or industrial control systems was never built to evolve the way cybersecurity-savvy IT products do. But now these very OT assets are getting connected to open data networks. The lack of an evolved cybersecurity stack in these “live” assets calls for immediate security countermeasures.
That’s when the spotlight turns to segmentation and the DMZ (also known as a perimeter network) as a plausible rescue.
Network segmentation started in enterprise IT networks using shared media/Ethernet as a means to improve performance and bandwidth. Over time, however, it proved itself as a proactive security tool. Segmentation of a network into separate zones or subnetworks (such as HR, operations and engineering) reduces traffic collision and improves throughput. However, a side and more compelling benefit is that it allows the containment of network traffic within a specific zone. That in itself is a very effective way to prevent malicious traffic from spreading across the enterprise network.
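A minimal sketch of that containment idea, assuming a hypothetical zone plan mapped to subnets. In practice this policy lives in switches and firewalls, not application code; the subnets and zone names here are illustrative:

```python
import ipaddress

# Hypothetical zone plan: each subnet maps to a named zone
ZONES = {
    "hr":          ipaddress.ip_network("10.1.0.0/24"),
    "operations":  ipaddress.ip_network("10.2.0.0/24"),
    "engineering": ipaddress.ip_network("10.3.0.0/24"),
}

def zone_of(ip):
    """Return the zone containing this address, or None if it is outside the plan."""
    addr = ipaddress.ip_address(ip)
    return next((name for name, net in ZONES.items() if addr in net), None)

def allow(src_ip, dst_ip):
    """Contain traffic within its zone: permit only intra-zone flows by default."""
    src, dst = zone_of(src_ip), zone_of(dst_ip)
    return src is not None and src == dst

print(allow("10.1.0.5", "10.1.0.9"))   # intra-zone (hr -> hr): allowed
print(allow("10.1.0.5", "10.2.0.7"))   # cross-zone (hr -> operations): contained
```

Deny-by-default between zones is what keeps malicious traffic from spreading across the enterprise network; legitimate cross-zone flows would be whitelisted explicitly.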
According to ISACA, “a common technique to implement network security is to segment an organization’s network into separate zones that can be separately controlled, monitored and protected.”
The DMZ, until now, was selectively used in industries by defining a few broad perimeter networks. However, with the increased sophistication of cyberattacks, and as traditional factory assets become intelligent and connected, just one line of defense isn’t enough.
In the case of an intrusion, a countermeasure is critical to prevent lateral movement of the malware. As a result, zoning inside corporate IT networks, and even between IT and OT networks, is important.
Now the question is: Are conventional segmentation principles good enough to secure industrial OT assets?
OT and IT segmentation: The dynamics differ
Zoning industrial networks to create multiple perimeters is a promising way to secure OT assets. However, we can’t implement it by simply replicating IT segmentation techniques such as VLANs, routing and firewalls as is. VLANs and routing can become complex very quickly. Implementing IP address and subnet configurations often runs into complexity, requires reconfiguration and demands specific technical skills. This may not be easy to get around in OT environments.
Most IoT devices either have too few computing resources to integrate robust security stacks or were never designed (think of turbine engines, industrial belts) to integrate the security stacks prevalent in IT.
Easy virtual zoning
Large industrial systems are too bulky to move around. In many cases the devices sit in remote locations and are not easily accessible for physical maneuvering. Besides, OT lacks the well-defined software upgrade cycles that IT uses to integrate patches. And most importantly, these systems must run uninterrupted: industrial equipment downtime translates to disruptions which are mostly unacceptable.
That’s why when planning segmentation in OT environments, we must think differently.
A zoning solution needs to be easy and done in a centralized manner without having to move around bulky industrial gear or reengineer existing systems.
Instead of physical segmentation as in the case of IT, in OT environments we need to think of virtual or logical zones, without any physical dependency and simpler user interface through which these zones can be centrally configured and controlled.
Deep packet inspection
A DMZ defines the periphery of a zone and uses firewalls, either standalone or cascaded. Firewalls perform deep packet inspection and traffic filtering. Deep packet inspection and intrusion detection provide the information needed to process incoming traffic.
However, IT firewalls are designed to read IP protocols and as such cannot serve the purpose for many industrial protocols such as Modbus, MTConnect and OPC. To secure OT networks, we need firewalls which in addition to IP can also inspect the packets and extract contextual information to monitor, protect and control OT zones.
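As a rough sketch of what protocol-aware inspection means, the snippet below parses the standard Modbus/TCP MBAP header plus function code and blocks write commands arriving from outside a trusted zone. The frame layout follows the published Modbus/TCP format; the zoning flag and policy strings are illustrative assumptions:

```python
import struct

WRITE_FUNCTIONS = {0x05, 0x06, 0x0F, 0x10}  # Modbus coil/register write function codes

def inspect_modbus(frame, from_trusted_zone):
    """Parse the MBAP header and function code; drop writes from outside the OT zone."""
    if len(frame) < 8:
        return "drop: truncated frame"
    # MBAP header: transaction id, protocol id, length (big-endian u16s), unit id, then function code
    tx_id, proto_id, length, unit_id, function = struct.unpack(">HHHBB", frame[:8])
    if proto_id != 0:                      # MBAP protocol identifier is 0 for Modbus
        return "drop: not Modbus"
    if function in WRITE_FUNCTIONS and not from_trusted_zone:
        return f"drop: write (fn 0x{function:02x}) from untrusted zone"
    return "forward"

# Function 0x06 = Write Single Register, addressed to unit 1
write_frame = struct.pack(">HHHBB", 1, 0, 6, 1, 0x06) + b"\x00\x10\x00\xff"
print(inspect_modbus(write_frame, from_trusted_zone=False))
print(inspect_modbus(write_frame, from_trusted_zone=True))
```

An IP-only firewall would forward both frames identically; it takes this kind of protocol awareness to distinguish a harmless read poll from a command that changes a controller’s state.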
As security technologies in industrial IoT steadily evolve, segmentation can provide considerable relief to security concerns in OT environments — so long as we develop and use the right tools and methodology specific to OT. Simply replicating IT legacy may not help that far.
There is a fourth industrial revolution coming. Following the lean revolution of the 1970s, the outsourcing phenomenon of the 1990s and the automation that took off in the 2000s, cyber-physical systems have become the next frontier.
When it comes to IoT, the convergence of smart connected devices (26 billion by 2020 according to Gartner, 212 billion by 2020 according to IDC) and their role in cyber-physical systems surfaces several important security concerns. These include scalability (the capability of a system, network or process to handle a growing amount of work), pervasiveness (also called ubiquitous computing) and persistence (referring to the characteristic of a computing state that outlives the process that created it). These security issues may also lead to a dramatic increase in threats against a much larger attack surface which many enterprises may not be ready to undertake.
The burning platform
Historically, the edge of a network — for example, internet-connected “things” — was not connected to an IP-based network and was out of the jurisdiction of the CIO. Advances in communication protocols, the miniaturization of electronic devices and the advent of IPv6 have enabled the expansion of the network to these new elements of the enterprise. With this increase also comes an increase in revenue opportunities, but at what cost to enterprise security?
A cyber-physical system can be defined as a large physical system that can be part of a system of systems (SoS) where a distributed set of computing elements interact to control, monitor and manage the exchange of information from machine-to-machine, machine-to-human or to other cyber-physical systems. Characteristics of these cyber-physical systems can include physical distribution of systems, distributed control, supervision and management, subsystem autonomy, dynamic behaviors and reconfigurations as well as continuous evolution of the cyber-physical system itself. Also, as part of this cyber-physical SoS, there may exist within it a number of partially physically or programmatically coupled elements, where some other elements may be able to provide services independently.
The lack of built-in security in cyber-physical systems can result in unauthorized access to services and data, exposure of key enterprise elements, compromise of private data, denial of services, backdoors and malware, as well as loss or damage to critical infrastructure. As an example, the global market for network intrusion detection and prevention equipment and services is estimated at $95 billion and expected to reach $155.7 billion by 2019, while at the same time the role of threat actors such as cybercriminals, nation states, hacktivists, cyberterrorists and insiders continues to accelerate at an unprecedented rate in terms of cybercrime specialization, monetization schemes and tactics, and exploits.
Cyber-physical systems increase the attack surface and, as a result, an important aspect of this new cybersecurity frontier is the need to go beyond confidentiality, integrity and availability to protect cyber assets and extend it to a physical system to provide stability, controllability and observability (MITRE, AFCEA Conference 2016). Stability refers to the ability of a cyber-physical system to provide services within specified criteria. Controllability refers to challenges related to time- and event-driven computing, software, variable time delays, failures, reconfiguration and distributed decision support systems. Observability is used to achieve resilience within a network.
What should I do next?
Organizations should start by evaluating their cyber-physical systems in terms of six control areas: confidentiality, integrity and availability as well as stability, controllability and observability. The impact of this new paradigm will require cyber-risk practitioners to work with the physical systems engineers they support and develop new techniques to monitor and control infrastructure and enable its performance.
Something big is happening, and it is changing everything: how we interact with each other, our products and the companies around us. It’s the data explosion and the internet of things. The hype is real: everything around us is being equipped with sensor technology that sends data to the cloud. By 2020, the world’s data will double every two months.
The internet of things has become the new and shiny toy, so companies are naturally eager to make IoT part of their business in order to create new experiences for their employees, customers and partners. But it is important to emphasize the word “new” here. There is no shortage of impressive, potentially groundbreaking ideas out there, but the technologies are still very new and therefore the solutions are not well defined, requirements are loose and changing, and ideas are hard to prove.
With so few proven use cases out there, it is hard for users and customers to even know what they want. This is why it is important to constantly experiment and get something into the market quickly without spending a lot of time and money. With this nascent market come uncertainty and the need to adapt quickly to changing conditions and transform your business in different ways.
For a business to adopt IoT it will mean creating entirely new services and business models. Let’s say you have an idea for a digital project that would create a new revenue stream, or a better way to engage with your customers that would differentiate your company. Because these ideas represent new and uncharted territory, turning them into value-driving applications requires frequent iteration and close collaboration between developers and the business.
Enable rapid experimentation
In order to adopt a process of development that allows for this rapid, low-cost experimentation, companies need to approach IoT projects with a willingness to fail often in order to figure out how to succeed sooner. Thomas Edison didn’t invent the lightbulb with a brilliant eureka moment that all of a sudden got him to the right answer. Instead, he was an experimenter. In his eyes he thought, “I have not failed, I’ve just found 10,000 ways that won’t work.”
And Google, one of the most successful and innovative companies in the world, has a failure rate of 95%, with the idea that you only need to succeed once in order to win. According to former Google CEO Eric Schmidt, “This is a company where it’s absolutely okay to try something that’s very hard, have it not be successful, and take the learning from that.”
With that idea in mind, companies need the right set of tools and processes to foster this low-cost, high-value experimentation.
Some best practices for fostering rapid experimentation in your organization include:
- Allocate time and resources for this type of experimentation to prove to your organization that this new approach can work, and then scale it widely as a new mindset.
- Create cross-functional teams that include the business and IT. Bring together a person with an idea and someone with the technical aptitude to bring it to life.
- Use visual, model-driven development in order to create a common language between business and IT to allow for faster experimentation and greater collaboration.
- Create a feedback loop. It is important to have a mechanism to continuously capture feedback from users that you can take back into the process for continuous innovation.
- Test a minimum viable product (MVP) early in the process to ensure the ability to change direction with minimal risk based on what you learn.
From idea to reality in just one week
A media and entertainment firm has adopted these practices to rapidly bring its ideas to market with an IoT app. Many of the firm’s clients were forced to turn away attendees at the gates of their free events due to overcapacity, with no way to account for how many people had left the venue. The firm had the idea to create a solution that would leverage sensors in turnstiles to visualize attendee traffic in real time.
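The core of such a solution can be sketched in a few lines. The event format and capacity figure below are hypothetical, not the firm’s actual design:

```python
class VenueCounter:
    """Track live occupancy from turnstile entry/exit events (hypothetical event format)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.occupancy = 0

    def on_event(self, event):
        # event: {"turnstile": "gate-3", "direction": "in" | "out"}
        if event["direction"] == "in":
            self.occupancy += 1
        elif event["direction"] == "out":
            self.occupancy = max(0, self.occupancy - 1)

    def gate_open(self):
        # Admit more attendees whenever exits have freed up capacity
        return self.occupancy < self.capacity

venue = VenueCounter(capacity=500)
for e in [{"turnstile": "gate-1", "direction": "in"}] * 500:
    venue.on_event(e)
print(venue.gate_open())          # at capacity: hold the gate
venue.on_event({"turnstile": "gate-2", "direction": "out"})
print(venue.gate_open())          # one attendee left: admit another
```

Counting exits as well as entries is the whole insight: the gate reopens the moment someone leaves, instead of staying shut once the entry count hits capacity.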
With constant ideation and experimentation, the firm was able to quickly make this idea a reality and present the results within a week of starting. Thanks to its newfound collaboration between business and IT and its willingness to experiment and try again, the company was able to validate its idea quickly, then continue to iterate and test frequently to evolve how it uses the incoming data, which has resulted in new business cases.
For example, the firm found that the data from these sensors could also help clients understand how many people enter the venue, in order to optimize how much food and beverage to provide the following night, and could support security by scanning credentials to ensure that all event staff are legitimate.
The entertainment firm can now help clients optimize, secure and enhance the experience of their events.
Remember: It is important to get these ideas out there quickly in order to validate or invalidate them. Don’t get stuck in the “it must be perfect” mindset. Instead, build and deploy an MVP and continue to iterate with feedback from clients, customers, employees and partners to create the right new experiences.
While it is early days for 5G, one thing is clear: security and privacy will remain fundamental requirements, with the changes foreseen for 5G likely to broaden the range of attractive attack targets. We believe that massive IoT, one of the main application segments for 5G identified by standardization body 3GPP, is no exception.
The massive IoT segment will be extremely broad, covering not just M2M, but consumer-based services too. It is likely to consist of an ecosystem of potentially low-cost devices such as sensors and trackers as well as high-end home appliances, toys, some wearables, meters and alarms.
Device connectivity requirements will vary by use case. Some will require permanent connections, while others will only connect occasionally. Some devices may connect directly to a 3GPP network, whereas others will connect via a relay or gateway, using either a 3GPP or non-3GPP network. Some devices will be static, whereas others will move and will need the ability to manage network handover securely.
Data is likely to encompass geolocation data, sensor data such as meter readings, and private consumer data. Location and privacy protection for data must be enforced. For example, data from meters must not allow thieves to know if premises are occupied or not. In addition, much value in IoT comes from the integrity of the data so integrity protection is also vitally important.
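Integrity protection of this kind can be as simple as a keyed MAC attached to each reading. The sketch below uses HMAC-SHA256 from the Python standard library; the key handling and message format are illustrative assumptions, not a production design (in practice the per-device key would live in secure hardware, never in source code):

```python
import hashlib
import hmac
import json

DEVICE_KEY = b"hypothetical-per-device-secret"   # provisioned securely in a real deployment

def sign_reading(reading: dict) -> dict:
    """Attach an HMAC tag so tampering in transit is detectable."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return {"reading": reading, "mac": tag}

def verify_reading(message: dict) -> bool:
    payload = json.dumps(message["reading"], sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["mac"])

msg = sign_reading({"meter": "m-42", "kwh": 18.4, "ts": 1696000000})
print(verify_reading(msg))          # untouched reading verifies
msg["reading"]["kwh"] = 0.0         # tampering in transit...
print(verify_reading(msg))          # ...is caught by the MAC check
```

Note that a MAC protects integrity but not confidentiality; the occupancy-privacy concern above additionally requires encrypting the readings.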
Threats may include data manipulation, use of low-cost endpoints for entry into the network, rogue devices, equipment cloning and denial of service. Another major threat comes from suppliers themselves failing to invest in security for low-cost devices.
Means of mitigating these threats include secure provisioning, secure remote administration and configuration, authentication and data integrity measures. It’s also vitally important to understand that security should be proportionate to the value of the data being stored or in transit and the risk of compromise rather than to device cost.
Managing initial network connectivity securely will require secure provisioning of unique device and user identities for both network- and service-level access, network and service authentication credentials, and communication cryptographic keys as well as application identifiers.
Managing identities on the network will require identification of the application and corresponding application provider. It will also need secure storage of the unique identity on the device.
Mutual authentication of the device and network will also be necessary (it has been mandatory since 3G) as may mutual authentication for applications back to their service platforms.
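To make the idea of mutual authentication concrete, here is a minimal symmetric-key challenge-response sketch in Python. It is a simplified teaching model, not the actual 3GPP AKA protocol: the shared key, the 16-byte challenges and the HMAC-SHA-256 construction are all assumptions made for illustration.

```python
import hashlib
import hmac
import secrets

class Party:
    """Either endpoint (device or network) holding a provisioned shared key."""

    def __init__(self, shared_key: bytes):
        self._key = shared_key

    def challenge(self) -> bytes:
        # A fresh random nonce prevents replay of old responses.
        return secrets.token_bytes(16)

    def respond(self, challenge: bytes) -> bytes:
        # Prove knowledge of the key without ever revealing it.
        return hmac.new(self._key, challenge, hashlib.sha256).digest()

    def verify(self, challenge: bytes, response: bytes) -> bool:
        return hmac.compare_digest(response, self.respond(challenge))

def mutual_authenticate(device: Party, network: Party) -> bool:
    # The network authenticates the device...
    c1 = network.challenge()
    if not network.verify(c1, device.respond(c1)):
        return False
    # ...and the device authenticates the network back,
    # so a rogue base station cannot impersonate it.
    c2 = device.challenge()
    return device.verify(c2, network.respond(c2))
```

Both directions must succeed: a device that only authenticates itself to the network remains open to rogue-equipment attacks of the kind listed above.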
Alongside the other major classes of 5G use case (critical communications, enhanced mobile broadband and V2X), massive IoT poses a range of security challenges and requirements. SIMalliance believes it is crucial that security is built into 5G from the outset, and has recently published An Analysis of the Security Needs of the 5G Market, outlining its view of the security needs of each 5G segment.
SIMalliance will publish a follow-up technical security requirements paper in late Q3 2016. Further articles from SIMalliance on TechTarget will draw on findings from that paper.
We hear so much about the potential of the internet of things every day in the news. For the most part, the examples and discussion are focused on consumer trends and products, but in the real world, enterprise and industrial IoT will be where the cool stuff happens … here’s why.
Our fascination with consumer products is pretty straightforward — it’s easy to identify with things we use every day at home and in our lives. This has been useful to get people educated and excited about IoT, but we need to move the discussion beyond the basic example of a coffee maker that will brew you a fresh cup of Joe when a sensor tells it you’re awake. Yes, that’s possible and there’s value in automating things that are a little tedious, but how much would you really be willing to pay to avoid pressing a button and waiting five minutes? We need to think bigger to understand where the real value of IoT will come into play.
There’s lots of market research out there and it all points to IoT being big … really big. Consensus says there will be around 20 billion devices by 2020, up from around 10 billion right now. Gartner’s recent forecast says we will add about 5.5 million connected things each day!
But how will all those connected things be split between consumer and enterprise IoT? Some forecasts, like that from BI Intelligence, expect enterprise IoT to far outpace consumer while others see consumer as the force driving the number of connected devices.
Looking at historical technology adoption shows how rapidly things are changing. While it took 50 years for most of the population to adopt electricity, modern trends like smartphone adoption show full adoption in a five- to 10-year span. Older technologies mostly began as something available only to industry and later became accessible to consumers. With today’s near-vertical adoption curves, however, that distinction no longer holds and new technologies are almost immediately available to both groups.
Consumers and the firms serving them, at least right now, see the primary benefit of IoT as making life easier (even if some products fall short of that). While many products and services do incrementally increase efficiency or save some money, the primary benefit is that life is a little easier and the tech is really cool. The question, however, is what we are willing to pay for this benefit. It clearly isn’t a must-have, although over time we may come to see it that way.
Enterprises, on the other hand, see the primary benefit of IoT adoption as a positive ROI (or NPV, if you prefer). This can take the form of efficiency gains, cost reductions or even opening up new business models that were previously unattainable. When automating a process with IoT could save even 1%, which for large operations can equate to billions of dollars, it’s much easier for an enterprise to take a calculated risk and pay to implement IoT now. Certain industries are already investing.
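To make the ROI/NPV point concrete, here is a back-of-the-envelope calculation. Every figure is invented for illustration: a 1% saving on a $2 billion annual process, a $30 million deployment cost and a 10% discount rate over five years.

```python
def npv(rate: float, cashflows: list[float]) -> float:
    """Net present value: cashflows[0] is the upfront (year-0) amount."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical numbers: a $2B annual process with 1% automated away,
# against a $30M IoT deployment cost, discounted at 10%.
annual_saving = 2_000_000_000 * 0.01      # $20M per year
deployment_cost = -30_000_000
flows = [deployment_cost] + [annual_saving] * 5

print(round(npv(0.10, flows)))  # comfortably positive, i.e., worth doing
```

Even with these made-up numbers, the project clears its cost within the first two discounted years, which is why a 1% efficiency gain is enough to move an enterprise to act.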
In industries like energy and agriculture, we have seen significant interest and action already. In solar energy, monitoring assets like solar panels, inverters and more to maximize uptime and performance is critical. If one panel is performing poorly, it can affect many others, and poor performance on a sunny day can be costly. Solar developers are already implementing IoT solutions for asset management that can overcome many of the challenges faced in the past like low power requirements, intermittent connectivity, real-time data needs and more. The need is real and therefore these enterprises are willing to invest in both the development of these technologies and their implementation.
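A minimal sketch of the kind of asset-monitoring logic described above: flag any panel whose output falls well below its peers so a technician can be dispatched before a sunny day is wasted. The 20% threshold, panel IDs and reading format are assumptions made for the example.

```python
def underperforming_panels(readings: dict[str, float],
                           threshold: float = 0.8) -> list[str]:
    """Flag panels producing less than `threshold` times the fleet median."""
    outputs = sorted(readings.values())
    median = outputs[len(outputs) // 2]
    return [panel for panel, watts in readings.items()
            if watts < threshold * median]

# Example: one panel lagging on an otherwise sunny day.
readings = {"A1": 310.0, "A2": 305.0, "A3": 180.0, "A4": 298.0}
print(underperforming_panels(readings))  # -> ['A3']
```

Comparing against the fleet median rather than a fixed wattage means cloudy days don’t trigger false alarms: the whole array drops together, and only a genuine outlier gets flagged.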
Agriculture is also looking to make positive ROI investments by implementing IoT. Companies have looked to optimize how they use resources like water for some time by monitoring crop yields, soil moisture and more. IoT can take this concept to the next level by enabling a flexible (and more importantly cost-effective) platform that can handle many data streams from sensors in the soil, silos and tractors, weather feeds, pricing signals, supply chain details and more. As more information is brought together, the system becomes more valuable and a farm can make better decisions. Each farm will be able to pick and choose which connected services make sense and layer different offerings from different vendors for key advantages including:
- Operate more efficiently: Reduce fuel usage by optimizing speed, route and dispatch timing of harvesters based on real-time orders and needs
- Open new revenue streams: Provide detailed fertilizer, water and environmental information for each piece of food to conscientious consumers
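The value of bringing those data streams together can be sketched in a few lines: combine a soil-moisture reading with a weather feed to decide whether to irrigate at all. The field names, thresholds and forecast format here are invented for illustration, not drawn from any particular platform.

```python
def should_irrigate(soil_moisture_pct: float,
                    rain_forecast_mm: float,
                    moisture_target_pct: float = 35.0,
                    expected_rain_mm: float = 5.0) -> bool:
    """Irrigate only if the soil is dry AND no meaningful rain is coming."""
    soil_dry = soil_moisture_pct < moisture_target_pct
    rain_coming = rain_forecast_mm >= expected_rain_mm
    return soil_dry and not rain_coming

# Dry soil, no rain expected -> irrigate now.
print(should_irrigate(soil_moisture_pct=22.0, rain_forecast_mm=0.5))   # True
# Dry soil, but a storm is due -> let the weather do the work.
print(should_irrigate(soil_moisture_pct=22.0, rain_forecast_mm=12.0))  # False
```

Neither stream alone is enough: the soil sensor would irrigate ahead of a storm, and the weather feed alone can’t see how dry the field already is. That is the point about the system becoming more valuable as more information is brought together.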
The success of enterprise IoT will be driven by the technical innovations and services that enable it. Tools like a scalable, secure, real-time IoT platform, low-power/low-cost hardware to power devices and modern networks to transmit data are the building blocks for both enterprise and consumer IoT to change how we operate and live.
In the big picture, the technology is ready and the need is there to drive IoT forward. While consumer services will certainly improve quality of life, enterprise IoT will face down some of the biggest challenges facing the world today:
- How are we going to feed 10 billion people?
- How do we improve the uptime of renewable energy while reducing its OPEX?
- How are we going to manage the future’s most valuable resource — water?
Ultimately, both enterprise and consumer IoT will impact our world in ways we can’t even imagine now, but it is enterprise IoT that will be the next big thing!
We’ve been thinking a lot about the stages that an IoT innovation goes through, from the light bulb over the head moment through to (hopefully) mass deployment. We’ve grouped these into five steps, each presenting a set of challenges and considerations about IoT connectivity. The decisions you make at each point about connecting the “thing” to the internet will intensify as you scale.
We’ve summarized in the image below the five key stages of an IoT project and the IoT connectivity considerations (Wi-Fi, LoRa, Satellite, LTE, NB-IoT, etc.) you’ll need to make along the way.
Every IoT project starts with an idea, not a purchase order for $100,000. Today, makers have never had it so good for quickly turning those concepts and “what-ifs” into something real. In many cases, a Raspberry Pi or Arduino will do the job fine. Connectivity also isn’t a problem at this stage. You could even use a physical cable to link the sensor to an IoT gateway such as the Dell Edge Gateway, although Wi-Fi is the more likely option.
In my time in the industry, rapid prototyping has become easier and cheaper. It used to be very expensive and a big commitment to create early-stage proof-of-concept devices; now, you can get a Chinese manufacturer to build in volumes of one. But once you bring the proof of concept from your lab to the field — whether that’s to an actual farm or to an investor meeting — can you really rely on the Wi-Fi signal? Depending on the environment, you might think cellular connectivity is your best option, so traditionally, many makers have gone to their nearest mobile operator to buy a cheap SIM card.
Here’s where the questions start to come thick and fast, however. Once you have an idea of what you’re trying to build, will any of the connectivity options you adopted at the earlier stage still be with you as you refine your original idea? Will you be able to source SIMs reliably at higher volume? Will the cost model of buying retail SIMs work with a 500-unit-a-month deployment? This step is where you’ll encounter issues like security, authentication or authorization on your connectivity. Can your chosen connectivity handle these issues reliably?
By design, IoT devices are constrained — a term with a specific meaning, implying limited processing power, storage and bandwidth. Where you plan to physically deploy will be a factor: if you aim to put a miniature computer on every streetlight, it isn’t practical to visit every one of them every Tuesday to install a patch. And if you fit a sensor in the road that is then embedded under concrete, you only get one chance to get it right.
So, what type of connectivity are you going to embed in your service as you deploy it? This isn’t a trivial question. If you’re serious about building an IoT innovation to operate on a global scale, it makes sense to think about IoT connectivity from the very start. Otherwise, you’re stuck on the treadmill, continually solving the same problems over and over again as you progress through the five steps as outlined here. SIMs from the EU will not roam in Saudi Arabia, and a U.S. SIM won’t necessarily work in France. In some of the biggest growth markets, Saudi Arabia, Turkey and Brazil, governments don’t permit global or permanent roaming. We’ve seen similar restrictions in places like Cambodia, Indonesia and Myanmar. So if you want to ship your “thing” all over the world, you’d better be sure at the idea, prototype and iterate stages that you’ve built in connectivity that works for your global application.
The assumption we’ve made all along is that connectivity is hard. Simply putting your devices on Wi-Fi doesn’t give you guaranteed service, quite apart from the fact that connecting constrained or low-power devices to the public internet is not a good idea. As we like to say, there’s nothing quite as complicated or dangerous as simple connectivity.
Fortunately, technological advancements have come to our rescue yet again. In the old days, you had to pay in advance for technological capacity that you guessed you might need to use months from now. Cloud computing services like Salesforce and, later, Amazon Web Services or Microsoft Azure completely upended that model, giving enterprise-class software tools to smaller companies who only had to pay for what they used.
That same model now applies to connectivity, to let you link services and “things” over the internet at industrial scale but at volumes of one (idea), so you can start small and then flex your network as you go through the subsequent steps (prototype, iterate, deploy and scale).
When considering what technology will let your IoT project scale its network connectivity, look to the models developed for infrastructure as a service and computing elasticity. Look for technologies that can automate the “declaration” of an IoT network topology via simple API calls and that remove manual, time-consuming, bespoke steps. Consider technologies that can aggregate different bearer networks, so you can mix and match multiple connectivity types and avoid lock-in to a single communication service provider.
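As an illustration of “declaring” a network topology through an API rather than through bespoke manual provisioning, the sketch below builds the kind of JSON payload such a call might carry. The endpoint shape, field names and bearer identifiers are hypothetical, not any real provider’s API.

```python
import json

def declare_topology(project: str, regions: list[str],
                     bearers: list[str]) -> str:
    """Build a (hypothetical) topology declaration for a single API call."""
    payload = {
        "project": project,
        "regions": regions,
        # Mixing bearer networks avoids lock-in to one connectivity type.
        "bearers": bearers,
        "auth": {"mutual_tls": True},
    }
    return json.dumps(payload)

# One POST of this body would stand in for the manual provisioning steps:
body = declare_topology("smart-metering", ["eu-west", "sa-east"],
                        ["lte", "nb-iot", "wifi"])
print(body)
```

The point is the shape of the interaction, not the fields: your whole connectivity footprint becomes a declarative document you can version, review and replay as you move through prototype, iterate, deploy and scale.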
For more information on this topic read our ebook on IoT connectivity.