The low-cost, low-power nature of the internet of things can cause security considerations to be deferred until much later in the product implementation stage, or sometimes, indefinitely. However, to realize IoT's true potential, it is critical that security be considered at the very early stages, and even take pole position in some applications. Luckily, there are some simple frameworks we can follow to improve the security mindset and build a secure foundation for the IoT system from the start.
First, be paranoid.
Due to the sheer number of moving parts in an IoT system, there are numerous vectors that hackers can use to attack. A healthy dose of paranoia during the examination of the entire IoT system can ensure that the system is thoroughly explored and that appropriate security controls are selected. However, IoT security, like security for any other system, should be driven by business risk. Thus, things like privacy laws, compliance and business value should all balance the paranoia with pragmatism.
Most IoT devices are valued in terms of the nature of their function, e.g. temperature, telemetry, video stream. Since the data is often used upstream for higher-order decision-making, there is an implicit value that affects the outcome of the system at large. For example, temperature from IoT devices might drive the fire suppression systems, or facial recognition from video feeds may alert physical security services.
Consequently, the takeover of the IoT device could be used to conduct data integrity attacks that change the behavior of the upstream system(s). For instance, information regarding the energy consumption from your home or business to your utility provider could be altered in order to manipulate billing, services and status. All threats of this nature jeopardize the trust in the information being transmitted and ultimately in the overall infrastructure.
Attacks against device manufacturers, cloud service providers and IoT solution providers have the potential to inflict the widest degree of harm. These parties will be entrusted with large amounts of data, some of it highly sensitive in nature. This data also has value to the IoT providers because of the analytics it enables, which represents a core, strategic business asset — and a significant competitive vulnerability if exposed.
Second, use a persona-based approach.
IoT systems are made up of many vendors, each one focusing on its core strength, whether it’s manufacturing, connectivity, data warehousing, analytics or some other function. Thus, security in the IoT system can only be achieved through collaboration of all the vendors, but remains the responsibility of the system provider. An effective way to design and review security needs of the system is by borrowing the persona-based approach used in product development. This involves identifying every distinct persona involved in an IoT system — including the buyer, the device manufacturer, the cloud provider, developers and other vendors — and then analyzing all interactions between these parties and installing relevant security controls. These controls should be driven by a defense-in-depth strategy, while minimizing friction, especially for human personas. Any friction imposed by security controls motivates humans to find ways to circumvent them (all in the name of productivity and efficiency), so this process can help ensure that security controls are applied that maximize both security and empathy for the user.
Each persona should be assigned an appropriate level of authentication for their digital identities, for example, certificates assigned to machines and multifactor authentication for humans. The level of assurance of authenticity should be directly proportional to the business value of the data the persona can access. Further authorized access to the data should be based on the principle of least privilege, so any given persona is able to access only the data that it needs to deliver or consume business value.
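As a concrete illustration, least-privilege access can be as simple as an explicit allow list per persona: a persona gets exactly the actions it needs and nothing else. The persona names and permission strings below are invented for this sketch, not taken from any real system:

```python
# Hypothetical least-privilege mapping of IoT personas to allowed actions.
# A persona may only perform actions explicitly on its allow list.
PERMISSIONS = {
    "device":        {"telemetry:write"},                        # machine persona
    "analyst":       {"telemetry:read"},                         # human persona
    "administrator": {"telemetry:read", "telemetry:write", "config:write"},
}

def is_allowed(persona: str, action: str) -> bool:
    """Deny by default: unknown personas and unlisted actions are refused."""
    return action in PERMISSIONS.get(persona, set())
```

Because the check denies by default, adding a new persona forces an explicit decision about what it may access, which is exactly the least-privilege discipline described above.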
All data related to the system should be categorized as well, and then given the appropriate level of protection it requires. Encryption, paired with integrity checks, should be used on all sensitive data and communications to maintain the highest levels of confidentiality and integrity.
Finally, special attention must be paid to the basics — network monitoring, vulnerability patching, the use of tamper detection for the devices and code signing to validate what they’re doing.
Applying a healthy level of paranoia and preparing for the worst, from the very beginning, is key to building a secure foundation for IoT.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.
It’s a fact: First responders are superheroes.
An AT&T survey* found that more than two-thirds of consumers think of firefighters, police officers and EMTs as everyday heroes. These brave men and women are there when we need them. And across every level of the company, we believe it’s our responsibility to support the first responders who protect us.
Earlier this year, all 50 U.S. states, five territories and the District of Columbia opted in to FirstNet — the first nationwide communications platform dedicated to public safety, built with and for first responders across the country. FirstNet presents the unique opportunity to bridge public safety's capabilities with the internet of things to create smarter cities and smarter responses.
Here are four ways that IoT is helping support public safety — today and in the near future.
Public safety stands to benefit in a big way as cities and municipalities become more tech-driven. Take smart lighting, for example. Today, we can retrofit existing streetlights and use IoT to help cities monitor traffic and road conditions. This helps cities identify and proactively manage repairs, road closures and maintenance needs to keep roads safer.
If we extend smart lighting for public safety, video sensors on streetlights could help firefighters monitor traffic patterns and choose the fastest route to put out a house fire. As a second example, decibel sensors on streetlights could help police officers detect gunshots when and where they happen.
Recreational drones are sources of entertainment. Delivery drones are factors of convenience. Both are purposeful uses of drone technology. But drones for public safety can make a meaningful difference — they can save lives.
Imagine if paramedics could see what was happening at a car accident before they arrived. The potential to deploy a connected drone and collect information from the scene of an emergency could help EMS personnel make key decisions and jumpstart their response.
Or, think of the potential for firefighters using a connected drone as they battle the blaze of a rapidly moving wildfire. As conditions quickly change, firefighters on the front line could still have access to the information they need in near real time from a camera-equipped drone.
Fleet tracking and management systems are a technology mainstay for corporations and enterprises. Today, these same technologies are rolling out to first responders across the nation, so they can see the status of vehicles in the field like never before.
This technology details location, destination, speed and engine diagnostics. It also allows for geofencing and customizable exception reporting. Take a crime scene, for example, or a duty station. Geofencing can give incident commanders better visibility into available resources or help create a log of who goes in and out of an area.
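A minimal geofence check is just a distance test against the fence's center point. The sketch below is illustrative only (the coordinates and radius are made up); it uses the haversine formula to decide whether a vehicle's reported position falls inside a circular fence around, say, a crime scene:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(vehicle_pos, fence_center, radius_m):
    """True if the vehicle's (lat, lon) position is within the circular fence."""
    return haversine_m(vehicle_pos[0], vehicle_pos[1],
                       fence_center[0], fence_center[1]) <= radius_m
```

Logging each transition in or out of the fence is then enough to build the kind of entry/exit record an incident commander would want.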
Additionally, a fully integrated push-to-talk feature gives first responders a built-in communications platform. This means fleet managers and dispatch centers can speak directly with field operators and first responders. In the event of an emergency, a dispatcher can visually identify assets in the field instead of putting out a generalized request over the radio, which is what happens today.
Wearable cameras can provide “see-what-I-see” capabilities to support search and rescue. And along with cameras, wearable sensors can feed data to incident commanders and first responders.
For example, cardiac arrest is a serious risk for firefighters. Monitoring their health data can trigger early warning alerts to help keep them safe. If a firefighter's heart rate spikes, it could trigger the wearable camera to auto-stream their current conditions. This would let command centers evaluate the need for backup and medical support.
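A trigger like that could be as simple as a threshold against the firefighter's baseline heart rate. The ratio below is a made-up example for illustration, not a medical guideline:

```python
def should_stream(heart_rate_bpm: float, baseline_bpm: float, ratio: float = 1.5) -> bool:
    """Hypothetical trigger: start the wearable camera's auto-stream when the
    measured heart rate exceeds 150% of the wearer's baseline."""
    return heart_rate_bpm > baseline_bpm * ratio
```

A production system would smooth readings over a window and account for exertion, but the decision point reduces to a comparison like this one.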
And as these capabilities are brought onto the FirstNet platform, first responders can be confident they have highly secure, reliable access to near real-time data and video feeds.
Bringing it all together
Connected infrastructure, vehicles, drones and gear can greatly assist first responders. Together, these IoT technologies are helping build smarter cities for smarter responses in emergency situations.
We are connecting first responders to the world around them like never before. And we’ve only just scratched the surface of possibilities as we empower first responders with the tools and technology needed to respond to incidents more quickly, safely and effectively.
Why do we keep pushing? Because of the impact on outcomes, the potential to keep first responders out of harm’s way and for more lives saved. This motivates me and my team. It inspires innovation. And it serves as a rallying cry across AT&T to help the (super)heroes who help us.
*Source: AT&T and Kantar Added Value survey of 4,475 mobile users, June 1-30, 2017
As the hardware components for IoT continue to become smaller, cheaper and more accessible, the power of this technology will continue to creep into new areas of retail. Its impact is already evident in the retail space, but more so with larger assets of the retail supply chain — think warehouses, manufacturing facilities and logistics. But what about IoT’s downstream impact on aspects such as merchandise, stores and the consumer? IoT components are becoming more integrated into retail products, so what should we expect moving forward?
More connected products mean less shrinkage
Retailers have continued to deal with the issue of losing merchandise through theft, both from external sources and from their own employees. U.S. retail suffered $48.9 billion in shrinkage throughout 2016, according to a recent study by the National Retail Federation (NRF). The reasons for these losses include theft, as mentioned, but also paperwork inefficiencies and fraudulent returns, a growing driver behind retail's struggles. So, how could IoT play a role in reducing shrinkage? Think smarter inventory that allows greater track and trace, as well as less confusion regarding the purchasing data associated with products. If the inventory could speak and stand up for itself, it could reduce the illegal activity associated with those products.
More connected inventory means better store shelf management
When the products that are stocked are smarter, our shelves and backrooms will become more organized too. Think better product assortments. This is becoming increasingly important as retailers go through a transformation of their physical stores. The notion of “stack it high and watch it fly” no longer applies in today’s retail landscape. Retailers ranging from Nordstrom to Target are rethinking a variety of aspects of their physical stores. Being as smart as possible about product assortment is vital, as is using real-time, rich data coming from connected inventory, which can provide retailers with a huge source of information. Which products are being taken from the shelf faster than expected? Which ones are lingering? Can employees do real-time assortment testing and get instant feedback? With smart inventory, yes.
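With pick events streaming from connected shelves, "faster than expected" and "lingering" become simple computations against a forecast sell-through rate. The function names and thresholds below are invented for this sketch:

```python
def shelf_velocity(pick_timestamps, window_hours: float) -> float:
    """Units picked per hour over the observation window."""
    return len(pick_timestamps) / window_hours

def classify_sku(velocity: float, expected_per_hour: float) -> str:
    """Label a SKU relative to its forecast rate (bands are illustrative)."""
    if velocity > expected_per_hour * 1.25:
        return "faster than expected"
    if velocity < expected_per_hour * 0.75:
        return "lingering"
    return "on plan"
```

Feeding these labels back to associates in near real time is what makes the instant assortment testing described above possible.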
More sensors, better connected stores and employees
As stores and their inventory become smarter and more efficient, a key aspect of the store will be freed up to focus on the most important role: customer service. Currently, store associates spend a disproportionate amount of time dealing with inventory rather than customers. But, as NRF reports, most retailers are looking for store associates who are truly skilled in sales. Greater connectivity of the store can bring incremental improvements, automating the mundane activities and freeing up store associates for value-added customer service.
As IoT becomes more affordable, we are seeing its power impact more aspects of the retail landscape. Retailers need to think about how IoT can help improve daily activities, such as basic inventory counts. Once they target and address some of the fundamental daily issues for retailers, employees can focus on the other activities necessary for a successful store. Connected inventory at the store level will bubble up information and insights to the greater retail supply chain. The pieces of the puzzle are getting smarter and allowing retailers to take a much better network view of their business.
Before long, IoT will not be a luxurious add-on, but rather a business necessity. As the retail landscape continues to populate with new, inventive forms of shopping, all retailers, from the traditional brick and mortar to those strictly focused on e-commerce, will use IoT technologies to stay agile and competitive.
Just 10 years ago, the idea of an interconnected world seemed like a bit of near-future fiction. But the convergence of a number of technologies — including Bluetooth communication, smartphone availability, analytics and machine learning, and cloud-based servers — has brought this vision close to reality. Consider the following example: At 3:00, your smartwatch beeps. It tells you that you need to stand up and drink some water. You don’t have anything nearby, so you go to the corporate vending machine and flash your Apple Pay on your watch to make the transaction. After a few minutes, you get a ping on your phone that your camera has detected movement on the front porch. However, you also get a text message from UPS saying that a package has arrived, so you don’t feel a need to check.
Since you want to get to that package, you decide to leave work early. Your smartphone’s AI uses data combining GPS location at the parking lot and time of day to determine that you are heading home, so it auto-loads traffic and weather for the drive home via a map program. When you hit the road, you check your home’s thermostat and sync your favorite podcast with your car to listen to on the ride home. As you sit in traffic, you look at your smart refrigerator’s contents and see that you’re low in bread, so you schedule a small grocery delivery. And when you get home, you ask your smart home assistant (for example, Siri or Alexa) to turn on your favorite sports team as you unwind.
Staying secure in a connected world
This example has become the norm for many of us, and the addition of so many data-based checkpoints and decisions has been gradual enough that it’s hard to understand how far things have come in the past 10 years. But when broken down in a simple workflow of actions taking place over an hour or two, it’s clear to see just how many times technology is used to communicate information or make a choice. Every time devices connect, take an input or process an output represents a potential security hazard.
What’s at stake? It’s not a robot apocalypse, or even being stalked by drones and Roombas as seen in a recently acclaimed episode of The X-Files. Instead, the most realistic nightmare scenario is more in line with what has taken over headlines: compromised data and user privacy. Even when the iPhone first appeared in summer 2007, it didn’t represent the vulnerability that today’s technology cluster does. It wasn’t until the gradual integration of payments, identity and accounts into the smartphone that it became an identity risk.
When those details began residing in the cloud, that meant that they were also transmitted over data networks rather than on pen and paper. Thus, any device that connected to those accounts represented a way in for hackers. Consider your Google or Apple ID as the center of your digital identity, with spokes connected to each registered device — and in many cases, interconnecting paths between those devices as well. Then you add in third-party devices, from payment readers to QR scanners and more. The network is vast — and every device requires the latest updates to ensure the maximum level of security. The question remains: Is that enough?
How secure are updates?
The consumer’s data is at risk, but they are not the ones in control of security surrounding their data. Updating apps, operating systems and devices is a good first step on their end, but a virus or malware can easily circumvent that. Thus, a true solution to such an issue lies more with the platform provider rather than the consumer. In this case, if platforms move their data to a blockchain, then it’s possible to ensure that updates are processed while maintaining security.
Using a blockchain, every data point, critical piece of information or snapshot can be hashed and added to the chain — thus verifying it against the device itself. The size and frequency of the hash depend on the granularity of the data. The hash of the updated files verifies the accuracy and authenticity of the download. Building the platform around security ensures that nothing gets hijacked between point A and point B; it's the digital version of driving an armored car down a protected street.
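A minimal sketch of the verification step, assuming the release digest has already been written to the ledger by the platform provider (the function names are illustrative, not a real blockchain API):

```python
import hashlib
import hmac

def sha256_of(payload: bytes) -> str:
    """Digest of an update payload; in the scheme described above, this value
    would be recorded on the ledger at release time."""
    return hashlib.sha256(payload).hexdigest()

def verify_update(payload: bytes, ledger_digest: str) -> bool:
    """Recompute the hash on the device and compare it to the ledger entry.
    A mismatch flags tampering, though not the specific cause."""
    return hmac.compare_digest(sha256_of(payload), ledger_digest)
```

The constant-time comparison (`hmac.compare_digest`) is a small hardening detail; the core idea is simply that the device trusts the ledger's digest, not the download channel.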
This doesn't absolve the tech industry of security holes. First, developers are still responsible for identifying risks and rolling out regular patches. Second, a hash discrepancy doesn't identify a specific issue; it only flags that an issue exists.
One giant leap ahead in cybersecurity
However, despite these caveats, blockchain would still represent a leap forward for cybersecurity. By sealing off the easiest avenues of attack, blockchain closes a significant number of paths for stealing data — and as the internet of things era continues to proliferate, every new device added to the network represents another way in. Consumers can only do so much, and in order to secure their trust — and the trust of private businesses developing devices and apps and public entities providing the infrastructure for such things — data must be shown to be as secure as possible. With blockchain currently being tested for voting, medical records and government services, the only question is when companies' data-integrity pain points will become acute enough for blockchain to be the production solution.
Thirty years ago, we did not have cell phones. The most advanced people were beginning to get “car phones.” Those were so cool. They drew from the car’s power supply and had the little antenna on the back of the car that showed everyone how impressive you really were. Most communications were from landlines. And somewhere around that time, people began to use pagers. BlackBerrys became the pager of choice, offering a small keyboard and what was basically text messaging. What a futuristic world it had become!
Today the world looks entirely different. More and more people are abandoning their landlines altogether. The advances in technology have changed the world. We see world events streaming live on Facebook as they unfold. We have news of many varieties available 24 hours a day. And increasingly, we have the convenience of augmenting many aspects of our lives, at home and work, through the internet of things. The advances run far and wide, and the machines themselves are interacting with one another, often on our behalf. That can be a great thing, but it can also be a nightmare.

With all this technology that has evolved over the years, we have also seen the emergence, and increasing effectiveness, of bad actors using these tools for malicious purposes. That link that looks harmless might just be downloading malware to your computer. The guy intently staring at his Mac in the coffee shop might not be working on a report; he might be hacking you over Wi-Fi. With IoT, the attack surface expands dramatically, as we have now seen through tire valves, TVs and even thermometers in aquariums. But there is no sign of the progression toward the machine economy slowing. The level of instrumentation and augmentation of every aspect of our lives moves forward ever faster, and it is easy to see we are moving from smart, connected "IoT-enabled products" to, as Michael Porter and Jim Heppelmann stated in their 2014 Harvard Business Review article, "systems of systems."
This progression can and should be great, but it won't be unless we have trust in the ecosystem. This means we have to trust the devices, especially when they are interacting with other devices, often on our behalf, autonomously. Your electric car charger may be buying electricity on the open market from the best bidder at 3:00 AM. But do you really know "who" it is "talking" to? This is why trust is so important. In a world with billions of devices interconnecting (creating the "machine economy"), being able to trust the devices is absolutely critical. Without trust you have nothing. You have a crapshoot every day. And the potential damage is then so much greater. The thief doesn't need to break the lock on your door; he just hacks the lock digitally. He might even shut the power off in your house. Who knows? All of these devices create misery, not happiness, unless you can trust them.
This means you have to find a way to ensure when you activate the device it is what it is represented to be and does what it is represented to do. There are many ways to do this, but it seems like this is a perfect application for blockchain technology. Using a consensus-based distributed ledger, there can be a much higher degree of confidence that the device you are activating is what it seems to be. The device itself then needs to have adequate security over its lifecycle to keep it from being hacked. And not all devices will escape hacking. We know this. So while we could trust that device when it was first activated, now we are back to anarchy. No trust. No good.
But we know how the device is supposed to behave. It sends messages of a specific type. It operates at a particular cadence. It performs transactions within a reasonably narrow band of transaction amounts. But we can also track the device behavior. This can be noted and written to a consensus ledger as well, where the reputation of a given device is also monitored and managed. Think of it like a credit score for the device. And if the device is hacked and then “misbehaves,” the credit score may go down and, in fact, it may implode. So the other devices that have trusted, and thus interacted with the device in question, may see the changed behavioral reputation score and determine the device can no longer be trusted and choose to engage elsewhere. This is how trust is maintained.
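One way to picture such a "credit score" is a running score that is penalized whenever a device steps outside its expected behavioral envelope, with other devices refusing to interact once the score falls below a trust threshold. The message types, penalty values and threshold below are all invented for illustration:

```python
# Hypothetical behavioral envelope for one device class.
EXPECTED = {"msg_types": {"reading", "heartbeat"}, "max_amount": 50.0}

class DeviceReputation:
    """Toy reputation tracker; in the scheme above, the score would live on a
    consensus ledger, not in one process's memory."""

    def __init__(self, score: float = 100.0):
        self.score = score

    def observe(self, msg_type: str, amount: float = 0.0) -> None:
        """Penalize behavior outside the expected envelope."""
        if msg_type not in EXPECTED["msg_types"]:
            self.score -= 40   # unknown message type: large penalty
        if amount > EXPECTED["max_amount"]:
            self.score -= 25   # transaction outside the normal band

    @property
    def trusted(self) -> bool:
        return self.score >= 50
```

A peer device would consult `trusted` before transacting, which is exactly the "see the changed behavioral reputation score and choose to engage elsewhere" behavior described above.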
Thirty years from now, the machine economy will usher in a life we can hardly imagine. But trust is the lifeblood of that machine economy. Establishing the identity of any device on the network in a trusted fashion, and then monitoring the reputation of that device to maintain that trust, is essential. There are efforts underway to do just that. Some are in the early stages; others are on the verge of coming to market. The time is now to create such a platform. It must be a simple, straightforward approach that can easily be enabled for any device to participate in the machine economy with trust.
And that means everything. The more people, organizations and devices that participate, the more we can all trust the power and harness the opportunity that lies ahead.
More than a decade ago, Amazon Web Services was launched as a system that offered flexible compute instances. There are a lot of stories about the formation of AWS; some say that Amazon originally created AWS so that sellers on its marketplace wouldn’t have to deal with the IT burdens of setting up their own systems. We do know that it was launched without much fanfare as a side business for Amazon.com.
As enterprises large and small started to embrace AWS, it grew steadily but also quietly. The silent growth of AWS from a no-name system to over $5B/annual revenues in 2015 (which was the first time Amazon shared AWS figures) was a very clever play by Amazon. Today, almost all of Amazon’s operating income comes from AWS.
During this phase (2006-2015), most on-premises server, storage and networking vendors were going through a slow growth period, and they routinely cited the broader market slowdown (2008 crash and recession) as a rationale for their slowing growth. However, the truth was that a bulk of the growth was going to cloud or sophisticated homegrown technologies at cloud-scale companies (including Amazon, but also LinkedIn, Facebook, Twitter and Google, to name a few).
Microsoft eventually did make some bold decisions and got rid of the “strategy tax” that it imposed on customers and partners, and it finally rightsized the importance of Windows in the marketplace. This led to the “Azure Age” and the reinvention of Microsoft.
Google, not to be left behind, realized that enterprises are a pretty steady and margin-appropriate business model that can provide additional growth while it pursued moonshot projects with Android, autonomous cars, Google Lens, longevity research and so forth.
Fast forward to 2018, and the game is well into its middle innings. There is a reasonably clear roadmap of what needs to happen over the next three to five years for cloud services to improve and compete with each other. What classically started as a feature/performance fight quickly changed into the battle for data. Data has gravity and the “you can check out any time you like, but you can never leave” business model of cloud data (i.e., ingress is free, but we will charge you for taking data out of the cloud) leaves a lot to be desired. Then optimizations started that brought in deeper lock-in such as API (Cloud Vision APIs from Google) and algorithms as a service (such as the conversational interface with Amazon Lex).
There is a major trend that the cloud vendors now realize they must address, and that is the emergence of the IoT edge. IoT here means intelligent systems that not only generate humongous amounts of data (imagine an autonomous vehicle, oil rigs with thousands of sensors, multiple devices on each and every thing and person), but that also require local, intelligent computation.
The IoT edge is a challenge that the cloud was never prepared for. And it is one of the larger trends that, unless addressed, puts a ceiling on how large clouds can grow. Gartner Analyst Thomas Bittman says in his article “The Edge will Eat the Cloud” that “the agility of cloud computing is great … but it doesn’t overcome physics — the weight of data, the speed of light.”
This fact is not lost on cloud vendors, and they are starting to figure out ways to address it. In February, Google acquired LogMeIn’s Xively IoT platform, and in April, Microsoft announced that it will triple its spending to over $5 billion on IoT.
This is because cloud vendors already see that they are missing a huge portion of the IT market — the IoT/edge market. Not only that, they also understand that if left unaddressed, it threatens to upend their cloud revenues because the amount of data the IoT edge will generate dwarfs what is stored in their cloud data centers.
The reason current cloud offerings are not adequate to address the IoT and edge market stems from the fact that cloud design principles are orthogonal to the edge requirements.
The infrastructure for IoT and edge systems is very different from what the cloud was designed for:
- Low-footprint, power-sensitive software stack — Edge software cannot assume unlimited elasticity. In the cloud, the cost of an incremental container is practically zero, so squeezing a workload into the smallest memory and CPU footprint helps margins at scale but is a nonissue when solving important challenges quickly. It's quite the opposite on the IoT edge: the edge can be a very small device or a sensor that barely has any room for additional software.
- Variable latency — The latency from the edge device to the cloud may vary from a few hundred milliseconds to infinite latency. Clouds are built with sophisticated inter-VM and inter-region latency targets.
- Diverse network — The edge stack has to operate across a very diverse network system. Some edges may connect through Ethernet, cellular, satellite or Wi-Fi, to name a few. Clouds are predominantly standardized on wired, Ethernet network-like systems.
- Unpredictable bandwidth — As a consequence of the network and the cost to get high-quality network bandwidth into IoT systems, the bandwidths also are variable. Software needs to be able to deal with this.
- Occasional connectivity — IoT/edge systems may connect occasionally. This is true for high-speed trains or container ships when they connect back up to the network when they stop at stations or docks. This is also true for systems where connectivity is provided when a contractor with a “cell connection” drives up to an IoT device occasionally.
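The occasional-connectivity constraint above is typically handled with a store-and-forward buffer on the edge device: readings accumulate locally and are uploaded in bulk when a link appears (the train reaches a station, the contractor drives within range). A minimal sketch, with illustrative names and an assumed `send` callable for the uplink:

```python
from collections import deque

class StoreAndForward:
    """Buffer readings locally; flush when connectivity returns."""

    def __init__(self, send, max_buffered: int = 10000):
        self.send = send                           # callable that uploads one reading
        self.buffer = deque(maxlen=max_buffered)   # oldest readings drop if storage fills

    def record(self, reading) -> None:
        """Called on every sample, whether or not the device is online."""
        self.buffer.append(reading)

    def flush(self) -> int:
        """Called when a link appears; returns the number of readings uploaded."""
        sent = 0
        while self.buffer:
            self.send(self.buffer.popleft())
            sent += 1
        return sent
```

The bounded `deque` encodes a real design decision for tiny devices: when storage fills, something must be dropped, and here it's the oldest data.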
As organizations embark toward their data- and artificial intelligence-driven reinvention of their IoT edge systems, it’s important to keep in mind some principles that history has provided for us:
- Open API — Use an open API platform and not one that provides further lock-in tangles into your cloud provider.
- Fabric — Understand that connecting several IoT systems to regional and global cloud systems essentially constructs a data fabric. Solve this issue by making sure applications are able to access data in a single global namespace so that data movement does not result in rewriting applications.
- Act locally, learn globally — Artificial intelligence and its sub-branch of machine learning on the edge require the ability to act locally and independently to react to local situations effectively without relying on the cloud. Having said that, the system/fabric needs to work in a way that it can learn globally from all the edges and then feed that intelligence back to every edge.
- Don't make it too big to fail — Putting all the control and management functions in the cloud will make the system prone to failures. The system should be decomposable into individual data clusters — whether on the edge, in regional clouds or in global clouds — that can be managed either as a unified unit or individually.
- Secure — As data-driven strategies push aggregated intelligent data to the cloud, there are security mechanisms now available to secure data in the cloud. But what about the edge? The security policy framework should not have to be recreated for the same data multiple times depending upon where the data currently resides.
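The "act locally, learn globally" principle above is the core idea behind federated learning: edges share model parameters rather than raw data, the cloud aggregates them, and the aggregate is pushed back to every edge. A toy sketch of the aggregation step, pure Python and illustrative only:

```python
def aggregate(edge_weights):
    """Element-wise mean of per-edge parameter vectors (federated averaging)."""
    n = len(edge_weights)
    return [sum(ws) / n for ws in zip(*edge_weights)]

def push_global(edges, global_weights):
    """Feed the globally learned parameters back to every edge."""
    return {edge: list(global_weights) for edge in edges}
```

Real systems weight each edge's contribution by its data volume and add privacy protections, but the loop — local training, global averaging, redistribution — is exactly this shape.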
Cloud providers are starting to realize that the afterthought approach to edge computing won’t work. That is the next battle, and it is going to be bigger than the cloud itself.
To paraphrase Francis Fukuyama, one could have been forgiven for thinking that we had reached the end of market economy history. The trend over the past 40 years has been for trade barriers to come down, and it was a truth pretty much universally acknowledged that opening up economies was good for growth and social prosperity.
Even non-market economies made a pretense at being open, and the General Agreement on Tariffs and Trade (GATT) was widely regarded as a success. The rise of protectionism has, therefore, come as quite a surprise.
Protectionism can come in many forms. It isn’t just the obvious taxes and tariffs that can make trade in goods and services unappetizing. The Food and Agriculture Organization (FAO) has provided a rather nice list of things that can hinder entry to new markets including:
- Government interventions in trade — Government procurement, state trading, export subsidies or taxes, countervailing duties, trade diverting aid, etc.
- Customs and administrative entry procedures — Customs valuation, customs classification, antidumping duties, consular and customs formalities and requirements, sample requirements, etc.
- Specific limitations on trade — Quantitative restrictions, export restraints, health and sanitary regulations, licensing, embargoes, minimum price regulations, etc.
- Charges on imports — Tariffs, variable levies, prior deposits, special duties on imports, internal taxes, etc.
- Standards — Industrial standards, packaging, labeling and marking regulations, etc.
So what can IoT offer in this new era of trade? And why are digital ecosystems important?
In relation to the first, while protectionism may not be good for the global economy as a whole, completely open borders are probably not a good idea either. Some controls are always necessary to prevent trafficking, in people as well as in goods, and to contain hazards to health and safety, for instance.
IoT technologies can help governments patrol borders, help asset owners and insurers track assets as they move across national boundaries, and enhance transparency and efficiency in cross-border administration. Combined with artificial intelligence, they can spot unusual patterns to thwart illegal activity. They can also play a valuable role in less obvious areas such as real-time security access permissioning or protecting lone workers in dangerous areas.
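As a toy illustration of spotting unusual patterns in cross-border telemetry (the data, function name and threshold are invented; production systems would use far richer models), a simple z-score check can flag outliers in, say, daily crossing counts for a tracked asset route:

```python
# Hypothetical sketch: flagging unusual patterns in border-crossing
# telemetry with a simple z-score test.
from statistics import mean, stdev

def unusual(readings, threshold=2.0):
    """Return indices of readings more than `threshold` std devs from the mean."""
    mu, sigma = mean(readings), stdev(readings)
    return [i for i, r in enumerate(readings)
            if sigma and abs(r - mu) / sigma > threshold]

# e.g. daily crossing counts for a tracked asset route
counts = [102, 98, 101, 99, 100, 103, 250, 97]
print(unusual(counts))  # flags the day with 250 crossings
```

A real deployment would of course learn seasonality and use multivariate models, but the principle of alerting on statistical deviation is the same.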
So where do ecosystems come in? Firstly, the greatest advances on the status quo may come from startups. While their technology may be sound, scaling to the level required by border authorities may well require sandboxing or some other form of piloting. Finding IoT startups that have these capabilities can be like looking for a needle in a haystack. Working with reputable startup ecosystem players can help sweep away the surplus hay.
Secondly, IoT is by its very definition an ecosystem play. By contributing to ongoing debates on everything from security to standards, would-be deployers can not only make sure they understand the real risk-reward of implementation, but also actively help shape the IoT landscape and enhance the value it brings to their own domain.
Disclosure: The author is CEO of IoT Tribe, an equity-free accelerator that brings corporates and startups together to do business, and a member of the Board of the Alliance of Internet of Things Innovation, an industry body that aims to foster IoT innovation in Europe.
We are at a pivotal moment in technical advancements, where waves of digital innovation are changing how we work, play, travel, communicate, dine, interact and even think. Just ask Alexa or Siri.
In March, our collective imaginations soared when SpaceX launched a Tesla into space. Drones are starting to deliver packages and meals. Robots are transforming factories, supply chains, restaurants — even the environment. In the latest on the #tech4good movement, Urban Rivers has built a remote-controlled, floating, trash-collecting robot to clean up Chicago’s rivers. IoT-enabled smart cities, such as Barcelona, Singapore and Denver, are already saving millions in energy and labor efficiency while improving citizen services and public safety. 5G services will be available in U.S. cities by end of year, ushering in new augmented reality mobility applications that will make Pokémon Go a relic of the past.
This digital innovation has one thing in common: It’s all enabled by fog computing. Fog is the distributed technology that brings compute, networking and control closer to where the data is generated for ultra-fast response that increases user privacy and system protection. It’s an architecture that is gaining traction with the growing awareness that not all information can — nor should — be streamed to the cloud. Processing data closer to where it’s generated is not only beneficial, but absolutely crucial on many fronts: sub-millisecond latency, less network bandwidth and cost, more efficient operations with enhanced system security. While sometimes referred to as edge technology, fog is more comprehensive: It bridges between device, edge and cloud in a superset of functionality that communicates over multiple networks in a north-south, east-west approach between systems.
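A minimal sketch of the local-processing idea just described (the function and field names are hypothetical): a fog node collapses a window of raw, high-rate sensor readings into one compact summary, so only aggregates cross the network to the cloud, saving bandwidth and latency:

```python
# Hypothetical sketch of fog-style local processing: reduce a high-rate
# sensor stream to a compact summary before anything goes upstream.

def summarize(window):
    """Collapse a window of raw readings into one upstream message."""
    return {
        "n": len(window),
        "min": min(window),
        "max": max(window),
        "avg": sum(window) / len(window),
    }

raw = [20.1, 20.3, 20.2, 35.9, 20.0]  # e.g. one second of readings
msg = summarize(raw)
print(msg)  # only this small dict would be sent to the cloud
```

Five readings become one message here; at real sensor rates the reduction is orders of magnitude, while the raw stream remains available locally for sub-millisecond control decisions.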
Recently published use cases by members of the OpenFog Consortium highlight the technologies behind some of the sophisticated fog-based applications for autonomous vehicles, delivery drones, streaming video for onsite sports venues, real-time energy exploration, smart factories, connected healthcare and more.
The bellwether: Investments in fog are up sharply
Growing awareness of these benefits is driving corporate investments in fog infrastructure. Last fall, 451 Research released its landmark report showing a 4x growth in fog computing, up to $18 billion in spend by 2022. More recently, a survey by Futurum Research of 500 North American companies found that 93% will be investing in edge technologies in the next 12 months, with 72% reporting that their edge strategy is either critically or very important to improving business processes and productivity. That’s a whole lot of innovation to come.
Investments are growing on other fronts as well — those who are creating the technologies and applications. Technology leaders and startups alike are developing the advanced technologies that make machine learning and artificial neural network technologies more affordable and easier to use. In 2017, venture capital funding into U.S.-based IoT startups alone reached $1.46 billion, according to Crunchbase, sharply up from the $461.7 million raised in 2013. According to new research from KPMG, artificial intelligence — most notably in the health, finance and automotive sectors — attracted $12 billion in investment from venture capitalists globally in 2017, doubling the 2016 numbers.
Last year, Fog World Congress featured a Fog Tank, where five startup finalists competed on stage before an elite group of venture capitalists to be named the biggest game-changer in fog computing and networking. Master of Ceremonies Sam Bhattarai, director at the office of the CTO of Toshiba Corporation, kicked off the event by telling the audience to take note, as it was highly likely that the companies on stage will be the Googles and Amazons of tomorrow. In the engaging 90-minute session, Silicon Valley startup Nebbiolo Technologies took home the grand prize, facing down stiff competition from ActiveAether, Forecubes Inc., Kiana Analytics Inc. and NGD Systems Inc. In October, new global contenders will take the stage to claim bragging rights to this rapidly growing market.
New business, and new business models created by fog
Creating an open, interoperable architecture for fog is complex — and necessary to enable advanced applications such as IoT, 5G and AI. At OpenFog, we are privileged to have a front-row seat into this future vision of innovation through the work of our forward-thinking members.
Research members in China are already looking far ahead at the possibilities enabled by 6G, which some technologists expect will be an autonomous, self-learning and self-planning network. In Detroit, a small team at Wayne State University is pioneering connected ambulatory care that keeps critical data accessible in areas of poor connectivity or in poor weather. Princeton University’s Edge Computing Laboratory is leading the work to solve high-density data distribution over remote networks. Several OpenFog members are doing groundbreaking work in blockchain deployments through fog.
And some are creating entirely new business models. SONM is offering IaaS and PaaS based on fog computing as a back end. Computing power suppliers all over the world can contribute their computing power to the SONM marketplace. AetherWorks has also created a global computing marketplace in which anyone can rent out, or purchase, processing power for peak demand. Its business model is driven through FogCoin, the blockchain-powered cryptocurrency that incentivizes users to join and use the network.
SCALE to deployment
As defined in the OpenFog Reference Architecture (note: registration required), the attributes of fog can be simplified according to the SCALE model: security, cognition, agility, latency and efficiency:
- Security — In addition to its distributed approach for trusted transactions, fog offers better GDPR compliance/privacy protection by anonymizing data locally. Should an incident occur, public safety officials can disable that anonymization for real-time identification purposes.
- Cognition — Local fog nodes enable autonomy by processing information at or near the source of the data for real-time decision-making and response.
- Agility — Clusters of fog nodes can help rapidly moving assets achieve levels of communication and coordination at sub-millisecond speed and at scale.
- Latency — Fog enables real-time processing and cyber-physical system control for rapid response.
- Efficiency — Fog nodes enable dynamic pooling of local unused resources from participating end-user devices.
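The "anonymizing data locally" point in the security attribute above can be sketched as follows. This is a hypothetical example using keyed hashing, which is strictly pseudonymization rather than full anonymization; the record fields and salt are invented. A fog node replaces direct identifiers before anything leaves the premises:

```python
# Hypothetical sketch: a fog node pseudonymizes direct identifiers with
# a salted (keyed) hash before data is forwarded upstream.
import hmac, hashlib

NODE_SALT = b"per-deployment secret"  # never leaves the fog node

def pseudonymize(record):
    out = dict(record)
    for field in ("name", "plate"):  # direct identifiers
        if field in out:
            digest = hmac.new(NODE_SALT, out[field].encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]
    return out

event = {"name": "Jane Doe", "plate": "B-XY 123", "speed_kmh": 48}
safe = pseudonymize(event)
print(safe["speed_kmh"], safe["name"])  # metric kept, identity masked
```

Because the hash is deterministic for a given salt, the node can still correlate repeated sightings of the same identifier locally, while upstream consumers never see the raw value; revealing identities after an incident would require access to the node's salt.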
In less than two years, fog has emerged from a relatively unknown concept to one that is powering industry shifts in hot areas such as Industry 4.0 and autonomous transportation.
We invite you to join OpenFog as we help to enable and create the future, and to join in the innovation, research and technologies at Fog World Congress, October 1-3, 2018.
Currently, more than 4 billion users, about 54% of the world’s population, are digitally connected. At the same time, Gartner estimates that there are more than 8.4 billion devices currently connected and predicts this number will reach more than 20 billion by 2020. Before long, almost everyone and everything will be connected.
Today, we speak of the “internet” when we are talking about the devices and services that help people communicate. We make a distinction when we talk about the system of interrelated devices, machines, objects, animals or people that have the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction. We call that the internet of things.
The potential benefits from this increasing connection are incredible. But as we move towards this era of everything becoming connected, there will be more and more concern about the security and usage of the personal data that gets collected, used and disseminated. Data breaches highlight the vulnerability of our personal information and underscore the importance of protecting it.
We must take into account the aspects of IoT and information technology that determine what data in an organization’s computer system can be shared with third parties, what is generally called data or information privacy. Just as the value of IoT depends on your industry, so do privacy concerns, and it is useful to divide the market into three basic segments: consumer, services/public sector, and industrial/enterprise. Let’s look at a representative use case for each segment and potential privacy concerns.
Consumer IoT and automated vehicles
The potential safety, mobility and efficiency benefits of automated vehicles are many. These vehicles are also expected to generate an enormous amount of data, some of which will be personal and sensitive, such as real-time precise geolocation data and the contents of driver communications that result when they connect their mobile phones to a vehicle’s computer system.
Other consumer IoT use cases, such as virtual assistants, connected health monitoring, consumer marketing, tracking product usage and performance, and so forth, often ask users to input personal information, such as their name, age, gender, email address, home address, phone number and social media accounts, creating further privacy concerns.
Public sector and ‘smart’ city use cases
So-called “smart” cities have the potential to provide higher quality services at lower costs by eliminating redundancies and streamlining city workers’ responsibilities. Cities are installing boxes on municipal light poles with sensors and cameras that can capture air quality, sound levels, temperature, water levels on streets and gutters, and traffic, identifying ways to save energy, address urban flooding and improve living conditions.
Public sector IoT data also includes the data present in city registers, the data from government or corporate surveys, and the data from social media updates. This data is often combined and linked in order to produce joint indicators of city well-being, economic vitality or safety. Increasingly, local governments also make this data available to the wider public. All of this raises issues about who has legitimate access, which data can be opened up to public usage and what is the appropriate privacy framework for the linkage of different data.
Industrial/enterprise and the connected factory
Using IIoT, a company can connect devices, assets and sensors to collect untapped data. This also allows a company to deliver scalable, reliable applications faster to meet the ever-changing demands of its customers. For example, in a connected factory, sensor-enabled equipment can supply helpful data about its continuing condition. This information can be analyzed to predict when and where equipment might break down, helping factories prevent production shutdowns. In the event a breakdown does occur, a factory can analyze this data to determine the problem and take corrective actions to prevent future occurrences. This allows them to shift more attention toward innovation instead of infrastructure management.
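A toy sketch of the prediction step described above (the readings, threshold and simple linear-trend model are invented for illustration; real predictive maintenance uses learned models): extrapolate a machine's vibration trend to estimate how long until it crosses a failure threshold:

```python
# Hypothetical sketch: least-squares trend on hourly vibration samples,
# extrapolated to estimate hours until a failure threshold is reached.

def hours_to_threshold(readings, threshold):
    """Extrapolate the recent linear trend to estimate hours remaining."""
    n = len(readings)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(readings) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, readings))
             / sum((x - x_mean) ** 2 for x in xs))
    if slope <= 0:
        return None  # no upward trend, no predicted failure
    return (threshold - readings[-1]) / slope

vibration = [2.0, 2.2, 2.4, 2.6, 2.8]  # mm/s, hourly samples
print(hours_to_threshold(vibration, threshold=4.0))  # about 6 hours left
```

Even this crude fit shows the value: maintenance can be scheduled hours in advance instead of reacting to a shutdown.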
Tips for maintaining data privacy in your IoT program
In each of the segments above, at some point data includes not just input from sensors, but also personal data. That’s easy to see in the consumer IoT and public sector IoT, where data sets regularly include customer usage and behavior data. But even in IIoT, understanding how analytic data will be used and by whom, as well as integrating human needs and concerns into the system, means including some amount of “personal” information.
The most obvious step any company can take to enhance data privacy is to implement and enforce technological data protection measures that help prevent breaches in the first place, including but not limited to encrypting data at rest and destroying “personal” data when it is no longer needed. When personal data is encrypted, even if an unauthorized third party finds a way to see the data, they cannot read or use it. (For more about privacy and security, see this Deloitte Insights article.)
IoT is on track to connect ever more devices that communicate without human intervention, providing data that fundamentally disrupts enterprises by turning linear processes into networks. More and more data is being collected and processed, which is unearthing previously unimagined value by making companies ever more efficient and responsive.
And, more and more, some of this data will require new approaches to security and privacy. As we look forward, companies should develop integrated, enterprise-level approaches to data governance by strengthening and implementing data protocols and policies.
IoT has become a major focus area for mobile network operators worldwide. However, one fundamental question remains: Where are the real IoT business opportunities for MNOs to strategize and achieve volume, scale and growth?
First, let’s be clear — IoT is not just an isolated business segment, but an integral part of a bigger digital transformation strategy. Some operators are already in advanced stages with a mature IoT strategy in place, while others are lagging behind or somewhere in between.
It’s no secret that most mobile network operators (MNOs) face challenges in the commoditized voice and data markets while experiencing saturated subscriber growth. Additional challenges come from technology disrupters and global digital platform players that have become technology havens, attracting fresh innovation, large investments and financial market attention.
However, MNOs still hold a solid position when it comes to ownership of customer engagement, be it with consumers or enterprises. MNOs will eventually be able to generate long-term IoT growth in the broader digital transformation landscape, where digitization has begun to transform a wide range of other industries.
First, let’s look at past lessons learned from two examples: media content monetization and mobile payment.
In the first example, leading MNOs realized how critical a media content strategy was to enabling growth and successfully moving up the value chain. Different enabling strategies were put in place, including the acquisition of media channels, the creation of partnerships and the building of media content delivery platforms. All of these have enabled MNOs to monetize, bundle new subscription-based services and generate higher data traffic on their networks.
In the second example, specifically in emerging markets, MNOs have created, through partnerships with the financial sector, new purpose-built payment services around their low ARPU customers to execute secured mobile payment transactions. These services have enabled a wide range of community-driven business models.
The main lesson learned from these two examples is that MNOs’ strong position in customer ownership and engagement combined with a deep understanding of market needs and dynamics has been key to driving new opportunities.
In the era of digital transformation, IoT is bringing a new set of data attributes that relate to devices and sensors associated with customer data in an ecosystem — let’s call this device data.
For example, one customer might have a car, phone, medical device and smart watch; she may bike or use public transportation; and she also visits the doctor, lives on a farm or in a smart city, travels, consumes electricity and water, and goes to work, school and shopping. All surrounding and embedded devices and sensors will generate a huge amount of data that relates to this specific customer in one way or another. The amount of device-generated data will surpass the customer’s own data over time, and the two will eventually converge.
As the IoT market begins to mature, we will find that converged customer and device data forms unique new IoT opportunities to explore. This converged data holds many promises for new opportunities and will proliferate in many day-to-day industries, such as healthcare, transportation, smart cities, utilities, retail, energy, smart living and wearables, in developing and developed worlds alike. In many cases, this converged data proliferation will also be driven by evolving community-based activities.
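One way to picture this convergence (all record fields and IDs are hypothetical): grouping device-generated events under the customer they relate to yields a single converged profile per customer, the kind of asset an MNO aggregation platform could build services on:

```python
# Hypothetical sketch: converging per-device events into one profile
# per customer, keyed by (device, metric).
from collections import defaultdict

device_events = [
    {"customer": "c42", "device": "car",   "metric": "km",    "value": 12.5},
    {"customer": "c42", "device": "watch", "metric": "steps", "value": 8000},
    {"customer": "c7",  "device": "meter", "metric": "kwh",   "value": 3.1},
]

def converge(events):
    profiles = defaultdict(dict)
    for e in events:
        profiles[e["customer"]][(e["device"], e["metric"])] = e["value"]
    return dict(profiles)

profiles = converge(device_events)
print(sorted(profiles))  # one converged profile per customer
```

In practice the join keys, consent handling and storage are far more involved, but the shape of the opportunity is this: device data becomes valuable when it is linked back to the customer relationship the MNO already owns.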
MNOs can play an essential role by leveraging their ownership of consumer and enterprise relationships in this way. This will also require putting strategic plans in place to enable user integration across multiple industries through IoT aggregation platforms, data integration tools, developer- and API-centric services, secure authentication, on-demand capacity, data analytics and integrated flexible billing capabilities, to list a few.
Equally important, innovation to enable business integration, data contribution models and data sharing incentive mechanisms must be culturally motivated. With socioeconomics as the underlying baseline, MNOs can take a more active and leading role with regulators and data policymakers to adapt to a data-driven mindset. This will require further business and data integration capabilities with a service-oriented approach, enabling a cross-industry digital transformation and thereby creating new possibilities.
It’s important to note that this won’t happen overnight, but it surely will happen fast — depending on how quickly MNOs’ strategies are put in place and adapted.