Does the phrase “internet of things” cause people to forget everything they ever knew about security? It sometimes seems so. Best practices like defense-in-depth and role-based access controls? Out the window once IoT enters the picture. That said, there are systematic differences that should make us view security differently in an IoT world.
Here are seven factors that set IoT security apart from physical security and even many aspects of conventional IT security.
1. Lifecycle mismatch
Many types of physical objects — buildings, automobiles, refrigerators, light switches — last a long time, decades even. They typically require maintenance, but they’re often not replaced until the repair bills get too high or they just don’t work. We certainly don’t expect to replace them because a manufacturer decided not to support them after five years.
Yet much of the software involved in IoT is intended to be disposable. There may be no provisions for upgrading client devices at all. Software support for even relatively expensive consumer devices is usually just on the order of a few years.
From a security perspective, this means otherwise functional devices are likely to be exposed to unpatched vulnerabilities as they get older.
2. General-purpose extensible devices
If these network-connected systems used specialized hardware and software to operate and communicate, outdated software wouldn’t necessarily be a major issue. It would likely be hard to force such devices to take actions they weren’t originally designed to do.
However, in practice, it’s very common for IoT devices to effectively be general-purpose computers running open source operating systems and network stacks. There are good reasons for this; among other things, it’s easier (at least in principle) to update them and add new capabilities over time. However, it also means that an attacker who gains control of a device has more options to wreak havoc.
3. Bad economic incentives
None of the above is unfixable. We keep industrial equipment running for decades. Software vendors, including my employer, offer a variety of extended life support options for subscription products. These models work because customers are willing to pay for ongoing maintenance and support at levels where it’s profitable for vendors to supply them.
Those same incentives aren’t in place when you buy a light switch, or perhaps even a vehicle. No consumer is likely to pay for an ongoing light switch contract. Some may do so for cars, but it’s not common after the initial warranty period. As a result, there are no incentives for most device makers to continue supporting what they’ve sold beyond a fairly short window.
4. Connected by default
Vulnerability to attackers who connect to an IoT device or gateway wouldn’t matter so much if making that connection were difficult or impossible. But increasingly it is not. The norm is to connect to networks, usually wireless networks and often public ones, even when there’s no compelling reason to do so.
It’s long been recognized that protecting computer systems against intruders who have gained physical access can be extremely challenging. (Witness breaches at the NSA and elsewhere.) However, pervasive and routine network access introduces many of the same threat vectors. Certainly it creates a far greater attack surface than physically isolated systems.
5. Ecosystem effects
When general-purpose, network-connected computers are breached, it doesn’t just affect the target of the attack.
Data breaches can affect millions of customers when sensitive information is stolen; this applies whether we’re talking IoT or more conventional IT systems. IoT multiplies the issue by collecting more and more ambient data that people may not even be aware is being collected.
Ecosystem damage can go beyond data. The Mirai botnet’s DDoS attack caused significant disruption to the internet as a whole. It resulted from outdated versions of Linux on webcams being turned into remote-controlled bots for large-scale network attacks.
6. Common widespread failure modes
Speaking at the Open Source Leadership Summit earlier this year, security expert Bruce Schneier noted that computers and networks fail in a different way than non-computerized systems. “You worry about crashing all the cars. You’re concerned about the five sigma guy, not the average guy. It doesn’t happen in lock picking in the same way,” he said, because no matter how skilled, one person can only break into one physical building at a time.
I mentioned the Mirai botnet earlier. But it’s the nature of IoT and connected systems more broadly that vulnerabilities and attacks usually affect many systems. Of course, individual attacks can still lead to data breaches or the shutdown of a critical system. But even breaches that would be relatively innocuous in isolation can cause serious failures in systems like the power grid if multiplied by a thousand or a million. Scale matters.
7. Actuators can affect our environment
We’ve seen how IoT can differ in scale, connectedness and vendor support from more conventional IT systems. But if I had to pick one aspect of IoT that’s fundamentally different, it’s this one: IoT is not read-only.
Schneier calls it “an internet that affects the world.”
Software already controls many critical systems or directly tells humans what to do. But the degree to which IoT is replacing manual and disconnected controls pervasively and by default is striking.
That IoT can take physical actions may not really change its security model, but it certainly raises the stakes.
We prioritize features. We prioritize low prices. We prioritize today. We do not prioritize security over a product lifecycle that may span decades. In devices that have the power to affect the physical world.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.
Deep learning and IoT are two game-changing technologies with the potential to raise the stakes for oil and gas companies facing profit pressure after the dramatic drop in the price of oil. In this blog, based on Flutura’s extensive experience in the oil and gas industry, we highlight three practical use cases, from the trenches, where these technologies are applied to solve real-life problems and drive meaningful business outcomes.
1. Deep learning algorithms detect risks in oil pipelines
In our first use case, we look at how algorithms can reveal patterns and information not easily seen in other ways. For instance, drones are increasingly being used for pipeline inspections. As these drones fly along a pipeline, they record an enormous amount of video footage, and it’s very difficult for a human reviewer to spot risks such as leaks and cracks in all of it. Deep learning algorithms can automatically detect the pixel signatures of cracks and leaks in drone footage that humans miss, thereby minimizing infrastructure risk.
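As a drastically simplified stand-in for the convolutional networks used in practice, the sketch below trains a single logistic unit to separate “crack-like” dark-streak pixel patches from uniform pipe surface. The patches, labels and thresholds are all invented for illustration; a real system would learn from labeled video frames.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(patches, labels, epochs=2000, lr=0.5):
    """Plain stochastic gradient descent on a single logistic unit."""
    w = [0.0, 0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(patches, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

# Normalized pixel intensities: bright pipe surface near 1.0,
# a dark crack pixel near 0.0. Hypothetical training data.
patches = [[1.0, 0.1, 1.0],  # dark streak in the middle: crack
           [0.9, 0.0, 0.8],  # crack
           [1.0, 1.0, 0.9],  # uniform surface: no crack
           [0.8, 0.9, 1.0]]  # no crack
labels = [1, 1, 0, 0]

w, b = train(patches, labels)

def predict(patch):
    """True if the learned unit scores the patch as crack-like."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, patch)) + b) > 0.5

print(predict([1.0, 0.05, 0.95]))  # crack-like patch
print(predict([0.95, 0.9, 1.0]))   # clean surface
```

The real value of deep models is learning such discriminative features automatically from raw frames rather than from hand-picked three-pixel windows.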
2. Deep learning algorithms detect asset behavior anomalies
While working with several oil and gas companies, we were able to collect a great deal of data from sensors strapped onto upstream assets like frack pumps and rod pumps. Looking for anomalies in high-velocity time-series parameters is like looking for a needle in a haystack for mere mortals. Deep learning algorithms can “see” anomalies that traditional rule-based electronic condition monitoring systems miss and can alert rig operations command centers.
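For contrast with the learned models described above, here is the kind of static, rule-based check (a rolling z-score) that traditional condition-monitoring systems apply, and that the article argues deep learning outperforms. The pump-pressure readings are invented.

```python
from statistics import mean, stdev

def zscore_anomalies(series, window=5, threshold=3.0):
    """Flag points more than `threshold` standard deviations from the
    mean of the preceding `window` samples -- a classic static rule."""
    flagged = []
    for i in range(window, len(series)):
        history = series[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(series[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# Hypothetical pump-pressure stream with one obvious spike at index 8.
readings = [50.1, 50.3, 49.9, 50.2, 50.0, 50.1, 49.8, 50.2, 75.0, 50.1]
print(zscore_anomalies(readings))
```

A rule like this catches the obvious spike but misses slow drifts and multivariate interactions, which is precisely where learned models earn their keep.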
3. Rig diagnostic bots
When providing remote diagnostic services for industrial assets, the conventional form of interaction is a dashboard. With the advent of natural language processing algorithms powered by deep learning, field technicians can instead interact with asset diagnostic applications through voice, much as bots now assist in customer service.
The advent of deep learning and IoT has brought great strides in learning, such as predicting and determining attributes, including insights on anomalies, digital signatures, and acoustic changes and patterns. Being able to see beyond what the eye can see provides the potential, as our use cases illustrate, to head off problems and structural failings, saving organizations time and money and keeping everyone who benefits from their services safer. We envision a future where the twin digital capabilities of deep learning and IoT will separate the winners from the laggards in the competitive energy marketplace, and the first steps are being taken right now.
We recently released our latest “Threat Landscape Report,” in which we gathered insights into the cybersecurity landscape around the world. The data spans the cybersecurity kill chain, focusing on three central aspects of the landscape — application exploits, malicious software and botnets — against the backdrop of important enterprise technology and industry sector trends. The research reveals that while high-profile attacks have dominated the headlines, the majority of threats faced by organizations are opportunistic in nature. What’s more, the internet of things continues to present security challenges, both within the connected devices themselves and as a change agent in terms of how data — and threats — are shared.
Hyperconvergence and IoT are accelerating the spread of malware
As networks and users increasingly share information and resources, attacks are spreading rapidly across distributed geographic areas and a wide variety of industries.
Studying malware can help provide views into the preparation and intrusion stages of these attacks.
The prevalence of mobile malware remained steady from Q4 2016 to Q1 2017, with about 20% of organizations detecting it. Only one family of Android malware broke into the top of the charts in Q4 2016, but three did in Q1 2017. The percentage of mobile malware jumped from 1.7% of total volume in Q4 to 8.7% in Q1.
In terms of regional prevalence, mobile malware rose in every region except the Middle East. The rate of growth was statistically significant in all cases rather than simply random variation. Compared to some other regional threat comparisons, Android malware appeared to have stronger geographic tendencies.
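To illustrate what “statistically significant rather than simply random variation” means here, a standard two-proportion z-test can be applied to quarter-over-quarter detection counts. The counts below are invented for illustration; the report’s underlying figures are not reproduced.

```python
from math import sqrt

def two_proportion_z(hits1, n1, hits2, n2):
    """z-statistic for the difference between two sample proportions."""
    p1, p2 = hits1 / n1, hits2 / n2
    pooled = (hits1 + hits2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# Hypothetical: 120 of 4,000 orgs in a region detected mobile malware
# in Q4, versus 220 of 4,000 in Q1.
z = two_proportion_z(120, 4000, 220, 4000)
print(abs(z) > 1.96)  # True -> unlikely to be random at the 5% level
```

A |z| above roughly 1.96 means the quarter-over-quarter change would be very unlikely if the true detection rate were unchanged.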
Exploit attempts against IoT devices themselves, however, dropped in Q1. Detection volume, shown in the chart below, fell an order of magnitude for the most-targeted device categories. This was most notable within the DVR/NVR category, which sprang to life last quarter and neared 100 million daily detections at one point during the Mirai-fueled DDoS attack against Dyn. The universe of vulnerable internet-connected things certainly wasn’t fixed in a single quarter, so we can’t help but interpret this as a slight calm between storms.
Threats by region and industry? Not really
The internet knows no geographic distances or boundaries, according to our research. Modern tools and pervasive “crime as a service” infrastructure enable attackers to operate on a global scale at light speed. As a result, most threat trends appear more global than regional. This was even the case with ransomware. We learned that ransomware affected 10% of firms over the quarter, and it’s reported by a little over 1% on any given day. Furthermore, this is true of all industries and regions to some degree. This is a complex threat that won’t go away with simplistic approaches.
To complicate matters more, cluster analysis by vertical industry shows that the attack surface across most industries was the same, with a few exceptions, such as the education and telco sectors. This means that adversaries can exploit similar attack surfaces across verticals more easily, especially with automated tools. Cybercriminals today don’t even have to tweak their attack methods; they can go after several industries at once. Think about the ease of scale and speed this enables; WannaCry is a perfect example. A huge proportion of exploit activity is fully automated via tools that methodically scan wide swaths of the internet, probing for opportunistic openings.
You can see this in more detail in Figures 2 and 3 and how they compare. Many industries fall within the nexus of the “mega-cluster” in each of those charts. More interesting is the fact that they also share many of the same outliers (e.g., education, telco/carrier, MSSP). Could it be that an organization’s infrastructure usage has a stronger relationship to its threat profile than its industry?
The takeaway here is that cybersecurity strategies need to increasingly adopt trustworthy network segmentation and high degrees of automation to prevent and detect adversaries’ efforts to target the newly exposed flanks of businesses and governments. You have to fight automation with automation, especially as attack vectors across industries are looking similar.
In our report, we looked at threats that span from pre-attack reconnaissance (exploits) to weaponization (malware) to post-compromise command and control (botnets). While targeted attacks often grab the headlines, the bulk of threats faced by most organizations are opportunistic in nature. It is a reminder that defenses should be spread along the kill chain, and we recommend reviewing security postures to assess capabilities at each phase. A few other IoT-focused takeaways that stand out:
- Protecting against mobile malware is particularly challenging because devices are not shielded by the internal network, frequently join public networks, and often are not under corporate ownership or control. Mobile security strategies must acknowledge these conditions and still thwart malware through mobile application controls and malware protections integrated into the network.
- Our findings pertaining to Mirai-style botnet attacks serve as a reminder that monitoring what’s going out of your network is just as important as what’s coming in (likely more so). Protecting all hosts and users from all inbound threats is an impossible task, but severing C2 communications at key chokepoints in your network through a combination of smart tools and good intel is much more achievable.
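The idea of severing C2 communications at chokepoints can be sketched as a simple egress filter. Real deployments enforce this on firewalls, proxies or DNS resolvers fed by live threat intelligence; the blocklist entries below are documentation-range placeholders, not real indicators.

```python
import ipaddress

# Hypothetical C2 blocklist (RFC 5737 documentation ranges).
C2_BLOCKLIST = [ipaddress.ip_network("198.51.100.0/24"),
                ipaddress.ip_network("203.0.113.7/32")]

def allow_outbound(dest_ip):
    """Return False if an outbound connection targets a known C2 range."""
    addr = ipaddress.ip_address(dest_ip)
    return not any(addr in net for net in C2_BLOCKLIST)

print(allow_outbound("203.0.113.7"))    # known C2 host -> blocked
print(allow_outbound("93.184.216.34"))  # ordinary destination -> allowed
```

Even when you cannot patch or instrument the device itself, dropping its calls home at the network chokepoint neuters a bot.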
For a deeper analysis and other valuable takeaways of all the data we gleaned, download the “Q1 2017 Threat Landscape Report” here.
From the candlestick to the first lightbulb patented by Thomas Edison 140 years ago, lighting has always been a necessity. Whether it’s in the home or the workplace, lighting simply needs to work. Next-generation smart lights are no different. Light-emitting diodes (LEDs) are the most significant lighting advancement since the invention of the incandescent bulb. Smart LEDs are opening up new use cases for lighting that Edison never dreamed of.
Today, LED lighting is a $30 billion global industry paving the way for smart lighting solutions, but the smart lighting industry still hasn’t hit its full potential and achieved widespread adoption to bring value to the lives of consumers. With improved interoperability, control and security, smart lighting is poised to become the next killer app for the internet of things.
The brighter the better
One reason why connected lighting has not taken off among consumers is cost. A typical home in the United States has 40 lighting sockets, so imagine spending $14 to $15 per connected bulb to outfit an entire smart home. Prices for basic LED bulbs as well as connected bulbs are steadily dropping, and the rapid growth of the smart bulb market will keep driving down costs. Lower price points will help drive the deployment of other IoT products, particularly in the smart home where more integrated and flexible wireless connectivity options are available. This price reduction will also continue to make smart lighting more accessible to every house, commercial building and city as we continue advancing toward a more connected society.
A second barrier to consumer adoption is the level of effort required to install connected lighting systems, which until recently were beyond the do-it-yourself level of most homeowners. Today’s smart LED bulbs integrate sophisticated power management and wireless circuitry that ultimately makes the latest LED products much easier to install even by tech-averse homeowners. Smart lighting networks and ecosystems are also easier to use and maintain, providing plug-and-play simplicity and enabling consumers to commission connected LED lights with their smartphones.
Lighting on command
Most users will ease into the connected lighting world by installing a few smart LEDs at home. Smart lighting is also steadily spreading throughout commercial spaces and entire cities. Even the way we control lights is changing as the simple on/off switch gives way to flexible control options through smartphones, touch panels and even voice control. Cloud connectivity gives users the ability to remotely control lighting and monitor energy usage through smartphones, anywhere and anytime. Users can also access new, sophisticated features like occupancy and ambient condition sensing, which enable a lighting environment to respond and adapt autonomously.
These advanced capabilities such as turning off lights when no one is in the room are just the beginning. Looking ahead, no one will want to use multiple apps to control their lighting. Virtual assistants are the next solution for lighting control beyond switches and apps. Similar to how remote controls transformed the TV viewing experience, voice services like Amazon Echo’s Alexa or Google Home Assistant will eliminate the need to get up off the couch and touch the closest light switch or control panel.
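The occupancy-and-ambient-sensing behavior described above reduces to a small piece of control logic. The sketch below is a minimal illustration; the timeout, lux threshold and sensor interface are hypothetical, not drawn from any particular product.

```python
AUTO_OFF_SECONDS = 300  # turn off 5 minutes after the last motion event

def light_should_be_on(now, last_motion_at, ambient_lux, lux_threshold=200):
    """Keep the light on only if motion was recent and the room is dark."""
    occupied = (now - last_motion_at) < AUTO_OFF_SECONDS
    dark = ambient_lux < lux_threshold
    return occupied and dark

print(light_should_be_on(now=1000, last_motion_at=900, ambient_lux=50))   # True
print(light_should_be_on(now=1000, last_motion_at=500, ambient_lux=50))   # False
```

In a real smart bulb this decision runs on the device or hub, with the motion and light sensors feeding it events rather than function arguments.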
Getting in sync
Connected devices are meant to make our lives easier and less complicated. Today, however, different smart home products use different technologies that do not always speak the same language, ultimately stalling the widespread adoption of many IoT devices. Solving this interoperability challenge begins with using connectivity technology that enables efficient switching between different wireless protocols, simplifying lighting designs while also helping to satisfy end user needs for easy installation. This multiprotocol technology is becoming increasingly important to future-proofing IoT product designs by enabling device makers to support multiple wireless options instead of betting on one protocol over another. A larger challenge yet to be resolved is driving system-level interoperability among the many divergent lighting ecosystems. To help move the industry forward, lighting manufacturers must agree to broader consolidation among competing application-layer solutions.
Bringing security to light
Security is a tough issue for any connected device, and lighting is no exception, especially since most consumer electronics makers are not in the security business. A system or solution is only as good as its weakest link. A single, unsecured connected light bulb can be a vulnerable attack surface, opening an entire connected home to malicious hacking.
Security is not simply an add-on feature at the end of a product design cycle. It must be viewed as a continuous process, from initial concept to final product to over-the-air updates. Everyone serving the smart lighting market — from chip designers to device manufacturers — must shoulder this security responsibility together. Securing IoT requires constant vigilance and ongoing development efforts across all layers of the IoT ecosystem. This even means that chips and software embedded in today’s smart bulbs must be designed with security in mind. With each high-profile attack, the fallout goes far beyond revenue loss or brand damage. The loss of consumer trust can slow the market for months or even years.
The future of lighting
Imagine coming home from work and your lights are already on before you walk through the front door. Your bedside lights automatically dim at night and then help you wake up in the morning by adjusting color and brightness. As these scenarios illustrate, lighting is a powerful technology that can enhance our daily lives in many ways while also helping us conserve energy in our homes, offices and smart cities. It is crucial that we address the interoperability, control and security aspects of smart lighting. Solving these challenges will spur technological innovation and help drive widespread adoption of other connected technologies that will contribute to the growth of IoT.
3D printing is an emerging technology that is becoming a crucial part of practically every industry: aerospace, medicine, education, civil engineering and more. Housing is fast becoming another application area leveraging 3D-printing capabilities. A number of companies are using 3D printing to make homes: PassivDom from Ukraine, Apis Cor, DUS Architects from Amsterdam and Branch Technology from Chattanooga, Tenn.
PassivDom uses a 3D-printing robot to print different parts of a small house such as walls, roof and floor. The robot is capable of printing the parts of a 380-square-foot model in about eight hours. A human worker then adds the windows, doors, electrical systems and plumbing.
Homes made using this process are solar powered and have their own electrical, plumbing and sewerage systems, making them autonomous and mobile. Solar energy powers the electrical systems, while water is drawn from the humidity in the air and purified. You can also add water to the system manually. The sewerage system of the building is likewise independent. You can order a PassivDom house online for a shade under $32,000.
PassivDom employs a three-step process to make a home using the 3D-printing technique. First, the team maps out the plan for the building. In the second step, in PassivDom factories in Ukraine and California, a large seven-axis robot prints the roof, floor and walls as per the layout. The printer utilizes carbon fibers, basalt fibers, resins, polyurethane and fiberglass to make these structures. The final step involves manually adding windows, doors, electrical systems, plumbing and sewage systems.
Make nature your home
This provides an exciting opportunity: people can now choose the location of their home, closer to nature and away from busy city life. Thanks to 3D printing, that kind of living is now affordable and practical almost anywhere.
Some enlightened people like Elon Musk do not dream of electric sheep; they dream of building human-computer hybrids. The goal of Musk’s Neuralink is to explore technology that can make direct connections between a human brain and a computer. Musk has floated the idea that humans will need a boost of computer-assisted artificial intelligence to remain competitive as our machines get smarter.
Likewise, Facebook’s uber-secretive Building 8 is working on brain-computer interfaces similar to Musk’s Neuralink, but that isn’t the only future tech Facebook has in the works. According to Michael Abrash, chief scientist at Facebook-owned Oculus Research, super augmented reality (AR) glasses could replace smartphones as the everyday computing gadget within the next five years.
A little intimidating, right?
From bring your own device to bring your own wearable
I was very skeptical about the promised benefits IT vendors sold us with the famous bring your own device trend. Many IT departments struggled to keep up with yearly technology changes while employees increasingly wanted to use their own devices to access corporate data. It was part of a growing trend dubbed BYOD, which encompassed similar bring your own technology, bring your own phone and bring your own PC initiatives. All of them evolved to empower workforces through the so-called “consumerization of IT.”
But BYOD also had a dark side: allowing it threatened IT security and put a company’s sensitive business systems at risk, while prohibiting it threatened employee satisfaction and happiness.
Have any of you improved your work-life balance because you used your own device to continue working after leaving the office? So many factors improve productivity and increase innovation; how can enterprises be sure BYOD was the cause?
In 2013, bring your own wearable (BYOW) was one of the major trends analysts and tech writers started to hype. Let’s admit that one reason this trend has not had the impact we expected was the miserable failure of Google Glass. But Google wasn’t the only giant to stumble; Apple also fell short with its Apple Watch, which many found of limited use.
Other notable failures, such as Fitbit, Nike FuelBand and Toshiba Glasses, have given a bad reputation to the still-awaited success of BYOW. If you are one of those who enjoy technological disasters, you can read about the biggest wearable tech disasters here.
Let’s be optimistic; past wearable failures need not dictate the future. Large enterprises are now using wearables. It’s hard to predict when success will come, but it will happen. Wearable technologies will be adopted en masse in the enterprise world because they will very soon be economically and financially viable. Let’s trust the component revolution in wearables reported by TechCrunch.
What will happen after BYOW
Chris Dancy is the most connected human on the planet, wired with up to 700 sensors. Chris may be the first case of an augmented human.
Augmented reality is scaling in industrial settings, and wearables are providing the human interface to industrial IoT with new and compelling results. The convergence of machine sensors, big data and artificial intelligence (AI) tied to AR smart glasses will create the augmented workers needed for this next industrial revolution.
The novel use of AR, AI, devices and wearables is enabling individuals, employees and families to easily measure vital signs anywhere and to remotely and securely send that data to their healthcare providers. New advances in these technologies (AR, AI, IoT and wearables) are demonstrating that they can aid patients and perhaps change the human condition, creating augmented patients.
An increasing number of people are voluntarily contributing the data they generate to improve public infrastructure, enhance public services and increase public safety. The “citizen sensor” trend is not just limited to activists and data enthusiasts, rather it encompasses regular, everyday citizens empowered by a new generation of connected technologies that makes it easier than ever to contribute their data for the public good. If we are convinced to share our data and we are on course to wear wearables 2.0 all day, we will become augmented citizens.
In my post, “The Future of the Internet of Olympic Games,” I predicted that in Tokyo 2020, technology and innovation will push human performance to the limit. Technologies such as wearables, IoT, artificial intelligence and virtual reality are becoming core ingredients of training and preparation for competition. In Tokyo we will see the first augmented athletes.
How far can we — and how far should we — push these technologies? We must avoid creating psychopathic or sociopathic augmented humans, but we should not stop the bring your own augmented human revolution.
For computing, security has traditionally been an endpoint game with antivirus protecting the desktop and intrusion detection on the server, monitoring for suspicious activity and alerting when it detects anything. Today, the endpoints have evolved to be a multitude of different devices connected to the internet in what we call the internet of things. They range from thermostats and security cameras to refrigerators and smartphones, and they aren’t designed to be standalone computers that can run endpoint protection software. We need a new security paradigm in the age of IoT.
For example, the Mirai botnet targeted networked devices, like IP cameras and home routers, that are running outdated versions of Linux. The malware infected the devices through security holes that result from default settings in the devices and turned thousands of them into a botnet that launched huge attacks last September and October, temporarily shutting down a number of high-profile sites like Twitter, Netflix, GitHub, Airbnb and Reddit. There are still hundreds of thousands of devices that are vulnerable to compromise and more coming on the market all the time.
Often we can’t just put antivirus or other malware-alerting software on those IoT devices though. They have scarce resources like CPU, storage and memory, and in many cases lack a typical operating system. For the end user, even if it’s running Linux, the device may not allow shell access, which would enable people to install arbitrary security software on the device. For example, with a Linux-based thermostat, you can’t just log in and set up antivirus on it and manage it. Attackers, in contrast, leverage exploits to gain shell access and install their code. When an owner does this to install their own software it’s typically called “jailbreaking” the device, and it may be an option for some scenarios. There may emerge a market wherein these jailbreaking methods are used to secure the device by their owners.
The market for IoT operating systems extends far beyond Linux, however, and includes Windows IoT, Zephyr, RIOT, Brillo, Nucleus RTOS and more — it’s truly in the explosion phase, with dominant players sure to shake out the next decade or so. Each of these operating systems presents a unique set of opportunities for security, as well as challenges. Very few of them, however, have security monitoring built in. Like anything else, IoT platform “security by obscurity” works until it doesn’t.
The consequences of failing to secure these IoT devices are great and go way beyond just botnets. With IoT, failure modes can manifest themselves in the physical world. In an enterprise we increasingly see HVAC systems connected to the internet (the compromise of an HVAC vendor is what led to the big data breach at Target four years ago). In the Target case, the attackers used the HVAC system as a conduit to get to Target’s network and customer data. Compromising an HVAC system could give attackers a way to interfere directly with an enterprise environment. Access control systems, which prompt employees to swipe a card or provide a fingerprint to get into an office, are also increasingly internet-connected. One can easily imagine malware that could lock people in or out of a building.
There are only a handful of options available today for IoT device security. In many cases you can’t run endpoint alerting tools as you can on desktop. One of the challenges is that you might get a notice that the IP address for your webcam appeared on an IP reputation list, yet you still have to track down which device it is and identify what to do next.
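The triage problem just described, matching a flagged IP back to a physical device, often starts with the DHCP lease table. Here is a toy sketch with an invented inventory; the addresses, MACs and hostnames are placeholders.

```python
# Hypothetical DHCP-lease-style inventory for the IoT subnet.
dhcp_leases = {
    "192.168.20.14": {"mac": "b8:27:eb:12:34:56", "hostname": "lobby-webcam"},
    "192.168.20.15": {"mac": "00:1a:2b:3c:4d:5e", "hostname": "hvac-controller"},
}

def identify_device(flagged_ip):
    """Map an IP flagged on a reputation list back to a named device."""
    lease = dhcp_leases.get(flagged_ip)
    if lease is None:
        return "unknown device -- check switch port / ARP tables"
    return f"{lease['hostname']} ({lease['mac']})"

print(identify_device("192.168.20.14"))
```

Keeping this mapping current (and logged over time, since leases expire) is what turns an anonymous reputation-list hit into an actionable "unplug that webcam."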
The challenge is we need to do security monitoring as close to the device as possible without doing it on the device. But how do we do that in this specialized IoT environment?
Evaluation and purchase
When reviewing IoT devices, their security should be a primary concern. Because of the different way they’re managed — often at the mercy of the original vendor and not the administrator — security review questions matter even more. Questions to ask include:
- Does the vendor offer security updates? If so, how are they delivered?
- What is the supported lifetime of an IoT device?
- What is the track record of the vendor’s device security?
- How will the vendor notify me of an update? How easily can I install it on a fleet of devices?
- What kinds of security controls does the device have? Is its baseline configuration secure? Did the vendor follow any security guidelines when building the device?
With traditional IT equipment like laptops and servers, these are valuable questions, but the administrator retains a lot of control over device security. For an IoT device, that control is often stripped away, which makes asking these questions prior to purchase significantly more important. Remember, you’re likely putting a device on the network that can lower the security of that network; treat it as you would any risk you have little control over.
When thinking about a network of high-value resources and reduced-security resources, a natural reaction is to segment the two groups. The devices also serve distinct business functions, further strengthening the argument for segmenting the traditional network from the device network. Admins should segregate business-critical IoT devices into their own network that’s accessible from neither the open internet nor even the corporate local area network. There is no good reason for the devices to be reachable by anything else. This is standard network architecture advice, but it has increased relevance with IP-enabled devices.
Segmentation should be physical if possible, with a logical segmentation as a fallback option. A physically distinct network for IoT devices will ensure they can’t work unless they’re connected to the right network and prevent accidental exposure to the open internet. A logically distinct network would utilize VLANs, network addressing, routers and firewalls to isolate devices into their own network.
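As a minimal illustration of auditing logical segmentation, a script can check whether every known IoT device address actually sits inside the dedicated IoT subnet. The subnet and addresses below are assumptions for illustration, not a real deployment:

```python
# Minimal sketch: flag IoT devices whose addresses fall outside the
# dedicated IoT VLAN subnet. The subnet and addresses are hypothetical.
import ipaddress

IOT_SUBNET = ipaddress.ip_network("10.20.0.0/24")  # assumed IoT VLAN range

def strays(device_ips, subnet=IOT_SUBNET):
    """Return device addresses that are NOT on the IoT segment."""
    return [ip for ip in device_ips
            if ipaddress.ip_address(ip) not in subnet]

# A device sitting on the corporate LAN (192.168.1.42) would be flagged:
print(strays(["10.20.0.14", "192.168.1.42", "10.20.0.15"]))  # ['192.168.1.42']
```

Running a check like this periodically catches devices that drift onto the wrong segment, which is exactly the accidental exposure physical separation would have prevented.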
The goal of this segmentation is to minimize the risk that devices with weaker security safeguards pose to the rest of the network. On their own network, if they’re attacked, these devices won’t enable pivoting to servers and other high-value resources. This network segmentation will facilitate monitoring as well.
Admins will still need the IoT equivalent of firewall logs. They need to track all traffic to and from the devices: what has been trying to talk to the devices, or worse, what servers the devices have been trying to talk to, such as a command-and-control server used for running botnets. As with any log monitoring effort, they also need to review the logs for oddities and follow up on the interesting hits.
Some management devices have a very concrete set of users and use cases, so you can lock them down. These are special-purpose devices; they’re not doing random things. Analyzing the logs for an IoT device is far easier than for a laptop or a server because the number of systems it talks to is small, so you look for deviations. For example, the only inbound connections should come from a small pool of management nodes, not the outside world, and the device itself should only initiate contact with a small handful of local systems. Similarly, the kinds of traffic it sends should fall into a narrow category, for example measurement data.
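That allow-list style of monitoring can be sketched in a few lines. The flow-record field names, peer addresses and ports below are all hypothetical:

```python
# Sketch of allow-list flow monitoring for a special-purpose IoT device:
# flag any flow whose remote peer is not an approved management node, or
# whose port falls outside the narrow expected set. All values invented.
ALLOWED_PEERS = {"10.20.0.2", "10.20.0.3"}   # hypothetical management nodes
ALLOWED_PORTS = {443, 8883}                  # e.g. HTTPS and MQTT over TLS

def suspicious_flows(flows):
    """Return flows that deviate from the device's expected traffic profile."""
    return [f for f in flows
            if f["peer"] not in ALLOWED_PEERS or f["port"] not in ALLOWED_PORTS]

flows = [
    {"peer": "10.20.0.2", "port": 8883},     # normal telemetry upload
    {"peer": "203.0.113.9", "port": 6667},   # outbound IRC -- classic C2 sign
]
print(suspicious_flows(flows))  # only the 203.0.113.9 flow is flagged
```

Because the expected profile is so narrow, almost any deviation is worth a human look, which is the opposite of the alert-fatigue problem on general-purpose endpoints.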
These are only stopgap measures. There is only so much admins can do, and consumers have basically no capability to secure their devices beyond changing the default settings and passwords to lock the devices down more. We need the industry to step forward to help solve this problem.
The National Institute of Standards and Technology (NIST) just last winter developed a framework that tries to consolidate and develop good guidelines for IoT security. There are also a handful of specialized consortiums trying to address the IoT security issues, but those are focused on specific devices or use cases and aren’t applicable more broadly.
There is also some exciting DARPA-funded research that seeks to externally monitor IoT devices by looking at their power consumption. When there’s a spike on an IoT sensor or it’s acting abnormally, that could signal a security incident. Although this research is in its early stages and not likely to show up in products any time soon, some of the underlying concepts, such as monitoring power usage or vibrations, have been around for a while.
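As a toy version of that idea (emphatically not the DARPA work itself), a simple z-score check can flag a power reading that deviates sharply from the device’s baseline. All numbers here are made up:

```python
# Toy anomaly detector: flag readings whose z-score against the window's
# mean exceeds a threshold -- the same basic idea as spotting an abnormal
# power-draw spike on a sensor. All sample data is invented.
import statistics

def spikes(readings, threshold=2.0):
    """Return indices of readings more than `threshold` std devs from the mean.

    Threshold is 2.0 because a single large outlier in a short window
    mathematically caps the achievable sample z-score.
    """
    mean = statistics.mean(readings)
    stdev = statistics.stdev(readings)
    return [i for i, r in enumerate(readings)
            if stdev and abs(r - mean) / stdev > threshold]

watts = [2.1, 2.0, 2.2, 2.1, 2.0, 9.8, 2.1, 2.2]  # one abnormal draw
print(spikes(watts))  # flags index 5
```

A real side-channel monitor would be far more sophisticated (rolling baselines, per-device models, frequency-domain features), but the core signal is the same: the device’s physical behavior departed from its norm.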
Eventually, we will see IoT cybersecurity solutions from vendors, such as an IoT firewall. Vendors will have a pretty good set of guidelines for a device to be secure, either based on the NIST framework or those other third-party groups such as the Cloud Security Alliance and the IoT Security Foundation. But the complexity of setting up the network will still be a huge challenge for many admins and most consumers. We can’t have default passwords or allow certain devices access to the internet. We will need a security paradigm with a combination of special user and administration policies geared just for IoT.
When I think about the industrial internet of things, just like pretty much everyone else, I get excited about its huge potential to transform our world through new efficiencies, reduced risk and enabling entirely new business models. However, I must admit that the second thought that springs into my mind is a picture of spaghetti junction. If you’re designing, developing and deploying IoT solutions, you know exactly what I mean. It’s like that massive highway interchange with so many twists and turns that it seems way too confusing simply to get from point A to point B. The sheer volume of fragmented IoT connectivity protocols (both standard and proprietary), or “protocol soup,” as I also like to call it, is one of the most frustrating challenges in realizing the clear benefits from deploying IoT solutions.
An inherently heterogeneous market
IoT is inherently heterogeneous — a growing collection of technologies rooted in embedded systems and machine-to-machine communications across countless verticals and use cases. It’s a myriad of hardware types, operating systems and development tools, not to mention a plethora of connectivity standards, many of which are dictated by existing installations that require a gateway to bridge data from sensors and machines to a broader network for analytics-driven ROI. This diversity provides incredible richness but also huge complexity to contend with.
Fragmentation is hindering adoption
In today’s market, selecting technologies and developing an IIoT solution that can quickly deliver ROI can be so complex that it becomes paralyzing. The current fragmented landscape is confusing and has resulted in a patchwork quilt of custom solutions that’s slowing down the overall rate of adoption and general growth of the industry. Ultimately, this is likely to stifle innovation.
Unifying the village
So how do we get to a common center of gravity that allows developers to quickly and easily deploy working industrial IoT solutions while still enabling hardware, software and services providers to differentiate and monetize their value-add? At Dell, we’ve always been big believers in openness, choice and driving standards; we’re members of several IoT alliance/standards activities like the OpenFog Consortium, Industrial Internet Consortium and the OPC Foundation.
These organizations are doing important work to promote reference architectures, facilitate standardization and generally make the solution developer’s job easier. However, as much as we should all be focused on narrowing in on a more manageable collection of standards, the practical reality is that the IoT market is way too complex for there to ever be one standard to rule them all. We therefore need to find a way to help IoT-relevant standards, hardware, operating systems and development tools work together.
Making sense of the spaghetti
Back in 2015, we began to think about how best to resolve the problem of rendering all of these fragmented solution ingredients more interoperable. Our take was that in order to speed up market adoption we needed to address key interoperability challenges at the edge of the network, where data flows “north, south, east and west” between both standard and proprietary protocols and applications in an intertwined, distributed IoT fog architecture. Due to the aforementioned spaghetti, the edge is where most of the key challenges in IoT are today.
The answer: An open source platform for edge computing
Fast forward two years to the formation of the EdgeX Foundry Project, hosted by the Linux Foundation and backed by over 50 member organizations. The charter of this vendor-neutral, open source project is to deliver a flexible, industrial-grade edge software platform that can quickly and securely deliver interoperability between things, applications and services across a wide range of IoT use cases. The platform leverages a loosely coupled microservices architecture and interoperability foundation that comprehends both IP and non-IP based connectivity and is surrounded by reference services that can be easily replaced with preferred alternatives.
Reducing the need to reinvent the fundamentals
It’s important to note that this is not a new standard. There are plenty of great ones already in existence. EdgeX is an industrial-grade software framework that’s purposely architected to be deployed on distributed edge nodes including embedded PCs, gateways and servers. This helps unify existing standards with plug-and-play commercial value-add such as analytics, security and system management tools, and services. The primary goal is to reduce the need to reinvent the fundamentals while enabling technology providers and end customers alike to focus on value-added differentiation.
Big markets are built on interoperability. It’s in everyone’s interest for companies to offer plug-and-play solutions that can be easily combined to create secure, scalable solutions. Together, let’s steer away from spaghetti junction and get moving!
IT leaders are looking in the wrong places for IoT talent: The solution involves a platform approach
Many IT organizations face growing demand from the business to build new, innovative connected experiences. The goal of these IoT applications is to create new products and services, provide better customer experiences and increase operational efficiency. However, this demand can seem lofty for many IT leaders who are grasping at thin air for scarce, hard-to-find talent to get these projects over the finish line.
Delivering IoT solutions requires specialist skills to configure, implement, integrate and manage a mix of different and complex IoT technologies, endpoints, platforms, back-end systems and data. This nascent market also opens new roles that need to be filled.
According to Gartner, the top barrier to CIO success is skills and resources. Through 2021, market demand for app development will grow at least five times faster than IT capacity to deliver it. With the high demand for innovative IoT solutions, IT leaders need a new approach to find the right talent.
Take a platform approach to IoT app development
Instead of looking for specialized IoT talent, CIOs should consider how emerging technologies can help alleviate the talent challenge. With the combination of an IoT software platform and a high productivity application platform as a service, the technical complexity of building IoT applications can be abstracted.
IoT software platforms help connect all the technology required for an IoT solution. Some IoT software platforms and their IoT components to consider include IBM Bluemix with IBM Watson IoT, Amazon with AWS IoT, Microsoft Azure IoT Hub, GE Predix and SAP Cloud IoT.
These IoT software platforms provide a wide array of functionality that minimizes the complexity of connecting, securing and managing devices, and analyzing sensor data. But one area where they lack capabilities is application development — a key requirement for making IoT insights actionable to power smarter operations and customer experiences.
As a second abstraction layer, organizations should consider adopting a high productivity platform that is uniquely suited to address these challenges.
A high productivity platform can help your organization close the talent gap by enabling a much broader range of developers to build IoT applications using visual models, reusable components and connectors to IoT services. By eliminating the need to code IoT applications, your organization can utilize not only professional developers, but also developers with hybrid business-IT skill sets, like business analysts, to build IoT solutions.
Find talent with the right mentality for IoT projects
With the adoption of these platforms, the talent you acquire doesn’t need to have specific IoT experience and specialized skills. Look for resources that fit into the following roles:
Tech-savvy business analyst — This business expert doesn’t need to know multi-tier development, but can lay out screens, rapidly build prototypes, build logic flows and model data.
Business-savvy developer — This individual has a good understanding of the tiers of software development (database, middleware, business logic, UI) and the software development lifecycle (requirements, developing, testing, releasing), but does not need to be a proficient coder. She has a proficient understanding of business requirements and can envision the big picture when delivering software projects.
Integration/extension developer — Not all projects will need this role, so you won’t need as many of them. This person can:
- Build widgets that other developers can use to build user interfaces in the presentation tier of the app.
- Build Java-based custom components that can be used within visual flows that extend functionality in a way that can be used by non-developers.
- Integrate with web APIs, third-party systems and ERP systems. This is done by writing extension components or integrations in Java and creating a connector that can be used by non-coding developers.
Vanguard architect — This individual is a strategic thinker who understands the “why” of technical decisions and how new technologies can be applied to deliver business value. The vanguard architect is also human-centric and possesses the behavioral skills required to influence people, build trust and lead the organization through disruptive change.
With the ability to utilize a wider range of people, you can more easily fill your talent pool and ramp up these individuals to rapidly build IoT applications. The next step is to structure your team to bring your first application to completion.
As a homeowner, navigating the mass of smart device options is intimidating. The sheer volume of existing products, newly announced gadgets that have not been released yet (but are coming soon!), and the seemingly unending list of new options can make it tough to make any decision — never mind the basics like cost and availability.
So what criteria should you consider when deciding to pull the trigger or not on a smart home device? I always like to consider the following questions when it comes to technology and the home:
What is it that you really want to happen?
What challenge, problem or opportunity are you really trying to address with your purchase? If you can’t answer that question, stop right there. You are probably just bored, and it will pass — unless, of course, you would like to ultimately add to the treasures hidden in your connected-gadget junk drawer.
Most — if not all — connected products are designed with very specific use cases in mind. When you try to stretch those uses, things can stop working and the frustration begins. The best advice here is to buy the device based on what it is specifically designed to do, not what you hope it can do. In some cases with over-the-air (OTA) updates the products will improve over time. I’ll touch on that below.
Is it a secure product?
Does it come with the aforementioned ability to take OTA updates? What does the vendor do with your data — and what do you get in return for sharing it? Can you raise the level of security by adding your own passwords or authentication choices? These are table stakes for a dependable product. Don’t compromise.
How much time and money am I willing to invest in this?
Connected tech can be expensive. In addition to the initial purchase price, does it involve a subscription or ongoing cost? Is this purchase based on something that you are interested in playing with? Or is it something that you will depend on in your residence for the long term? If it is a hobby purchase, manage your spend and understand the path forward. Otherwise focus on the next question below.
Am I depending on it?
If you are considering a device that you will depend on long term, know the market players and manufacturers involved. Are they experts in the area that you are considering? You hear the phrase “hardware is hard” all too often in the connected space these days, as players who understand software try to deliver things outside their core competency. For example, in my company’s category we have been making locks for nearly 100 years, and there are industry standards for everything from fire ratings to sledgehammer tests. We have robust customer and product support capabilities. We aren’t going away anytime soon, and it is a pretty good bet that we will continue to improve the product and grow around it. In other words, a smart bet.
What about working with everything else?
“Integrations” (the fancy word for the ability to work with other connected stuff) are fickle. Some are better than others, and again you should consider the market players and manufacturers involved. Integrations aren’t always useful and, as a matter of fact, can be downright annoying — so head back to the first question, “what is it that you really want to happen?”
There are a lot of exciting new devices and technologies out there to choose from. Leverage these simple questions to help you get the most out of your smart home device investments of time and money!