Today’s competitive world of embedded systems and the internet of things places ever greater demands on developers. They must produce products that optimize size, weight and power, with focused feature sets that remain flexible enough to adapt quickly to changing customer requirements. In terms of processors, this means a family built around a common, high-performance, low-power core, offered as a series with a range of on-chip functions and peripherals that present a software-compatible platform. This newer generation of processors tends to be organized into families that share a common instruction set and scale from small to large applications, with market-leading amounts of I/O, on-chip flash and RAM.
Matching a real-time operating system (RTOS) to such a processor family requires that it scale smoothly along the power and performance range of the processor line. Ideally it should present a familiar, standard API such as POSIX, the IEEE standard that Linux also largely implements, easing code portability. It must combine a compact set of core characteristics with a wide selection of tested, proven functional software modules that can be quickly integrated to match the application, the selected processor and the mix of on-chip and off-chip peripherals needed. This, of course, means a wide selection of such functional modules.
Many vendors now offer integrated development kits that include a single-board CPU with an integrated RTOS along with peripherals and tools to help the developer get started immediately adding innovative value. Such an integrated platform supports eight principles required for fast and efficient IoT development:
- Lean
- Modular
- Adaptable
- Secure
- Safe
- Connected
- Complete
- Cutting edge
These features not only help guide focused design and meet time-to-market demands; they also ensure the functional capabilities needed to work effectively in the internet of things.
The platform’s rich modularity supports the lean development model by providing standardization, interchangeability of drivers, protocols and service modules, and portability of applications. This lets developers quickly adapt to changing customer requirements in the midst of a project. The integrated platform optimizes both hardware and software design on the go.
Those same features make it adaptable — able to meet new market demands for features and functionality. Once a product is in place with a customer, the OEM must be able to quickly react to calls for additional features and expanded functionality — or even a smaller, lower-cost version of a product. Existing code can be moved to a higher performance processor and new features quickly added without serious revision of existing code.
Security and safety go hand-in-hand and must be designed in from the ground up. If it can be hacked, it isn’t safe. Security begins with a secure initial design and extends through communication protocols, authentication strategies such as passwords, electronic keys and physical recognition, secure booting, encryption and more. The judicious selection of the basic system architecture, hardware and software is also a key requirement.
Two main features help ensure safety in systems. Determinism guarantees quick response to threatening conditions and makes system behavior predictable, so it can be reliably tested against strict timing requirements. Emergency stop with zero boot time means a device can be halted instantly and restarted immediately if required, so an unsafe condition can be stopped at once and the device returned to a safe state or diverted to an action that deals with the emergency.
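The emergency-stop pattern described above can be sketched in plain Python. This is only a conceptual illustration, not RTOS code: a real system would use a hardware interrupt and fixed-priority tasks, and all names here are illustrative.

```python
import threading


class Controller:
    """Toy control loop with an emergency-stop flag checked every cycle."""

    def __init__(self):
        self.estop = threading.Event()  # set() means emergency stop requested
        self.cycles = 0

    def trigger_estop(self):
        # In a real system this would be driven by a hardware interrupt.
        self.estop.set()

    def run(self, max_cycles=1000):
        while self.cycles < max_cycles:
            if self.estop.is_set():
                # Halt immediately, leaving actuators in a safe state.
                return "halted"
            self.cycles += 1  # one bounded, deterministic work unit
        return "done"


ctrl = Controller()
ctrl.trigger_estop()   # request a stop before the loop starts
print(ctrl.run())      # prints "halted" without completing a single cycle
```

Because the flag is checked once per bounded cycle, the worst-case latency between the stop request and the halt is one cycle time, which is the kind of property determinism makes testable.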
The internet of things is connected by definition and must accommodate a very broad range of sensors, both wired and wireless, which means supporting the full range of wired and wireless connectivity. An RTOS is complete when virtually any connectivity solution can be selected off the shelf and integrated into the design, so that it contains everything a project is likely to need as it evolves.
Among the supplied protocols should be those used on the cloud side to connect easily and securely, transfer data and process commands. This requires up-to-date components and tools, along with the ability to work with leading-edge platforms such as Microsoft Azure. It also means the best development tools: a wide selection of IDE offerings, plus advanced support tools for using and tracking memory and objects, such as dials, gauges and charts for variable displays, and timing tools and displays for understanding scheduling, interrupt processing and more.
The incorporation of these latest tools, protocols and technologies and their availability across a matched RTOS and processor family makes this a truly cutting edge platform.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.
Industries of all kinds face governmental regulation. It’s the price of doing business in civilized societies. But how businesses approach regulations has enormous bearing on organizational success. Earlier this month, we got a stark reminder of the huge gulf between acting “by the book” and, well, something more, when United Airlines forcibly removed a paying passenger from a flight leaving Chicago. The airline’s choices in this case earned it a PR drubbing and a serious erosion of customer trust, with more pain likely to come.
Air travel is a highly regulated consumer service industry. A range of options is always available when considering how to handle the challenges of trying to fit crew members onto an already full flight. Speaking of “having to re-accommodate” customers reflects a by-the-book, compliance-oriented mindset. If that’s one end of the range, what’s the other end? Seeking to build trusted relationships with customers and end users of your product by offering them transparency and control.
What does dragging and dropping passengers have to do with privacy?
What does all this have to do with the EU General Data Protection Regulation — or the internet of things? Good question. It’s clear that IoT is leading to an explosion of business opportunities. But companies attempting to develop their own IoT solutions from soup to nuts are quickly learning that providing safe, secure, privacy-sensitive interactions is extraordinarily difficult. At the same time, as May 25, 2018, the date of GDPR implementation, approaches, this far-reaching regulation looks likely to be the central governing framework for consumer-oriented companies pursuing IoT business models across the globe. If your organization works with the personal information of anyone in the EU, whether you’re based there or not, GDPR applies to you.
What are the regulatory objectives of GDPR? Data privacy with choice and control: strengthening the exercise of fundamental privacy rights of individuals and putting users back in control of their personal data.
The lesson here for any organization looking toward realigning its privacy practices for May 2018 couldn’t be clearer: Never miss out on an opportunity to create a trusted digital relationship. Too often organizations fall into process-oriented thinking: “Technically speaking we’re in compliance here, so it’s all good.” You fall into this trap at your peril!
So what to do? Take these steps to make progress in your IoT data privacy journey and get ready for your GDPR close-up with actual users.
Step 1: Identify where digital transformation opportunities and user trust risks intersect
We know IoT is driving all the interesting business opportunities, from connected clothing and athletic gear to breakthrough smart health devices. But where have data flows lain fallow because it’s impossible to build them securely and in a compliant fashion? These unacceptable trust risks can potentially be made acceptable if the right stakeholders from the privacy and business professional sides of the house can work together. You can work to bring these dark data relationships into the light once you know what they are.
Step 2: Conceive of personal data as a joint asset
Often businesses — or at least their marketing departments — become quite proprietary about the personal data they collect from consumers. However, in the GDPR era, that’s simply not a useful mindset. Thinking of users’ personal data as something you both have a stake in sets you up for success. It puts you into your users’ shoes, which is always useful because on another day, for another product or service, you yourself are “just another user.” It’s also good for compliance, since regulations do tend to change and grow (new GDPR guidance documents are coming out at a rapid clip lately).
Step 3: Lean in to consent
GDPR defines six legal bases for processing personal data. One of them is consent; when used, it gives an organization various information management freedoms and responsibilities, and crucially, it carries user trust implications. Other bases, like “in the exercise of official authority vested in the controller,” essentially tie the organization’s hands, while still others, like “necessary for the purposes of legitimate interests pursued by the controller or a third party,” could be used for trust-destroying mischief.
Step 4: Take advantage of identity and access management for building trust
The IoT world seems to be quickly picking up on a lesson that the web and API worlds took a lot longer to learn: Adding security and privacy features is a lot harder if you don’t have a means of checking for authenticated identities and then authorizing their access. Identity and access management infrastructure has a great deal to offer toward building trusted digital relationships.
Think of building trust in layers of identity support. The data protection layer lays the groundwork for your organization’s trustworthiness against security breaches; it includes identity data governance and building a single view of the customer across what may be many individual smart devices and applications. The data transparency layer ensures all these devices and apps have proper terms of service, privacy notices, federated connections to other systems and provable consent; this is about giving users a single view of their consents. The data control layer ensures users can proactively start, monitor and stop sharing as they see fit at a fine grain — for example, deciding to give access to insulin pump data, or even to the pump’s control functions, to a doctor or caregiver.
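The data control layer described above can be modeled as a small consent registry. This is a hypothetical sketch, not any specific identity and access management product; the scope names and API are invented for illustration.

```python
class ConsentRegistry:
    """Minimal sketch of fine-grained, revocable consent (the 'data control' layer)."""

    def __init__(self):
        # (user, grantee) -> set of consented scopes, e.g. {"pump:data"}
        self._grants = {}

    def grant(self, user, grantee, scope):
        self._grants.setdefault((user, grantee), set()).add(scope)

    def revoke(self, user, grantee, scope):
        # The user can stop sharing at any time, at scope granularity.
        self._grants.get((user, grantee), set()).discard(scope)

    def is_allowed(self, user, grantee, scope):
        return scope in self._grants.get((user, grantee), set())

    def consents_for(self, user):
        # A single view of one user's consents (the 'data transparency' layer).
        return {k: set(v) for k, v in self._grants.items() if k[0] == user}


reg = ConsentRegistry()
reg.grant("alice", "dr_bob", "pump:data")        # share insulin pump readings
assert reg.is_allowed("alice", "dr_bob", "pump:data")
assert not reg.is_allowed("alice", "dr_bob", "pump:control")  # control not shared
reg.revoke("alice", "dr_bob", "pump:data")       # user withdraws consent
assert not reg.is_allowed("alice", "dr_bob", "pump:data")
```

The point of the sketch is the separation of concerns: data sharing is off by default, each grant names a user, a recipient and a scope, and revocation is as first-class an operation as granting.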
Trusted digital relationships with users are yours to lose
The GDPR is intended to be one of the most contemporary regulations in a long time. That’s lucky for all of us, since the internet of things is one of the fastest-moving technology and business spaces in a long time.
To prepare for the GDPR, organizations need to go beyond data protection and embrace data transparency and data control. The choices you make about your customers’ data increasingly reflect on not just your data protection officer’s actions but your entire business model. Addressing user trust risks is certainly something you can do something about; the more important question might be whether you can afford not to.
The internet of things has reached mainstream status, as evidenced by corporate managers and business unit heads from established businesses (as distinct from startups) responding to its strategic implications.
All too often, however, their approach is technology led. It is reminiscent of the way organizations reacted to the commercial internet some 20 years ago. Today’s preoccupation is with connectivity and selecting the “right” technology from different wireless alternatives. To some extent, technology suppliers are inflating the merits of different approaches. First-mover, proprietary offerings (LoRa, Ingenu, SigFox) are trying to advance their solutions relative to the lumbering response of the mobile industry (the LTE-M and NB-IoT family of technologies).
The focus on technology is a missed opportunity for businesses to leapfrog to new sources of innovation and business strategy. How much more might businesses achieve by working back from commercialization and innovation ideas to define their technology needs?
Consider how many businesses develop their IoT solutions at present. The process begins by finding some way to “test the waters.” This involves adding connectivity to an existing product or sensor and using the data to enable an IoT application. Many organizations find value simply from visualizing the data that their products report back. Often, this triggers some interesting insights (e.g., product usage patterns) which can lead to a follow-up action (e.g., to introduce product design enhancements, to adjust maintenance procedures, to interact with the end user etc.).
With a successful proof of concept, an organization would then roll out its solution operationally, taking the first steps to “build IoT into the business.” This involves modifying operational procedures and introducing new technologies, such as an IoT platform, to manage and support a population of connected devices. An example of a self-contained IoT solution in the smart city context is a “smart” streetlight solution that dynamically adjusts lighting levels, reducing energy consumption and maintenance costs.
Over time and with greater experience, many organizations wonder, “What else can IoT do?” They may decide to support multiple IoT applications. Or they may develop new applications and improve the performance of existing ones by integrating third-party data. Interoperating across multiple (siloed) applications exposes yet another set of possibilities.
Eventually, businesses start to think about ecosystems (in closed and open variants), which are the basis for collaboration with partners to make their connected devices accessible to external IoT applications. Ecosystems, often structured as multisided business models, also expose new prospects to monetize data through innovative application opportunities.
This bottom-up experimentation, from device to multisided applications, tends to be highly tactical. In going through the process of “testing the waters,” an organization is likely to tie itself to a single-technology, proprietary approach through a narrowly scoped in-house development project or by partnering with an off-the-shelf vendor solution. Investment outlay would be an overriding consideration. At the point of operational deployment, its IoT platform would contain the bare minimum of elements needed to deploy and manage a population of connected devices. Typically, these would include software-based tools to manage service activation, connectivity service quality, remote-device profiles, application development and the user interface. And finally, any expansion of the initial application would rely on systems integration investments to graft on new features.
Organizations that recognize the full potential of IoT take a different approach. They build on a long-term vision, deploying investment resources intelligently to anticipate future needs. By prioritizing value creation, to the right-hand side of the illustration, organizations can explore new ways of doing business based on the many different commercial concepts that IoT can enable. Examples include better use of data along the supply chain and data sharing with non-traditional business partners, including those from other business sectors, to enable new services and revenue streams. In practical terms, this implies the need for common data models and technologies to enforce commercial and legal policies around data ownership and monetization.
Working back, the strategic approach to “building IoT into the business” favors an IoT platform whose capabilities are designed to be extended over time. This means having the ability to:
- Accommodate a growing number of sensor types;
- Function as a data exchange that can handle multiple data feeds (own and third-party sources); and
- Support new business models (e.g., alternative pricing schemes, sharing and monetization of IoT data with multiple partners via configurable commercial and privacy policies).
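The extensibility requirements above can be read as a plug-in architecture: new sensor types and third-party feeds register against a common data model instead of being grafted on through systems integration. A minimal sketch follows; all names and the record format are illustrative assumptions, not any particular platform’s API.

```python
class DataExchange:
    """Toy IoT data exchange: pluggable feeds normalized to one data model."""

    def __init__(self):
        self._feeds = {}  # feed name -> callable returning {metric: value}

    def register_feed(self, name, reader):
        # New sensor types or third-party sources plug in here without
        # changing any existing code.
        self._feeds[name] = reader

    def collect(self):
        # Normalize every feed to (feed, metric, value) records so that
        # downstream applications see one common data model.
        records = []
        for name, reader in self._feeds.items():
            for metric, value in reader().items():
                records.append((name, metric, value))
        return records


exchange = DataExchange()
exchange.register_feed("soil_sensor", lambda: {"moisture": 0.31})   # own source
exchange.register_feed("weather_api", lambda: {"temp_c": 18.5})     # third party
print(sorted(exchange.collect()))
# [('soil_sensor', 'moisture', 0.31), ('weather_api', 'temp_c', 18.5)]
```

Because producers and consumers only agree on the record shape, a growing number of sensor types, or a new commercial partner’s feed, is one `register_feed` call rather than a re-architecture.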
Applying this perspective means that the choice of technologies for an early application is influenced less by pure cost considerations and more by total cost of ownership and by technology-neutral, open-standard factors.
With IoT becoming a fixture on the competitive landscape, very few organizations will escape its influence or the resulting impact on business operations, product roadmaps and technology investment plans.
Businesses can choose to invest for the long term, applying lessons from the commercial internet era. They can focus less on connectivity and more on preparing for new service concepts to capitalize on their connected products. The risk of acting tactically is to under-leverage investments and overlook chances to capitalize on innovative business opportunities.
Most people have heard of the internet of things by now, but few are familiar with the possibilities this platform opens for businesses and consumers alike. The platform, which has long been associated with the smart home, offers real and tangible value beyond the cool factor. Consumers and businesses often fail to recognize that the internet of things is a key component of the daily operations of companies in several industries, such as fitness, agriculture and fashion.
A sensor alone does not make a product or technology part of the “internet of things”; it’s all about creating a holistic experience for the end user. It’s time consumers and businesses drop this notion and realize the positive impacts this technology has or could have. From the clothes we wear to the food we eat, IoT is creating new conveniences and helping us maximize our resources every day.
Beyond fitness trackers, IoT taps into equipment
Most people are familiar with wearables that track fitness like Fitbit. IoT is also taking exercise machines a step further into the future by creating realistic and tangible user experiences. The Peloton Cycle Exercise Bike, for example, comes with a screen that allows you to connect to real-life classes hosted by fitness trainers, giving users access to world-class instructors and allowing them to compete with other users from the comfort of home. With the busy schedules many of us keep, it’s easy for self-care to fall to the bottom of the priority list. Connected fitness machines make it easier to prioritize fitness by saving time spent traveling to the gym and encouraging healthy competition between users all over the world.
Hotels are even catching on, providing Peloton bikes within the room so guests can get their exercise without heading down to the gym. This is a smart idea in that it differentiates those hotels and draws guests away from the competition by offering a premium experience. The bike’s ability to import data from your wearables means you have access to the latest health metrics whether you’re using a hotel’s bike or your own. And perhaps best of all, when Peloton added tech, it didn’t sacrifice good design. The bike is one of the most attractive fitness machines on the market, and sturdily made so it can handle intense exercise sessions.
IoT provides solutions to farmers and harvesters
IoT is even helping to maximize our agricultural resources. Farmers face immense challenges in creating a steady crop yield year over year — unpredictable weather, water shortages, and limited availability of land can impact the yield significantly. Brilliant IoT innovators have been focusing their expertise in this industry for years to create some pretty amazing resources to help solve those problems, increasing the quality, quantity and sustainability of agricultural production.
Large farms can now use remote sensors to monitor the amount of moisture in the soil and even the amount of growth the crop has achieved. Harvesters can now be controlled from anywhere in the world, provided there’s an internet connection available. Inventors have even begun using artificial intelligence to analyze metrics about the crop and even weather patterns to make predictions about the future, taking some of the guesswork out of a traditionally unpredictable industry. This is not just great news for farmers — it’s great news for everyone because it increases crop yields to make food more widely available.
Fashion meets technology
The fashion world is even beginning to catch onto the many benefits of IoT. Most people today appreciate the ability to customize; we want our clothes, our homes, our cars, everything we own to not only behave in ways that make the day go smoothly and conveniently, but to reflect who we are as people. We live in an increasingly connected world in which we interact with new people more frequently. First impressions are more important than ever, so the ability to express who you are through the clothes you wear has become equally important.
Services like NikeID allow you to customize your product (in its case, shoes) from the ground up. You start with a blank canvas, choose the basic structure of the shoe based on the functionality you want, and then you get to customize the fun stuff — materials used, colors and graphic prints. You can even choose how the bottom of the shoe looks. Another service called Trunk Club behaves as a personal fashion assistant, a luxury typically reserved for the rich and famous that’s becoming available to all thanks to IoT. The service shows you several style combinations and asks for your opinion, building a profile of your tastes that allows it to make suggestions based on what it knows you like. It even seeks out the clothes that are best for your body type, as well as those that fit within your budget, so you aren’t pressed into spending more than you normally would.
The reason these innovations work is that they create a holistic experience for the end user. It’s not about the latest gadget or simply adding sensors to everyday items — it’s about integrating technology into your lifestyle in a way that makes sense and creates convenience and efficiency. It’s about creating a beautiful user interface, in terms of both usability and design. It’s about providing you with the essential knowledge you need in order to be as successful as you can. In its simplest terms, IoT is about making life better, and the opportunities to do that are boundless. Every industry on the planet, and even the planet itself, stands to benefit from technology used in this way.
Microservices are emerging as a preferred way to create enterprise applications. As with mobile app development five years ago, a lack of expertise can slow some companies’ adoption. However, with IoT development on the rise, it’s inevitable that microservices will become the architecture of choice for developers, today and tomorrow.
Although the approach has drawn criticism for not fitting certain DevOps cultures, microservices are increasingly adopted and gaining fans across numerous industries. Large-scale online services like Amazon, Netflix and Twitter have all evolved from monolithic technology stacks to microservices-driven architectures, which allowed them to scale to their current size. Microservices are ideal for supporting a range of platforms and devices spanning web, mobile and IoT (including wearables). When developing for IoT specifically, here are five considerations for why the future is bright with microservices:
- Lower cost: IoT sensors and devices are fairly affordable today. That said, it is almost always more cost effective to roll out hundreds of small sensors that each do one thing really well instead of opting for fewer, but more powerful and more pricey options. One big reason for this is that no matter the device, in just a few years, most will become “obsolete” or superseded by more sophisticated, more cost-effective alternatives. The beauty of going with the simpler hardware is that you can rely on microservices to add value and fill functional gaps. You can also gradually roll out the network and continue to upgrade and maintain it in a cost-effective manner as individual components get replaced. Done right, this means you’re never in a position where you have to replace an entire monolithic system in one go.
- Faster innovation: The world of IoT deployments is generally still very much in beta. Although there are already billions of cool and useful devices deployed, we’re still only scratching the surface when it comes to unlocking their full potential. A microservices development approach allows you to unlock innovation (and thus value) faster by making it easy to test new combinations of “things” and “services.” No need to build an entire technology stack or invest in big infrastructure over many months. With microservices, you can tinker and test to your heart’s content and quickly reap the benefits of innovative solutions to your problems.
- Isolated risk: Assembling your solution via microservices allows you to adjust and iterate quickly, thus avoiding the danger of missing the mark. You can do this without having to re-architect your entire system or IT environment. Most mobile and web application developers have already found great success in applying agile development. Developing for IoT, it’s unlikely you’ll be able to build a full feature on top of a device in just a week. However, by focusing on building microservices in one to two week sprints, you can keep moving towards the finish line and connect all the APIs you need, one by one, with dramatically lowered risk.
- Flexibility and agility: Another major benefit of leveraging microservices is that if, after testing, you determine that a particular service isn’t working out, you can replace it with something better or more tailored to your needs. A microservices approach to development and integration allows you to build a feature quickly and improve on it over time. When it’s ready to be replaced, you’re just updating one piece of the puzzle without having to worry about impacting the rest of the picture.
- Unlimited value-add: The device you deploy is never going to transcend its physical capabilities until you upgrade or replace its physical components. The digital upgrades you can provide via constantly evolving microservices, however, are unlimited both in their scope and their frequency. A camera may be designed to only capture 2D images, but depending on the third-party service it’s linked to, it might provide you with statistical traffic information, queue sizes or weather information.
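The “replace one piece without touching the rest of the picture” property above can be sketched with a simple service registry. In production this role is played by service discovery and an API gateway; the names and handlers below are illustrative assumptions only.

```python
class ServiceRegistry:
    """Toy service registry: each service does one thing and is swappable."""

    def __init__(self):
        self._services = {}

    def register(self, name, handler):
        # Re-registering a name swaps in a new implementation without
        # touching any other service or any caller.
        self._services[name] = handler

    def call(self, name, payload):
        return self._services[name](payload)


registry = ServiceRegistry()
# A 2D camera feed backed by a basic image service.
registry.register("image", lambda img: {"kind": "2d", "size": len(img)})

# Later, swap in a richer third-party service; callers are unchanged,
# yet the same camera now also yields a traffic-queue estimate.
registry.register("image", lambda img: {"kind": "2d", "size": len(img),
                                        "queue_estimate": 3})

result = registry.call("image", b"\x00\x01")
print(result["queue_estimate"])  # prints 3
```

This is the digital upgrade path the last bullet describes: the camera hardware never changed, but the service bound to it did.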
Soon enough, it will become hard to remember the time when enterprises did not by default turn to microservices. With the rise of IoT, there’s a perfect storm brewing that will push microservices into new and traditional industries. The benefits are high, the risk is low and the savings in terms of cost and resources make this one a no-brainer.
Whether you’re a hockey fan or just a business person, you have likely heard the Wayne Gretzky quote, “Skate to where the puck is going, not where it has been,” and in the automotive sector of IoT there is no better example than Delphi’s recent investments in high-tech companies. My sources in the auto sector tell me that Delphi has been playing catch-up in areas like cloud, SaaS, apps and big data; until recently, it was behind rivals like Harman, Bosch and Visteon in many of these growth areas. If we consider where the puck is today, I would argue that the connected car is focused on telephony, telematics, apps and infotainment, with a healthy dose of SOTA/FOTA (software and firmware over-the-air updating). On that subject, Delphi’s recent acquisition of Movimento, a leading SOTA/FOTA provider, was a smart move to solidify its offering around today’s automotive challenges, of which updating cars remotely rather than in the dealership is definitely one. Recently, however, Delphi did what all good hockey players do: it lifted its head, declined to chase the puck to where it is in 2017, predicted where it will end up in 2018, 2019 and beyond, and invested to meet it there. So where exactly is the puck heading?
In automotive, and in IoT in general, it will all be about the data. Data will be the new currency, with vehicles a means to an end for collecting valuable data, and everyone in the value chain scrambling to monetize the data vehicles generate. Vehicles will become what today’s Android smartphone long since became: a data generator for the likes of Google to benefit from. Those of us who use Android are prepared to give up some personal privacy in order to get terrific free services like Google Traffic. Today, automakers and their tier 1 vendors are fighting to keep Google and Apple from taking over the dashboard with initiatives like SDL (SmartDeviceLink) for app mirroring, and in the future they will need to do the same to stop Google and Apple from owning the car’s data. Today’s car does generate data, such as the outside temperature, whether the car is skidding on ice, and driver behavior, but very little of it is even stored locally, let alone sent anywhere for processing. That will all change when each car sends millions of packets of data monthly, giving terrific insight into everything from macro factors such as traffic conditions down to individual driver habits and traits. In 2017, the likes of Delphi are figuring out how to pull that data from the vehicle and where to send it, but in 2018 and 2019 the puck will be somewhere else entirely, so Delphi is skating toward a world where the data is crunched, analyzed, brokered and monetized. We can imagine a single vehicle and driver contributing data to traffic services, transit operators, smart cities, insurance companies, retailers, weather forecasters, media companies, dealerships and more.
The value of the data generated depends on the recipient; for example, traffic patterns or parking availability is only valuable in aggregate, whereas a driver’s routes and associated buying habits are extremely valuable to retailers wishing to attract that driver to their stores.
From all of that we think we can see where the puck will end up, so what exactly did Delphi do in April 2017? It invested in Otonomo and Valens, two companies that will help Delphi skate faster. Otonomo’s technology aggregates data generated as cars are driven and sends it to automakers, who can then use that data to sell services to the owner of the car; the company cites examples including emergency assistance and location-based advertising. Valens specializes in technology that helps companies move high volumes of data around the vehicle, delivering chip-to-chip connectivity at speeds six times faster than today’s industry standard.
In an article in the Detroit Free Press, Delphi Chief Technical Officer Glen De Vos was quoted as saying that one of the biggest emerging challenges for automakers developing connected cars is the ability to move and transmit the data required for autonomous cars to operate. “Delphi’s vision is to enable the brain and nerve center of the vehicle through our leadership in computing and complete electrical architecture solutions,” De Vos said. “We have selected world leaders in three areas: high-speed data transmission, connectivity and data management.”
In the same article, Mike Ramsey, a research director at Gartner, gave Delphi credit for investing in companies that will help automakers solve some of the toughest issues facing self-driving vehicles. “All these companies have a role that helps to tame the tyranny of data,” Ramsey said. “What I like about what Delphi is doing is, more than any other company I have seen, they are putting together the right pieces.”
The automotive sector is undergoing a massive wave of digital transformation. If the car of the future is merely a big IoT sensor on wheels, and the connected car sector a microcosm of the wider connected IoT space, then Delphi’s strategy is a good indication of how many other “old school” companies will have to figure out where the puck is heading, and how they can skate there before the opposing team does, in order to capitalize on the massive opportunity presented by the collection, dissemination and monetization of the data their devices generate.
In 2015, the Illinois Technology Association (ITA) launched a first-of-its-kind cross-disciplinary effort to bring together leaders from the technology industry and academia, as well as consumers and civic leaders to collaborate towards driving the growth and use of internet of things technologies in the Midwest. The result of this effort was the creation of the ITA IoT Council. Its purpose is to drive the advancement of IoT technology, policy and industry, establishing Chicago and the Midwest as an epicenter of IoT.
For tech communities outside of Silicon Valley, there is often a lot of talk about how to be “the next Silicon Valley.” The thinking is that all the talent and capital flow in that direction. At the ITA, we firmly believe that Chicago and the Midwest don’t need to be “the next” anything. There are many things about Chicago that make it a distinctive tech hub: our proximity to many Fortune 500 companies and our depth in B2B technology are just two of our exceptional strengths. These two advantages, along with a few other traits inherent to the Midwest, are also helping to drive our prominence in IoT.
Key factors in Midwest IoT development
At the ITA, we firmly believe the Midwest is uniquely positioned to take a leadership role in IoT, more so than any other significant platform shift of the past 20 years. Here’s why:
- Manufacturing. The Midwest is the core of the nation’s manufacturing base, which is making a comeback. The efforts of UI Labs’ DMDII are based on the belief that digital technologies will narrow the cost gap with low-wage countries. IoT will require manufacturing prowess to build the devices that are necessary to enable this shift.
- Software. Chicago has a keen history in software development, especially in transformative industries. Chicago is the intellectual base of software centered on solving business problems — a necessary ingredient for a leadership position in this space.
- Data analytics. No community has seen a stronger aggregation of data analytics companies over the past 30 years. From SPSS to IRI and many in between, Chicago is home to both the collection of data and the analysis of it. Fast-rising Uptake is a prime example of how Chicago’s past in this space is pushing it into the future.
- Customers. Chicago has an active customer base in the industrial, healthcare, retail, automotive and agricultural industries, all of which have been working with M2M technology, the precursor to the internet of things, for many years. As IoT is implemented, much of the early adoption will be launched by Chicago companies.
- Government Commitment. The City of Chicago is emerging as a clear leader in the smart city space. Currently, the city is focused on using technology and data to make it as efficient and responsive as possible to resident and business needs.
The ITA IoT Council regularly publishes an inventory of Midwest-based companies centered on IoT to benchmark growth within the region. In its first year, the inventory has grown from 70 companies to 90, showcasing the breadth and depth of IoT in Chicago and the Midwest.
As the companies focused on IoT technology continue to grow, the application of IoT throughout a wide range of industries will grow exponentially. With Chicago’s broad economic base and the city’s commitment to being at the forefront of data and analytics, Illinois and the Midwest as a whole are in a unique position to be a leader in the internet of things transformation.
As the industry adopts new technologies such as IoT, 5G and AI, fog computing has likewise been rapidly increasing in popularity because it fills a necessary gap. In fact, fog is a critical architecture for these emerging architectures because it enables reliable, low-latency operations and removes the requirement for consistent cloud connectivity. Fog provides the missing link in the cloud-to-thing continuum.
To illustrate the value of fog, let’s investigate the network logistics of package delivery via drones. This is an ideal example because it’s an emerging industry that just a few years ago seemed futuristic, but now, thanks to fog and other enabling technologies, it’s not only realistic, it’s imminent. There are business plans for drone mail and product delivery on boardroom tables of some of the biggest companies in the world. Fog will make the concept a safe and profitable commercial reality. Here’s how.
What you need to know about fog
Fog architectures selectively move compute, storage, communication, control and decision-making closer to the network edge where data is being generated. This solves the limitations in current infrastructures to enable mission-critical, data-dense use cases.
Fog is an extension of the traditional cloud-based computing model, with implementations of the architecture residing in multiple layers of a network’s topology. These extensions retain all the benefits of cloud computing, such as containerization, virtualization, orchestration, manageability and efficiency. The computational, networking, storage and acceleration elements of this new model are known as fog nodes. Together they comprise a fluid system of connectivity, not completely fixed to the physical edge.
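As a rough illustration of this layered model, the sketch below places a workload on the lowest-latency node that offers the required service. All node names, service labels and latency figures are invented for illustration; they are not part of any fog standard.

```python
# Illustrative sketch of a layered fog topology: edge, regional and
# cloud nodes, each advertising services and a round-trip latency.
from dataclasses import dataclass, field


@dataclass
class FogNode:
    name: str
    layer: str                  # "edge", "regional" or "cloud"
    latency_ms: float           # round-trip time from the device
    services: set = field(default_factory=set)


def place_workload(nodes, required_service):
    """Pick the lowest-latency node that offers the needed service."""
    candidates = [n for n in nodes if required_service in n.services]
    return min(candidates, key=lambda n: n.latency_ms) if candidates else None


nodes = [
    FogNode("drone-gateway", "edge", 2.0, {"telemetry", "collision-check"}),
    FogNode("city-fog-pop", "regional", 15.0, {"telemetry", "analytics"}),
    FogNode("central-cloud", "cloud", 120.0, {"telemetry", "analytics", "archive"}),
]

print(place_workload(nodes, "collision-check").name)  # handled at the edge
print(place_workload(nodes, "archive").name)          # only the cloud offers it
```

The point of the sketch is the placement rule: latency-critical work lands on the nearest capable node, while work only the cloud can do still flows upward, which is the "cloud-to-thing continuum" the section describes.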
Now let’s overlay fog to the use case of drone-based package delivery.
Fog addresses five challenges to drone-filled skies
With drones, there are the obvious issues with bandwidth and scale, safety and expense. And there are operational challenges and complexities for autonomous and non-line-of-sight flights. For high-scale drone delivery to become a reality, the following five problems must be solved:
1. Drone hub management. As drone delivery takes off, it will become impractical for every vendor to have an independent fleet that operates independently of every other fleet on the planet. To scale drone use, the industry will need to develop drone hubs that can coordinate the flights of many companies, much like airports do. Managing these hubs with advanced technologies is critical in order to coordinate drone activities with traffic and operations on the ground and in the air.
Fog computing controllers on the ground will communicate with fog nodes on drones to handle the complex, split-second coordination of landings, takeoffs, loading, unloading and maintenance.
2. Collision prevention and safety. Shared airspace is the only way to arrive at high-scale drone delivery services. When there are only a few thousand drones in the air, there may not be much concern about collisions. When there are millions, however, there will be a very real threat of drones crashing into each other (and into birds, airplanes and tall structures). Unmanned vehicles will need a way to recognize imminent collisions and make course alterations with sub-millisecond response times.
All drones must have flight plans, so with fog, takeoffs and landings can be scheduled to prevent collisions. In addition, fog communications enable the split-second timing required to coordinate the loading, takeoff, landing and maintenance of this volume of drones. Fog controllers on the ground provide the proximity required to shorten the communications loop between the drone and the “control tower.” Latency can be reduced to such a degree that a drone travels only two inches before the next update is delivered. (If that same communication went through the cloud, the drone would have traveled 12 feet.)
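The two-inch versus 12-foot comparison can be sanity-checked with simple arithmetic: the distance a drone travels between control updates is just its speed multiplied by the round-trip latency. The cruise speed of 20 m/s and the latencies of roughly 2.5 ms (fog) and 180 ms (cloud) below are illustrative assumptions, not figures from the article, but they reproduce its numbers:

```python
# Sanity check: distance traveled between control updates
# = speed x round-trip latency. Speed and latencies are assumed.
METERS_PER_INCH = 0.0254
speed_mps = 20.0                 # assumed drone cruise speed, ~45 mph


def travel_inches(latency_s):
    """Inches traveled during one round-trip control-loop delay."""
    return speed_mps * latency_s / METERS_PER_INCH


fog_latency_s = 0.0025           # ~2.5 ms via a nearby fog node
cloud_latency_s = 0.180          # ~180 ms via a distant cloud

print(round(travel_inches(fog_latency_s), 1))   # about 2 inches
print(round(travel_inches(cloud_latency_s)))    # about 142 inches, i.e. ~12 feet
```

The roughly 70x gap between the two distances is what makes the fog control loop viable for collision avoidance where a cloud round trip is not.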
3. Bandwidth bottlenecks and expense. Whenever a drone is being prepared for a delivery or is in transit, constant communication and tracking are required. This translates to a constant stream of data for every drone flight. Access to the cloud may require expensive satellite links in remote areas, making some commercial applications cost-prohibitive.
Fog computing solves the many problems of communications, storage and computation with the speed necessary to safely control high-volume drone traffic. Fog computing provides for a more efficient way to upload continuous software updates, massive amounts of data and other computational and communication requirements.
4. Security on the ground and in the air. Imagine hackers targeting and rerouting drones carrying medical supplies, drugs and even data. Adding security features such as encryption and anti-cloning chips on multiple sensors will increase the cost of the drone. Downloading security credentials, patches and updates from the cloud to the drone can consume valuable bandwidth.
A fog node on a drone can handle security, without adding complexity, size or cost to any other drone parts. The fog node can take care of security updates even in mid-flight. It provides a full complement of security measures, from downloading and installing patches to perimeter defense.
5. Autonomous operation. Because drones are unmanned, they require sufficient on-board intelligence to act with autonomy when needed.
Fog computing includes autonomous response: the ability to take the appropriate automated corrective action sequence. Fog nodes on the drone can provide awareness of anything within close proximity to the drone — including weather conditions, other drones, birds or buildings.
Fog nodes also provide analysis and sub-millisecond response to changing conditions in the air. Trying to make this loop through the cloud would take far too long.
Fog-based drone delivery will extend the reach of delivery to inaccessible places and greatly reduce the customer wait time between order and delivery. With fog, drone fleets will realize their potential to reduce costs, congestion and environmental impact. In addition, fog will help drone delivery create new markets that simply aren’t practical with traditional vehicle fleets.
The OpenFog Reference Architecture provides a high- to medium-level overview of systems architectures for fog networks.
Recently, I was reading about how voice records from an Amazon Echo’s Alexa had been subpoenaed in a murder investigation. Police were trying to determine if the machine had picked up anything incriminating during a suspicious death, while Amazon argued that handing over the device’s data would violate the First Amendment and consumer privacy rights.
While I’m aware my own Alexa is always listening, I’ve never been too concerned about casually going about my day while it records the mundane conversations I’m having with friends and family.
I’ve also read countless articles about the distributed denial-of-service (DDoS) attack that happened in October last year. I remember it severely impacting some of my favorite online communities, like Spotify, Reddit and the PlayStation Network.
This attack was unique in that the botnet was primarily driven by malware deployed on unsecured or under-secured IoT devices. Security cameras, routers and baby monitors paved the way for one of the worst DDoS onslaughts in history.
Managing your devices
These two examples got me thinking about our current world of connected devices. It’s amazing how quickly — and, in some cases, haphazardly — they are being deployed within corporate ecosystems. Clearly, the possibilities and benefits of these connected devices are huge. MOBI has customer examples of these connected devices saving time, money and even lives.
The real challenge for an enterprise today is capturing these advantages in a safe, responsible and well-thought-out way. The deployment and management of these devices needs to be balanced with a prudent and centralized method of managing all other connected program endpoints. It’s relatively easy to deploy vast quantities of IoT devices across a workforce; it’s much harder to not only know where all devices are located, but also ensure they are properly functioning, secured and updated with the latest firmware.
Before making any IoT procurement decisions, always do your research. Some devices stop receiving automatic manufacturer firmware patches or updates when connected to another network. Others ship with potentially dangerous default settings enabled, such as Universal Plug and Play. Make sure your devices can be configured to meet relevant policy requirements before they’re implemented, not weeks or months down the road. Just because new devices can be connected to the internet right away doesn’t mean they should be.
Ongoing management of connected devices is the hardest part. Without proper tools and processes in place, it’s much harder to find out where each device is and how it’s functioning once deployed. Centralized management solutions with years of IoT experience and in-depth diagnostic capabilities help concentrate the constant stream of data and vast number of communication paths, allowing a business to monitor the health and functionality of its individual devices.
Corporate IT has always been (justifiably) paranoid about what devices can access their internal network. Ensuring IoT doesn’t become an unintended source of information gathering, firewall backdoor or host for a massive botnet (and assume the corresponding liability that entails) is critical to making this technology more widely adopted, accepted and trusted.
Before your business deploys its first IoT device, create an IoT-exclusive network that can separate existing program devices from the new and/or unknown. Not only does this make it easier to manage a device inventory, it also ensures that, in a worst-case scenario, malware is unable to gain enterprise-wide network access and damage data outside of your IoT program.
Understanding built-in security capabilities and how to change default device passwords/passcodes goes a long way toward successfully monitoring network traffic too. Once native security is understood and maximized, devices can be accurately assessed and assigned an appropriate level of network access to preserve the integrity of whatever data they’re capable of handling. Knowing which devices are connected and what they’re doing at all times is critical to network security.
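One way to picture this assessment is a posture check that quarantines devices still running default credentials or stale firmware, and otherwise assigns a network tier by data sensitivity. Everything here, including the field names, VLAN labels and tiering rules, is a hypothetical sketch rather than a reference to any particular product:

```python
# Hypothetical sketch: assign an IoT device a network access tier
# based on its security posture. Field names and tiers are invented.
def access_tier(device):
    """Quarantine insecure devices; tier the rest by data sensitivity."""
    if device["default_password"] or not device["firmware_current"]:
        return "quarantine-vlan"        # no access until remediated
    if device["data_sensitivity"] == "high":
        return "restricted-iot-vlan"    # tightly filtered segment
    return "general-iot-vlan"


camera = {"default_password": True, "firmware_current": True,
          "data_sensitivity": "high"}
sensor = {"default_password": False, "firmware_current": True,
          "data_sensitivity": "low"}

print(access_tier(camera))   # quarantine-vlan: default password still set
print(access_tier(sensor))   # general-iot-vlan
```

The ordering of the checks encodes the principle in the text: a device’s security posture gates its access before its data needs are even considered.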
Expect network bandwidth and carrier data consumption to rise dramatically as a result of IoT and plan accordingly. Software-defined networking can help you prioritize network resources and bandwidth to meet mobile policy expectations and reduce the overall network strain caused by these devices. If a network needs to analyze and store big data, prepare and test strategies so that your technology is ready and able to handle this new workload from the start.
As IoT continues to evolve, securing enterprise devices and networks will only become more critical to business success. Hopefully these tips and a little preparation can keep your devices off the front page and away from tomorrow’s headlines.
Enterprise IoT adoption is trending up globally. In February 2017, Aruba published IoT adoption stats surveying 3,100 enterprise decision-makers from 20 different countries across industrial, government, retail, healthcare, education, construction, finance, and IT/technology/telecommunications sectors.
Fifty-six percent of participants said they have already adopted IoT, and another 32% plan to deploy it by 2019. This is huge and reminds me of how fast enterprise mobility and BYOD penetrated organizations not too long ago.
Connecting enterprise mobility and IoT
While IoT is way more pervasive than mobile, can we make IoT adoption smoother based on lessons learned from enterprise mobility?
What’s troubling enterprises as they embrace IoT solutions (and their associated business benefits) is increased exposure to cyberthreats, and the inability to adequately counter this insecurity.
Again referring to Aruba’s study, 84% of enterprises that already adopted IoT had faced security breaches. Forty-nine percent dealt with malware, 38% had been the target of spyware, 30% experienced phishing and 26% suffered from a DDoS attack.
As with any emerging technology, security measures and security education lag in enterprise IoT’s initial adoption curve. But if we consider how enterprise mobility management (EMM) helped organizations overcome data privacy hurdles, it may not hurt to ask how much of that approach can benefit enterprise IoT adopters.
Mobile technology, be it wireless connectivity, devices (smartphones, tablets) or apps, is already at center stage in IoT deployments. But that’s not the only reason to consider EMM. The complexity and threat vectors of mobile deployments, and the way they expose internal resources and critical assets, are all relevant to IoT as well, only to a greater degree.
Now the question is: can EMM suites be extended to secure enterprise IoT? Alternatively, would it help for IoT security solutions to learn or take a similar approach as EMM?
One may argue that unlike mobility, enterprise IoT pushes the envelope beyond IT resources to operational assets and critical infrastructure, especially in industrial settings. After all, IoT is about monitoring and predicting asset health, temperature, pressure and so on.
That’s why in IT security, confidentiality, integrity and availability are prioritized in that order (CIA). But in the case of OT security, control and availability are most important; then come integrity and confidentiality (CAIC).
IoT also introduces new organizational roles such as the chief digital transformation officer.
These are all valid points. Yet in reality, much of the onus of securing IoT deployments continues to rest on CIOs, CISOs and IT security managers.
So let’s explore a few ways CIOs can utilize EMM principles to centrally manage enterprise IoT security.
Identifying key components and assets to protect
EMM technologies most commonly support management of mobile devices (MDM), applications (MAM), identity (MI), content (MCM) and containment. However, a recent survey by U.S.-based research firm J. Gold Associates found large enterprises deploy only a subset of these EMM functions.
EMM functions which directly affect day-to-day mobility operations and user experience — such as managing and securing mobile assets, emails, file/data sharing, secured browsing, etc. — are most commonly deployed and used by enterprise IT departments, while other nice-to-have functions like geofencing, private app stores, etc., are scarcely implemented.
This gives an idea of which capabilities should be prioritized when designing IoT security management solutions, and where to channel resources accordingly.
Restrictions in architecture, software and supply chain
Cloud infrastructure, whether private, public or hybrid, is integral to an enterprise IoT architecture.
In AlienVault’s February 2017 survey of 1,000 RSA attendees, 62% said they expected cloud security to worsen as more IoT devices and services get added to enterprise clouds.
EMM’s MAM capability to whitelist or blacklist applications, or to mark them as mandatory, can help secure cloud-based deployments. Mirroring EMM’s MDM capabilities, IoT device management can also publish policies that users can follow, in addition to password and on-device encryption policies.
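A minimal sketch of what such a MAM-style policy check might look like when applied to IoT devices follows; the policy structure and app names are invented for illustration:

```python
# Hypothetical MAM-style app policy for IoT devices: whitelist,
# blacklist and mandatory apps. All names are illustrative.
POLICY = {
    "allowed":   {"telemetry-agent", "ota-updater", "vpn-client"},
    "blocked":   {"telnet-daemon"},
    "mandatory": {"ota-updater", "vpn-client"},
}


def check_device(installed):
    """Return a list of policy violations for a device's installed apps."""
    violations = []
    for app in sorted(installed - POLICY["allowed"]):
        violations.append(f"unapproved app: {app}")
    for app in sorted(installed & POLICY["blocked"]):
        violations.append(f"blocked app: {app}")
    for app in sorted(POLICY["mandatory"] - installed):
        violations.append(f"missing mandatory app: {app}")
    return violations


print(check_device({"telemetry-agent", "ota-updater", "vpn-client"}))  # compliant: []
print(check_device({"telemetry-agent", "telnet-daemon"}))
```

A compliant device returns an empty list; a device running an unapproved or blocked app, or missing a mandatory one, surfaces each violation so management tooling can quarantine or remediate it.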
Secure containers or “sandboxes” can provide isolation and encryption of enterprise IoT data, and securely run authorized applications in the devices.
Best practices in enterprise mobile app development are equally applicable to IoT apps. For mobile apps serving industrial IoT markets, it is critical to follow newer app restrictions as well.
Malware and DDoS attacks are the latest tactics targeting enterprises. These attacks exploit IoT devices (like smart cameras, DVRs, etc., available in retail with very little security built in) to form botnets and bombard cloud servers with terabytes of traffic.
Supply chain restrictions requiring security compliance for these retail IoT device vendors seems to be the only viable option to manage such threats.
Open guidelines and certifications
In 2013, NIST published EMM guidelines for managing the security of mobile devices in the enterprise. Targeted at CIOs, CISOs and IT management teams, these guidelines laid out a framework to “help organizations centrally manage and secure mobile devices.”
Addressing the four main pillars of EMM to securely manage mobile assets (devices), applications, identity and content, NIST laid out standards and recommended practices for organizations involving deployment architecture, user and device authentication, cryptography, configuration, device provisioning, data communication and storage.
NIST’s guidelines recommend organizations have a documented security policy and develop system threat models for mobile devices and the resources that are accessed through the mobile devices.
In principle, the same EMM standards approach can be adapted in an enterprise IoT framework as well.
Would open guidelines from central authorities like NIST help enterprises better manage IoT security? Share your thoughts.