More than a decade ago, Amazon Web Services was launched as a system that offered flexible compute instances. There are a lot of stories about the formation of AWS; some say that Amazon originally created AWS so that sellers on its marketplace wouldn’t have to deal with the IT burdens of setting up their own systems. We do know that it was launched without much fanfare as a side business for Amazon.com.
As enterprises large and small started to embrace AWS, it grew steadily but quietly. That silent climb from a no-name service to more than $5 billion in annual revenue by 2015 (the first year Amazon shared AWS figures) was a very clever play by Amazon. Today, almost all of Amazon’s operating income comes from AWS.
During this phase (2006-2015), most on-premises server, storage and networking vendors were going through a slow growth period, and they routinely cited the broader market slowdown (the 2008 crash and recession) as a rationale for their slowing growth. However, the truth was that the bulk of the growth was going to cloud or sophisticated homegrown technologies at cloud-scale companies (including Amazon, but also LinkedIn, Facebook, Twitter and Google, to name a few).
Microsoft eventually did make some bold decisions and got rid of the “strategy tax” that it imposed on customers and partners, and it finally rightsized the importance of Windows in the marketplace. This led to the “Azure Age” and the reinvention of Microsoft.
Google, not to be left behind, realized that the enterprise is a pretty steady and margin-appropriate business that can provide additional growth while it pursues moonshot projects with Android, autonomous cars, Google Lens, longevity research and so forth.
Fast forward to 2018, and the game is well into its middle innings. There is a reasonably clear roadmap of what needs to happen over the next three to five years for cloud services to improve and compete with each other. What classically started as a feature/performance fight quickly changed into the battle for data. Data has gravity, and the “you can check out any time you like, but you can never leave” business model of cloud data (i.e., ingress is free, but we will charge you for taking data out of the cloud) leaves a lot to be desired. Then came optimizations that brought deeper lock-in, such as APIs (the Cloud Vision API from Google) and algorithms as a service (such as the conversational interface of Amazon Lex).
There is a major trend that the cloud vendors now realize they must address, and that is the emergence of IoT edge. IoT is defined as intelligent systems that not only generate humongous amounts of data (imagine an autonomous vehicle, oil rigs with thousands of sensors, multiple devices on each and every thing and person), but that also require local intelligent computations.
The IoT edge is a challenge that the cloud was never prepared for. And it is one of the larger trends that, unless addressed, puts a ceiling on how large clouds can grow. Gartner analyst Thomas Bittman says in his article “The Edge Will Eat the Cloud” that “the agility of cloud computing is great … but it doesn’t overcome physics — the weight of data, the speed of light.”
This fact is not lost on cloud vendors, and they are starting to figure out ways to address it. In February, Google acquired LogMeIn’s Xively IoT platform, and in April, Microsoft announced that it will triple its spending to over $5 billion on IoT.
This is because cloud vendors already see that they are missing a huge portion of the IT market — the IoT/edge market. Not only that, they also understand that if left unaddressed, it threatens to upend their cloud revenues because the amount of data the IoT edge will generate dwarfs what is stored in their cloud data centers.
The reason current cloud offerings are not adequate to address the IoT and edge market stems from the fact that cloud design principles are orthogonal to the edge requirements.
The infrastructure for IoT and edge systems is very different from what the cloud was designed for:
- Low-footprint, power-sensitive software stack — Edge software cannot assume unlimited elasticity. In the cloud, the cost of an incremental container is practically zero, so software is rarely optimized to fit into the smallest possible memory and CPU footprint; leanness helps margins at large scale, but it’s a nonissue when it comes to solving important challenges quickly. It’s quite the opposite on the IoT edge, which can be a very small device or a sensor that barely has any room for additional software.
- Variable latency — The latency from the edge device to the cloud may vary from a few hundred milliseconds to effectively infinite while disconnected. Clouds, by contrast, are built to sophisticated inter-VM and inter-region latency targets.
- Diverse network — The edge stack has to operate across a very diverse network system. Some edges may connect through Ethernet, cellular, satellite or Wi-Fi, to name a few. Clouds are predominantly standardized on wired, Ethernet network-like systems.
- Unpredictable bandwidth — As a consequence of that network diversity and the cost of getting high-quality network bandwidth into IoT systems, bandwidth is also variable. Software needs to be able to deal with this.
- Occasional connectivity — IoT/edge systems may connect occasionally. This is true for high-speed trains or container ships when they connect back up to the network when they stop at stations or docks. This is also true for systems where connectivity is provided when a contractor with a “cell connection” drives up to an IoT device occasionally.
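The last two constraints in particular, unpredictable bandwidth and occasional connectivity, push edge software toward a store-and-forward design: buffer readings locally, then drain them opportunistically when a link appears. A rough Python sketch under those assumptions (the class and its uplink callback are illustrative, not any vendor’s API):

```python
import collections
import time

class StoreAndForwardBuffer:
    """Buffers sensor readings locally and flushes them when a link is available.

    A bounded deque drops the oldest readings when storage fills up, which
    mirrors the limited footprint of a real edge device.
    """

    def __init__(self, uplink, max_readings=10_000):
        self.uplink = uplink  # callable: uplink(batch) -> True on success
        self.buffer = collections.deque(maxlen=max_readings)

    def record(self, reading):
        # Always succeeds locally, even with no connectivity at all.
        self.buffer.append({"ts": time.time(), "value": reading})

    def flush(self, batch_size=100):
        """Attempt to drain the buffer in small batches; stop on first failure."""
        sent = 0
        while self.buffer:
            batch = [self.buffer.popleft()
                     for _ in range(min(batch_size, len(self.buffer)))]
            if self.uplink(batch):
                sent += len(batch)
            else:
                # Link dropped mid-flush: put the batch back for the next pass.
                self.buffer.extendleft(reversed(batch))
                break
        return sent
```

A container ship’s telemetry agent, for instance, would call record() continuously at sea and flush() whenever it docks.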
As organizations embark toward their data- and artificial intelligence-driven reinvention of their IoT edge systems, it’s important to keep in mind some principles that history has provided for us:
- Open API — Use an open API platform and not one that provides further lock-in tangles into your cloud provider.
- Fabric — Understand that connecting several IoT systems to regional and global cloud systems essentially constructs a data fabric. Solve this issue by making sure applications are able to access data in a single global namespace so that data movement does not result in rewriting applications.
- Act locally, learn globally — Artificial intelligence and its sub-branch of machine learning on the edge require the ability to act locally and independently to react to local situations effectively without relying on the cloud. Having said that, the system/fabric needs to work in a way that it can learn globally from all the edges and then feed that intelligence back to every edge.
- Don’t make it too big to fail — Putting all the control and management functions in the cloud makes the system prone to failure. The system should be decomposable into individual data clusters — whether on the edge, in regional clouds or in global clouds — that can be managed either as a unified whole or individually.
- Secure — As data-driven strategies push aggregated intelligent data to the cloud, there are security mechanisms now available to secure data in the cloud. But what about the edge? The security policy framework should not have to be recreated for the same data multiple times depending upon where the data currently resides.
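The “act locally, learn globally” principle above can be sketched concretely: each edge updates its own model with no cloud round-trip, and a coordinator periodically averages the edge models and feeds the result back to every edge. The functions below are an illustrative toy (plain lists as model weights), not a production federated-learning system:

```python
def local_update(weights, gradient, lr=0.1):
    """One local learning step on the edge: act locally, no cloud needed."""
    return [w - lr * g for w, g in zip(weights, gradient)]

def global_aggregate(edge_weights):
    """Average the models from all edges into one global model: learn globally."""
    n = len(edge_weights)
    return [sum(ws) / n for ws in zip(*edge_weights)]

def broadcast(global_weights, num_edges):
    """Feed the aggregated intelligence back to every edge."""
    return [list(global_weights) for _ in range(num_edges)]
```

Each edge stays responsive to local conditions between aggregation rounds, while the fabric as a whole still learns from all of them.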
Cloud providers are starting to realize that the afterthought approach to edge computing won’t work. That is the next battle, and it is going to be bigger than the cloud itself.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.
To paraphrase Francis Fukuyama, one could have been forgiven for thinking that we had reached the end of market economy history. The trend over the past 40 years has been for trade barriers to come down, and it was a truth pretty much universally acknowledged that opening up economies was good for growth and social prosperity.
Even non-market economies made a pretense of being open, and the General Agreement on Tariffs and Trade (GATT) was broadly regarded as a success. The rise of protectionism has, therefore, come as quite a surprise.
Protectionism can come in many forms. It isn’t just the obvious taxes and tariffs that can make trade in goods and services unappetizing. The Food and Agriculture Organization (FAO) has provided a rather nice list of things that can hinder entry to new markets including:
- Government interventions in trade — Government procurement, state trading, export subsidies or taxes, countervailing duties, trade-diverting aid, etc.
- Customs and administrative entry procedures — Customs valuation, customs classification, antidumping duties, consular and customs formalities and requirements, sample requirements, etc.
- Specific limitations on trade — Quantitative restrictions, export restraints, health and sanitary regulations, licensing, embargoes, minimum price regulations, etc.
- Charges on imports — Tariffs, variable levies, prior deposits, special duties on imports, internal taxes, etc.
- Standards — Industrial standards, packaging, labeling and marking regulations, etc.
So what can IoT offer in this new era of trade? And why are digital ecosystems important?
In relation to the first, while protectionism may not be good for the global economy as a whole, completely open borders are probably not a good idea either. Some controls are always necessary to prevent trafficking, in people as well as in goods, and to contain hazards to health and safety, for instance.
IoT technologies can help governments patrol borders, enable asset owners and insurers to track assets as they move across national boundaries, and enhance transparency and efficiency in cross-border administration. Combined with artificial intelligence, they can spot unusual patterns to thwart illegal activity. They can also play a valuable role in less obvious areas, such as real-time security access permissioning or protecting lone workers in dangerous areas.
So where do ecosystems come in? Firstly, the greatest advances on the status quo may come from startups. While their technology may be sound, scaling to the level required by border authorities may well require sandboxing or some other form of piloting. Finding IoT startups that have these capabilities can be like looking for a needle in a haystack. Working with reputable startup ecosystem players can help sweep away the surplus hay.
Secondly, IoT is by its very definition an ecosystem play. By contributing to ongoing debates on everything from security to standards, would-be deployers can not only make sure they understand the real risk-reward of implementation, but also actively help shape the IoT landscape and enhance the value it brings to their own domain.
Disclosure: The author is CEO of IoT Tribe, an equity-free accelerator that brings corporates and startups together to do business, and a member of the board of the Alliance for Internet of Things Innovation, an industry body that aims to foster IoT innovation in Europe.
We are at a pivotal moment in technical advancements, where waves of digital innovation are changing how we work, play, travel, communicate, dine, interact and even think. Just ask Alexa or Siri.
In March, our collective imaginations soared when SpaceX launched a Tesla into space. Drones are starting to deliver packages and meals. Robots are transforming factories, supply chains, restaurants — even the environment. In the latest on the #tech4good movement, Urban Rivers has built a remote-controlled, floating, trash-collecting robot to clean up Chicago’s rivers. IoT-enabled smart cities, such as Barcelona, Singapore and Denver, are already saving millions in energy and labor efficiency while improving citizen services and public safety. 5G services will be available in U.S. cities by the end of the year, ushering in new augmented reality mobility applications that will make Pokémon Go a relic of the past.
This digital innovation has one thing in common: It’s all enabled by fog computing. Fog is the distributed technology that brings compute, networking and control closer to where the data is generated for ultra-fast response that increases user privacy and system protection. It’s an architecture that is gaining traction with the growing awareness that not all information can — nor should — be streamed to the cloud. Processing data closer to where it’s generated is not only beneficial, but absolutely crucial on many fronts: sub-millisecond latency, less network bandwidth and cost, more efficient operations with enhanced system security. While sometimes referred to as edge technology, fog is more comprehensive: It bridges between device, edge and cloud in a superset of functionality that communicates over multiple networks in a north-south, east-west approach between systems.
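That bridging role can be pictured as a placement decision: run each workload at the nearest tier that has enough compute and still meets its response deadline. The tiers, numbers and function below are illustrative assumptions, not part of any OpenFog specification:

```python
# Hypothetical tiers: (name, round-trip latency in ms, relative compute capacity).
# A real deployment would measure these rather than hard-code them.
TIERS = [
    ("device", 1, 1),
    ("fog", 10, 50),
    ("cloud", 150, 10_000),
]

def place_workload(deadline_ms, compute_needed):
    """Pick the tier closest to the data source that can both afford the
    computation and respond within the workload's deadline."""
    for name, latency_ms, capacity in TIERS:
        if capacity >= compute_needed and latency_ms <= deadline_ms:
            return name
    raise ValueError("no tier satisfies this workload")
```

Under this toy model, a sub-5 ms control loop stays on the device, moderate analytics land on a fog node, and heavy batch training still belongs in the cloud.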
Recently published use cases by members of the OpenFog Consortium highlight the technologies behind some of the sophisticated fog-based applications for autonomous vehicles, delivery drones, streaming video for onsite sports venues, real-time energy exploration, smart factories, connected healthcare and more.
The bellwether: Investments in fog are up sharply
Growing awareness of these benefits is driving corporate investments in fog infrastructure. Last fall, 451 Research released its landmark report projecting 4x growth in fog computing, to $18 billion in spend by 2022. More recently, a Futurum Research survey of 500 North American companies found that 93% will invest in edge technologies in the next 12 months and that 72% consider their edge strategy either critically or very important to improving business processes and productivity. That’s a whole lot of innovation to come.
Investments are growing on other fronts as well — among those who are creating the technologies and applications. Technology leaders and startups alike are developing the advanced technologies that make machine learning and artificial neural network technologies more affordable and easier to use. In 2017, venture capital funding into U.S.-based IoT startups alone reached $1.46 billion, according to Crunchbase, sharply up from the $461.7 million raised in 2013. According to new research from KPMG, artificial intelligence — most notably in the health, finance and automotive sectors — attracted $12 billion in investment from venture capitalists globally in 2017, doubling the 2016 numbers.
Last year, Fog World Congress featured a Fog Tank, where five startup finalists competed on stage before an elite group of venture capitalists to be named the biggest game-changer in fog computing and networking. Master of Ceremonies Sam Bhattarai, director at the office of the CTO of Toshiba Corporation, kicked off the event by telling the audience to take note, as it was highly likely that the companies on stage will be the Googles and Amazons of tomorrow. In the engaging 90-minute session, Silicon Valley startup Nebbiolo Technologies took home the grand prize, facing down stiff competition from ActiveAether, Forecubes Inc., Kiana Analytics Inc. and NGD Systems Inc. In October, new global contenders will take the stage to claim bragging rights to this rapidly growing market.
New business, and new business models created by fog
Creating an open, interoperable architecture for fog is complex — and necessary to enable advanced applications such as IoT, 5G and AI. At OpenFog, we are privileged to have a front-row seat into this future vision of innovation through the work of our forward-thinking members.
Research members in China are already looking far ahead at the possibilities enabled by 6G, which some technologists expect will be an autonomous, self-learning and self-planning network. In Detroit, a small team at Wayne State University is pioneering connected ambulatory care that keeps critical data accessible in areas of poor connectivity or in poor weather. Princeton University’s Edge Computing Laboratory is leading the work to solve high-density data distribution over remote networks. Several OpenFog members are doing groundbreaking work in blockchain deployments through fog.
And some are creating entirely new business models. SONM is offering IaaS and PaaS based on fog computing as a back end. Computing power suppliers all over the world can contribute their computing power to the SONM marketplace. AetherWorks has also created a global computing marketplace in which anyone can rent out, or purchase, processing power for peak demand. Its business model is driven through FogCoin, the blockchain-powered cryptocurrency that incentivizes users to join and use the network.
SCALE to deployment
As defined in the OpenFog Reference Architecture (note: registration required), the attributes of fog can be simplified according to the SCALE model: security, cognition, agility, latency and efficiency:
- Security — In addition to its distributed approach to trusted transactions, fog offers better GDPR compliance and privacy protection by anonymizing data locally. Should an incident occur, public safety officials can disable that anonymization for real-time identification purposes.
- Cognition — Local fog nodes enable autonomy by processing information at or near the source of the data for real-time decision-making and response.
- Agility — Clusters of fog nodes can help rapidly moving assets achieve levels of communication and coordination at sub-millisecond speed and at scale.
- Latency reduction — Fog enables real-time processing and cyber-physical system control for rapid response.
- Efficiency — Fog nodes enable dynamic pooling of local unused resources from participating end-user devices.
In less than two years, fog has emerged from a relatively unknown concept to one that is powering industry shifts in hot areas such as Industry 4.0 and autonomous transportation.
We invite you to join OpenFog as we help to enable and create the future, and to join in the innovation, research and technologies at Fog World Congress, October 1-3, 2018.
Currently, more than 4 billion users, about 54% of the world’s population, are digitally connected. At the same time, Gartner estimates that there are more than 8.4 billion devices currently connected and predicts this number will reach more than 20 billion by 2020. Before long, almost everyone and everything will be connected.
Today, we speak of the “internet” when we are talking about the devices and services that help people communicate. We use a different term for the system of interrelated devices, machines, objects, animals or people that can transfer data over a network without requiring human-to-human or human-to-computer interaction: the internet of things.
The potential benefits from this increasing connection are incredible. But as we move towards this era of everything becoming connected, there will be more and more concern about the security and usage of the personal data that gets collected, used and disseminated. Data breaches highlight the vulnerability of our personal information and underscore the importance of protecting it.
We must take into account the aspects of IoT and information technology that determine what data in an organization’s computer system can be shared with third parties, which is generally called data or information privacy. Because the value of IoT depends so much on your industry, when we talk about privacy it is useful to divide the market into three basic segments: consumer, services/public sector, and industrial/enterprise. Let’s look at a representative use case for each segment and its potential privacy concerns.
Consumer IoT and automated vehicles
The potential safety, mobility and efficiency benefits of automated vehicles are many. These vehicles are also expected to generate an enormous amount of data, some of which will be personal and sensitive, such as real-time precise geolocation data and the contents of the driver communications that result when drivers connect their mobile phones to a vehicle’s computer system.
Other consumer IoT use cases, such as virtual assistants, connected health monitoring, consumer marketing, tracking product usage and performance, and so forth, often ask users to input personal information, such as their name, age, gender, email address, home address, phone number and social media accounts, creating further privacy concerns.
Public sector and ‘smart’ city use cases
So-called “smart” cities have the potential to provide higher quality services at lower costs by eliminating redundancies and streamlining city workers’ responsibilities. Cities are installing boxes on municipal light poles with sensors and cameras that can capture air quality, sound levels, temperature, water levels on streets and gutters, and traffic, identifying ways to save energy, address urban flooding and improve living conditions.
Public sector IoT data also includes the data present in city registers, the data from government or corporate surveys, and the data from social media updates. This data is often combined and linked in order to produce joint indicators of city well-being, economic vitality or safety. Increasingly, local governments make this data also available to the wider public. All of this raises issues about who has legitimate access, which data can be opened up to public usage and what is the appropriate privacy framework for the linkage of different data.
Industrial/enterprise and the connected factory
Using IIoT, a company can connect devices, assets and sensors to collect untapped data. This also allows a company to deliver scalable, reliable applications faster to meet the ever-changing demands of its customers. For example, in a connected factory, sensor-enabled equipment can supply helpful data about its continuing condition. This information can be analyzed to predict when and where equipment might break down, helping factories to prevent production shutdowns. In the event a breakdown does occur, a factory can analyze this data to determine the problem and take corrective actions to prevent future occurrences. All of this allows companies to shift more attention toward innovation instead of infrastructure management.
Tips for maintaining data privacy in your IoT program
In each of the segments above, at some point data includes not just input from sensors, but also personal data. That’s easy to see in the consumer IoT and public sector IoT, where data sets regularly include customer usage and behavior data. But even in IIoT, understanding how analytic data will be used and by whom, as well as integrating human needs and concerns into the system, means including some amount of “personal” information.
The most obvious step any company can take to enhance data privacy is to implement and enforce technological data protection measures that help prevent breaches in the first place, including but not limited to encrypting data at rest and destroying “personal” data when it is no longer needed. When personal data is encrypted, even if an unauthorized third party finds a way to see the data, it cannot read or collect it. (For more about privacy and security, see this Deloitte Insights article.)
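Alongside encryption at rest, a related measure is pseudonymizing personal identifiers locally with a keyed hash, so records can still be joined for analytics without exposing the raw values. A minimal sketch using Python’s standard library; the key handling here is illustrative, and a real deployment would pull the key from a secrets manager:

```python
import hmac
import hashlib

# Illustrative only: store the real key separately from the data. Destroying
# the key effectively destroys the link back to the person, which also helps
# with "destroy personal data when no longer needed".
PSEUDONYM_KEY = b"replace-with-a-key-from-your-secrets-manager"

def pseudonymize(value: str) -> str:
    """Replace a personal identifier with a stable, keyed pseudonym.

    The same input always maps to the same token, so records can still be
    joined across a data set, but without the key the original value cannot
    be recovered or correlated with other data sets.
    """
    return hmac.new(PSEUDONYM_KEY, value.encode("utf-8"),
                    hashlib.sha256).hexdigest()
```

Using a different key per data set also prevents an attacker from linking the same person across sets.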
IoT is on track to connect ever more devices that do not require human-to-human communication and provide data to fundamentally disrupt enterprises by turning linear processes into networks. More and more data is being collected and processed, which is unearthing previously unimagined value by making companies ever more efficient and responsive.
And, more and more, some of this data will require new approaches to security and privacy. As we look forward, companies should develop integrated, enterprise-level approaches to data governance by strengthening and implementing data protocols and policies.
IoT has become a major focus area for mobile network operators worldwide. However, one fundamental question remains: Where are the real IoT business opportunities for MNOs to strategize and achieve volume, scale and growth?
First, let’s be clear: IoT is not an isolated business segment, but an integral part of a bigger digital transformation strategy. Some operators are already in advanced stages with a mature IoT strategy in place, while others are lagging behind or somewhere in between.
It’s no secret that most mobile network operators (MNOs) face challenges in the commoditized voice and data markets while experiencing saturated subscriber growth. Additional challenges come from technology disrupters and global digital platform players that have become technology havens, attracting fresh innovation, large investments and financial market attention.
However, MNOs still hold a solid position in the ownership of their customer engagement, be it with consumers or enterprises. MNOs will eventually be able to generate long-term IoT growth in the broader digital transformation landscape, where digitization has begun to transform a wide range of other industries.
First, let’s look at past lessons learned from two examples: media content monetization and mobile payment.
In the first example, leading MNOs realized how critical a media content strategy was to enabling growth and successfully moving up the value chain. Different enabling strategies were put in place, including the acquisition of media channels, the creation of partnerships and the building of media content delivery platforms. All of these have enabled MNOs to monetize, bundle new subscription-based services and generate higher data traffic on their networks.
In the second example, specifically in emerging markets, MNOs have created, through partnerships with the financial sector, new purpose-built payment services around their low ARPU customers to execute secured mobile payment transactions. These services have enabled a wide range of community-driven business models.
The main lesson learned from these two examples is that MNOs’ strong position in customer ownership and engagement combined with a deep understanding of market needs and dynamics has been key to driving new opportunities.
In the era of digital transformation, IoT is bringing a new set of data attributes that relate to devices and sensors associated with customer data in an ecosystem — let’s call this device data.
For example, one customer might have a car, phone, medical device and smart watch; she may bike or use public transportation; and she also visits the doctor, lives on a farm or in a smart city, travels, consumes electricity and water, and goes to work, school, shopping and so forth. All surrounding and embedded devices and sensors will generate a huge amount of data that relates to this specific customer in one way or another. The amount of device-generated data will surpass the customer’s own data over time, and the two will eventually converge.
As the IoT market begins to mature, we will find the amount of converged customer and device data will form unique new IoT opportunities to explore. This converged data holds many promises for new opportunities and will proliferate in many day-to-day industries, such as healthcare, transportation, smart cities, utilities, retail, energy, smart living and wearables, in developing and developed worlds alike. In many cases, this converged data proliferation will also be driven by evolving community-based activities.
MNOs can play an essential role by leveraging their ownership of consumer and enterprise relationships in this way. Doing so will also require putting strategic plans in place to enable user integration across multiple industries through IoT aggregation platforms, data integration tools, developer- and API-centric services, secure authentication, on-demand capacity, data analytics and integrated, flexible billing capabilities, to list a few.
Equally important, innovation to enable business integration, data contribution models and data sharing incentive mechanisms must be culturally motivated. With socioeconomics as the underlying baseline, MNOs can take a more active and leading role with regulators and data policymakers to adapt to a data-driven mindset. This will require further business and data integration capabilities with a service-oriented approach, enabling a cross-industry digital transformation and thereby creating new possibilities.
It’s important to note that this won’t happen overnight, but it surely will happen fast — depending on how quickly MNOs’ strategies are put in place and adapted.
It was just over a year ago when the Mirai botnet made its dramatic debut on the world stage. After initially flooding a security journalist’s website with traffic, it went on to grind the internet to a halt for millions of users. The botnet did this by overwhelming Dyn, a company that controls much of the internet’s domain name system infrastructure. Since then, variations of the malicious source code have been used in a number of high-profile attacks on internet infrastructure.
Mirai and its offspring are self-propagating botnets that target and infect poorly protected IoT devices, exploiting the fact that people routinely leave default usernames and passwords in place. Once hijacked, these devices have been used to mount some of the biggest distributed denial-of-service attacks we’ve ever seen.
In some ways, enterprise IoT security is being managed a bit like IT security was way back in the ’90s — as an afterthought. This has to stop. After all, we forecast that there will be 20 billion connected IoT devices by 2023. Clearly, there are new and different security requirements than there were in the past.
IoT devices are being targeted for many reasons. Often, poor understanding of enterprise IoT security leads to weak protection for many IoT devices, which makes them easy targets for hackers.
Many IoT devices still don’t have advanced security features. This is especially true of simpler devices, such as temperature sensors, which have limited processing power and basic operating systems. Because they are designed to be plugged in and forgotten about, owners tend not to apply security updates frequently, if ever, making it quite easy for an attack on such devices to go unnoticed.
In a connected world, these simple devices can be connected to more critical systems further up in the network. If even a small, simple device malfunctions or is tampered with, it can lead to serious security issues.
The Mirai botnet was an eye opener, not least because it neatly illustrates that the IoT industry is facing ominous threats and that we need to prioritize securing the IoT ecosystem. But what can be done to help secure the IoT?
In my opinion, the prevalence of insecure IoT devices makes it likely that, for the foreseeable future, they may be one of the main entry points for future attacks on mission-critical systems. The silver lining is that IoT botnets can be averted if IoT vendors follow basic security best practices.
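As a trivial illustration of one such best practice, a device could simply refuse to activate until its factory-default credentials have been changed, since Mirai spreads by logging in with exactly those well-known defaults. The function and default list below are a hypothetical sketch, not any vendor’s firmware logic:

```python
# Hypothetical sample of well-known factory defaults of the kind Mirai
# tries; a real list would come from the vendor's provisioning database.
FACTORY_DEFAULTS = {("admin", "admin"), ("root", "root"), ("admin", "1234")}

def can_activate(username, password, min_length=12):
    """Allow device activation only with a non-default, reasonably long password."""
    if (username, password) in FACTORY_DEFAULTS:
        return False
    return len(password) >= min_length
```

A device enforcing even this much at first boot would never have joined the Mirai botnet.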
In fact, all participants in the IoT ecosystem need to make security a top priority, from device manufacturers through networks to platforms and applications. That means security is no longer a “nice to have” add-on feature. It is a necessity.
Thankfully, enterprise IoT security is steadily becoming an issue of high concern. Measures are being taken by many IoT vendors to prevent security breaches at the device level, and efforts are being made to tackle major disasters before hacks occur.
It’s still not enough.
When you’re going up against expert hackers, you can’t partner with amateurs or you risk paying the price. To ensure their customers’ telco infrastructure is secure from complex attacks, IoT vendors must work with competent partners whom they can trust.
The far-reaching effects of globalization on the manufacturing industry in the United States, and on those who work in it, are hard to overstate. While certain sectors like pharmaceuticals, aerospace and electronics have surged, others have been slowly losing ground. According to a recent report from the McKinsey Global Institute, the erosion of manufacturing in the U.S. has contributed two-thirds of the fall in labor’s share of U.S. gross domestic product, with small and midsize manufacturers hit the hardest.
Even amid that backdrop, the U.S. still accounts for 20% of the world’s manufacturing activity, McKinsey pointed out. Rather than signaling a slow demise of U.S. manufacturing, a host of factors are converging that could spark a U.S.-based manufacturing resurgence. In fact, the McKinsey report finds that the U.S. could boost annual manufacturing value added by up to $530 billion (20%) over current trends by 2025 and create more than 2 million jobs.
How can the U.S. achieve these targets? With Industry 4.0 technologies (IoT and related technologies) that will enable U.S.-based manufacturers to drive greater efficiencies in production, streamline labor costs and even lay the foundation for developing service-driven business models.
In many ways, manufacturers have been the earliest adopters of base IoT technologies — as operational technologies have for decades used sensors and RFID technology to automate supply chain management and ensure equipment health and safety. With the advent of new IoT technologies, the data from those machines and processes can be used in new ways to drive further efficiencies in asset management and the supply chain.
Taking a closer look, here are some ways industrial IoT technologies can help manufacturers:
Boost efficiencies by reducing downtime. Predictive maintenance is the single biggest driver for industrial IoT implementations. Enabling a piece of crucial equipment to “tell” a plant manager when it deviates from standard temperature or vibration parameters means the team can plan service before the equipment breaks down, potentially avoiding delays in production and all the ripple effects of associated costs. In more advanced scenarios, action can be automated all the way through ordering the parts and scheduling the work order to fix the issue long before it becomes a problem.
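The core of the predictive-maintenance idea described above is simply comparing live telemetry against a machine’s known operating envelope. A minimal sketch, with entirely hypothetical limits and field names (real thresholds would come from the equipment vendor or a model trained on historical telemetry):

```python
from dataclasses import dataclass

# Hypothetical operating envelope for one machine; real limits would come
# from the equipment vendor or from historical telemetry analysis.
@dataclass
class OperatingLimits:
    max_temp_c: float
    max_vibration_mm_s: float

def check_reading(limits: OperatingLimits, temp_c: float, vibration_mm_s: float):
    """Return a list of deviations so the team can plan service early."""
    alerts = []
    if temp_c > limits.max_temp_c:
        alerts.append(f"temperature {temp_c} C exceeds {limits.max_temp_c} C")
    if vibration_mm_s > limits.max_vibration_mm_s:
        alerts.append(f"vibration {vibration_mm_s} mm/s exceeds {limits.max_vibration_mm_s} mm/s")
    return alerts

limits = OperatingLimits(max_temp_c=80.0, max_vibration_mm_s=4.5)
print(check_reading(limits, temp_c=85.2, vibration_mm_s=3.1))
```

In the more advanced scenarios the article mentions, a non-empty alert list would feed a workflow that orders parts and schedules the work order automatically.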
Enable remote monitoring and equipment maintenance. Gartner recently reported that nearly half of the 200 companies it surveyed either had digital twins — which use IoT technology to create a digital mirror of a physical asset — in use or were planning to implement them within the next year. The number of participating organizations launching digital twins will double in 2018, and by 2022, that number will triple. Digital twins enable manufacturers to accomplish two very important goals. They reduce the complexity inherent in industrial IoT development, which can require embedded programming skills not often in rich supply among traditional IT developers. In turn, digital twins can bridge the IT and OT worlds by building a digital thread of sorts, which connects disparate systems to enable visibility and traceability.
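At its simplest, a digital twin is a software object that mirrors the last-reported state of a physical asset so that IT systems can query it without touching the operational equipment directly. A minimal sketch, with a hypothetical asset ID and field names:

```python
class DigitalTwin:
    """Minimal digital mirror of a physical asset: the twin holds the
    last-reported state so IT systems can query it without polling the
    device itself."""

    def __init__(self, asset_id: str):
        self.asset_id = asset_id
        self.state = {}

    def apply_telemetry(self, reading: dict):
        # Merge the latest sensor reading into the mirrored state.
        self.state.update(reading)

    def get(self, key: str):
        return self.state.get(key)

twin = DigitalTwin("pump-17")          # hypothetical asset
twin.apply_telemetry({"temp_c": 71.4, "rpm": 1450})
twin.apply_telemetry({"temp_c": 72.0})  # partial update
print(twin.get("temp_c"), twin.get("rpm"))
```

A production twin would add history, schema validation and the “digital thread” links to other systems, but the mirror-and-query pattern is the same.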
Reduce energy consumption. U.S. manufacturing energy consumption increased between 2010 and 2014 for the first time in nearly a decade, according to data from the U.S. Energy Information Administration. Equipping factories with smart thermostats and smart lights can enable cost savings, and further integrating unstructured data sources can bring even greater reductions in overhead costs. In a use case detailed by Cognizant, IoT-enabled HVAC systems also offer integrated weather data and prediction analysis to help manufacturers understand expenses and plan energy usage.
Training the next generation. With impending retirements of the most experienced workers, U.S.-based manufacturers will face the difficult challenge over the next decade of properly training the next generation for these crucial roles. Companies like Honeywell are working to marry IoT technologies with augmented reality to meet this need. In one technology solution, workers donning wearable sensors will collect data on their job as they move through it in real time, which will then be used to create immersive training and competency development.
By using IoT technologies, U.S.-based manufacturers can lead a data-driven manufacturing resurgence — gaining efficiencies in the way processes are currently managed and uncovering ideas on how to evolve those processes to meet the more service-driven business models the industry is moving toward. IoT enables better visibility into every aspect of the business — something that is crucial for seizing opportunities to digitally transform business models.
5G holds great promise. The next-generation wireless network will enable large-scale internet of things applications and support blisteringly fast data processing across a diverse array of devices on a massive scale. Today’s 4G status quo will not be an option when it comes to supporting low-latency use cases like mobile 4K videos, virtual and augmented reality, autonomous driving, robotics and a plethora of yet-to-be-conceived innovative services.
But the promise of 5G will remain just that without the all-in commitment by mobile operators to use artificial intelligence.
It’s not that mobile operators are currently failing to harness data. Many are processing user data, network data and, increasingly, IoT data. However, the sheer scale of 5G means that current data analytics capabilities must evolve. That means operators must invest in AI if they expect to take full advantage of the potential of 5G.
There are two complementary areas for the application of AI: first, as a productivity driver in the product development process, shortening time to market; and second, to power autonomous intelligent platforms for IoT and other customer-facing services. Together, the tangible benefits of AI are significant for operator business growth, margins and the delivery of the all-important customer experience.
The stakes couldn’t be higher. Here is what AI will mean for mobile operators.
AI brings the ability to enhance the development lifecycle. Integrating machine learning across the DevOps process will result in significant productivity gains and faster time to market. Operators already see significant efficiencies from increased automation across DevOps, but more sophisticated forms of machine learning will deliver improved uptime, predictive defect detection and root-cause analysis. These will come from the integration of AI into specialist machine-learning toolchains that can be used to support many use cases.
Beyond development, AI will play a decisive role in managing network capacity, freeing up resources, improving availability and deepening customer engagement.
Smart thinking in an IoT world
Central to the 5G proposition is the technology’s ability to support the demands of IoT. That is, to provide mass connectivity across diverse devices and process data into business insights in real time. Making good on the promise of smart cities, smart factories, connected cars, telematics and wearables requires AI. Gartner predicted there will be nearly 21 billion connected devices by 2020. A handful of operators are finding ways to define and manage virtual network functions, policies, resource consumption and performance metrics to meet this challenge. Today, there are AI-based platforms dedicated to the management and orchestration of networks capable of these tasks.
Data-driven customer engagement
In a 5G world, operators will be judged on their ability to deliver enhanced, contextual services to customers. Advanced data analytics and AI will be essential for delivering these services, in part, because they identify the bandwidth requirements for the customer experience. An AI-powered management layer translates a customer’s intent and uses that insight to configure the network, optimize resources, remediate issues and launch services based on workload models.
This intent-based layer will control physical and virtual networks and network function virtual orchestrators. Operators can further drive customer loyalty by utilizing AI throughout the customer relationship — for example, by providing service recommendations based on existing usage, telemetry and cutting-edge customer care.
As Gartner advised, operators need to do the groundwork now to ensure they are positioned to roll out effective AI technologies for 5G. This includes investing in skills, processes and tools for setup, system integration, algorithm and approach selection, data preparation and model creation.
The bottom line
As consumers are exposed to an increasing array of AI-based applications, their expectations for rich, immersive experiences will dictate the suite of services network operators must provide. 5G has the power to deliver these experiences, but mobile operators must be realistic about the challenges ahead. They must transition from automation to predictive capacity planning and service delivery. Intelligent resource deployment in areas such as service provisioning, orchestration, network function virtualization and software-defined networking is key.
In the end, the success of a 5G network rollout will depend in large part on the operator’s ability to harness AI across the network value chain.
Amazon has released an unbranded, customizable Dash IoT button under the name Amazon IoT Button. It uses the horsepower of Amazon Web Services to allow you to create out-of-the-box, push-button (literally) functions.
The IoT Button takes the Amazon Dash button and allows a developer to make it do whatever they want. Want a button that sends a text message that you want to play foosball in the break room? Done. Want a button that sends a Slack message that you’ve made a pot of coffee? Done. Want a button that sends a helpful supportive message to someone in your life — an “I’m thinking of you” button? Done.
Part of the beauty of the IoT button is the simplicity of configuration (there’s an app for that — Android and iOS). With these apps you can get the button connected to Wi-Fi, provisioned in Amazon’s IoT platform and wired to basic templated functions. With about five minutes of configuration, you will have your button on IoT and doing something interesting.
Where you take it from there is up to you. Exadel is currently working with a client who has a specific group of users in mind who will benefit from touch-of-a-button functionality. The IoT Button solves some of the tricky problems of bringing push-button functionality to those users: namely, getting the device onto the internet and sourcing the hardware itself.
Traditionally, if you wanted a hardware technology, even for something simple, you needed to build it yourself or find a way to assemble off-the-shelf components into something workable. The time and cost of implementing even something simple can be prohibitive. Now the IoT Button lets you create custom technology, hardware included, as long as what you need the hardware to do maps to “press a button and something happens.” What happens can be customized for any user.
There are other types of buttons out there that can solve this, but they usually have to have some type of hub to enable them to connect to the internet, and then, once you do, you’re still going to have to figure out the plumbing to get the button’s press signal to your back end.
Amazon makes it very easy to plug in the IoT button and connect it to its cloud infrastructure. From there you can connect it to other services easily and with little code. A few sample applications that we built used text messages, email and Slack.
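A button press invokes whatever function you wire up in the AWS IoT console, typically an AWS Lambda function. Below is a minimal sketch of a Python Lambda handler for the Slack case; the webhook URL and message texts are placeholders (the `clickType` field, reporting SINGLE, DOUBLE or LONG presses, is what the button sends):

```python
import json
import urllib.request

# Placeholder: a real deployment would read this from an environment variable.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/EXAMPLE"

# The IoT Button reports SINGLE, DOUBLE or LONG presses in its event payload.
MESSAGES = {
    "SINGLE": "Fresh pot of coffee in the break room!",
    "DOUBLE": "Anyone up for foosball?",
    "LONG": "Thinking of you!",
}

def build_message(event: dict) -> str:
    """Map a button event to the text we want to post."""
    return MESSAGES.get(event.get("clickType"), "Button pressed.")

def lambda_handler(event, context):
    """Entry point wired to the button in the AWS IoT console."""
    payload = json.dumps({"text": build_message(event)}).encode("utf-8")
    req = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # fire-and-forget post to Slack
```

Swapping Slack for text messages or email is mostly a matter of replacing the webhook post with a call to the corresponding AWS service.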
Our client’s use case involves not just the button, but mobile apps and web applications. Pressing the button allows our client’s customer to execute a simple transaction when needed without the need for a sign in, app or any other interaction. Just press the button and things happen!
The signal from the button goes to the back end, where it is processed. Then a real-time communication wakes up an app on the concierge’s phone. The concierge can then take care of everything for them. The button can be given to someone, such as a business owner or administrative assistant — an easy way to make something happen.
Of course, there’s a bit of theatre to this, and that’s the intent. This type of always-on, “press a button and things happen” functionality has yet to become ubiquitous, although we are coming closer. Soon, between connected home devices and one-press functionality, we’ll see this type of thing everywhere.
There’s still a lot of bang for the buck here. The IoT Buttons are $20 right now. It might be expensive if you bought a lot of them, but for a targeted use case like this client, where we’re looking at a few hundred targeted users, it can make a lot of sense versus the hardware development costs or custom configuration that may be needed for a different system.
The beauty of the IoT Button is its simplicity. It can be set up and connected simply, and it can pass a command to a cloud application, allowing action to take place. Because of its simplicity and price point, it can be incorporated into a wide range of applications with relative ease.
What else could it do? Anything that expresses a dedicated, repeatable customer desire — order a pizza, place an order, call a taxi, send a notification, publish a tweet, place a notification on a web page. The sky’s the limit.
As recent security breaches have shown, malicious actors are becoming increasingly clever and sophisticated. They break in through unlocked doors, such as weak or nonexistent passwords on IoT devices like security cameras. As billions of devices are becoming connected (including many that have tremendous impact on business processes or consumer safety), security should become a primary concern. But are we really doing enough to mitigate threats? Are we learning from the painful recent experiences? According to NXC Group, nearly half of companies with an annual revenue above $2 billion estimated that the potential cost of one IoT breach is more than $20 million.
Organizations have historically weighed the cost and time of focusing on security as a deterrent, citing time-to-market delays, design complexity, fragmented ecosystems and more. But as the world moves towards a trillion connected devices over the next 20 years, we as an industry must change our behaviors and agree that security is no longer optional or an afterthought. Users need to know and trust that their IoT devices are born secure, upgradable and managed end to end.
As an industry, we have a social responsibility to build secure devices and maintain a high level of security throughout their lifecycle. We need to make it easy for organizations to securely build, deploy, connect, manage and update devices. This is why Arm has worked with our partners to develop Platform Security Architecture (PSA), the first industry framework for building secure connected devices. PSA provides a common set of ground rules and resources to reduce the cost, time and risk associated with IoT security today, simplifying the security consideration process for device manufacturers, vendors and service providers.
Strong IoT security starts by empowering developers with the tools and system-on-chip (SoC) designs needed to securely develop devices at the very beginning of design, without slowing down time to market. They should incorporate multiple layers of protection, scaling from software to silicon implementations, such as protecting against physical attack threats, which can occur when an attacker has direct contact with the device SoC or is in close proximity to it. Organizations should build technologies into separate partitions that mutually distrust each other in case an attacker breaks into one part of the system.
However, even the highest-integrity devices built with the latest security protocols need updating as both devices and attack techniques evolve. Whether a smart meter or smart streetlight was installed in the field five months ago or five years ago, it will need to be updated as attackers grow more sophisticated and identify new attack vectors over the device lifecycle. When the number of devices deployed can range anywhere from hundreds to millions, over-the-air (OTA) firmware updates become a critical requirement for mitigating new threats. Key considerations when designing OTA firmware updates for secure devices include:
- Space requirements for storing the newly received firmware upgrade;
- Properly checking firmware signatures before installing them;
- Ensuring enough bandwidth to support a firmware download; and
- Determining, for battery-powered devices, how many firmware updates can be supported before the battery is exhausted.
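The second consideration, checking firmware signatures before installing, is the one that stops an attacker from pushing malicious images. A minimal sketch of the device-side check is below; it uses an HMAC with a hypothetical shared key purely to keep the example self-contained (production devices typically verify an asymmetric signature, such as ECDSA, against a public key provisioned at manufacture):

```python
import hashlib
import hmac

# Hypothetical shared key for illustration only; real devices usually hold a
# public key and verify an asymmetric signature instead.
DEVICE_KEY = b"example-device-key"

def sign_firmware(image: bytes) -> bytes:
    """What the vendor's build pipeline would attach to the image."""
    return hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()

def verify_before_install(image: bytes, signature: bytes) -> bool:
    """Device-side check: refuse to install firmware that fails verification."""
    expected = hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()
    # compare_digest avoids timing side channels in the comparison.
    return hmac.compare_digest(expected, signature)

firmware = b"\x7fELF...new-firmware-image"
sig = sign_firmware(firmware)
print(verify_before_install(firmware, sig))          # valid image -> True
print(verify_before_install(firmware + b"x", sig))   # tampered image -> False
```

Only after this check passes should the device write the new image into its update partition.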
True end-to-end security requires a comprehensive IoT device management technology for protecting connected devices throughout their lifecycle once they are deployed. This includes securely provisioning the device once it’s turned on in the field, managing the updates over the air and securing the communication between the device and the data store. Secure communications should be encrypted and based on widely deployed and tested security protocols such as transport layer security (TLS) and datagram transport layer security (DTLS).
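On the communications side, Python’s standard `ssl` module illustrates what “encrypted and based on widely deployed protocols” looks like from a device agent’s perspective: a TLS channel with certificate and hostname verification enabled by default. The gateway hostname below is hypothetical.

```python
import socket
import ssl

def open_secure_channel(host: str, port: int = 443) -> ssl.SSLSocket:
    """Open a TLS connection with certificate and hostname verification
    (both are on by default in ssl.create_default_context)."""
    ctx = ssl.create_default_context()
    raw = socket.create_connection((host, port))
    return ctx.wrap_socket(raw, server_hostname=host)

# Usage (hypothetical endpoint):
# tls_sock = open_secure_channel("device-gateway.example.com")
```

Constrained devices talking over UDP would use DTLS instead, which offers the same handshake and record protection adapted for datagram transports.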
Security is undoubtedly complex and expensive. The industry works relentlessly to consider and try to protect vast attack surfaces, while hackers need to find just one vulnerability to undo all that hard work. It’s time for us to bring security efforts out of the shadows and make it a first-class citizen in our companies. By working together, we can build a safer future for everyone as we move towards a trillion connected devices being deployed over the next 20 years.