Those who follow my articles know that I like to make comparisons between the internet of things and TV shows and movies. For this article, I have selected the famous show The Walking Dead.
When preparing this article, I read “The Real Walking Dead: Surviving the Software-Defined Zombie Apocalypse” by Scott Noteboom and thought, well, I am not alone. Like Scott, I see a lot of similarities between IoT technology and biology.
Many companies are thinking about their survival after the apocalypse that will be produced by the mix of IoT, AI and blockchain. CEOs must make decisions that prevent their companies from disappearing or worse, becoming walking dead. And one of the most important decisions will be to choose their travel companions well in order to build a strong ecosystem capable of resisting the most adverse scenarios one might think of.
IoT technologies and services that companies need to implement to survive the apocalypse are composed of many apparently simple blocks: devices, protocols, edge computing, fog computing, communication networks, platforms, cloud, analytics, AI, machine learning, blockchain, security and applications. But the selection of the vendors and the integration of all of them in business processes and systems is complex, and there are few companies that can boast of having achieved it.
You are probably tired of hearing that IoT is very complicated and the ecosystem very fragmented. You may feel that many companies and products are turning into the walking dead. But no one has a crystal ball to know which IoT companies will be around in 10 years, let alone one, two or three. Some of them are perhaps already in the process of turning, when just a couple of years ago they were in good health and enjoyed the sympathy of the analysts.
If you have been living in a sanctuary, isolated, it will not last for a long time. You will soon receive the visit of survivors and walking dead. You have to decide whether to accept or fight the survivors, and you must protect your community against the zombies.
The good news is that you are not alone. During the last five years I have lived 24/7 by and for IoT. I have been monitoring and analyzing the IoT landscape and have seen many IoT startups appear and some disappear. I have seen large companies make absurd purchases or sell IoT businesses when they have not been able to obtain the expected return. That's why I am able to provide wise advice and recommendations on avoiding being trapped by partnerships with the potential walking dead of IoT and help you build robust and scalable IoT systems.
Do not walk blind alone among The Walking Dead of IoT.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.
When it comes to IoT data, one thing is certain: Real-time analytics in manufacturing and industrial operations continues to accelerate as companies get better results by including sensor-based/IoT device data in their decision-making process — and there is no slowing down in sight. Insights gleaned from IoT data are changing business models by automating processes, improving efficiency and driving revenue, and no other sector has been impacted by the IoT transformation more than manufacturing.
IoT has been a core component of industrial transformation efforts across the globe, with Industry 4.0 and the industrial internet (with the Industrial Internet Consortium). Smart manufacturing uses IoT data to reduce downtime, increase productivity and create efficiencies in manufacturing operations. The greatest value being experienced from IoT is stemming from monitoring industrial operations, solving complex logistics and predicting the downtime of equipment. However, the challenge of data integration still remains.
Operations teams work with diverse technologies, supporting a wide range of data formats and communications protocols. It is challenging to gain a holistic real-time view of information contained in critical industrial assets and across disparate systems, often leading to decisions being made on intuition rather than data. This lack of real-time insight contributes to a reactive approach to managing operations, creating siloed data, system downtime and lack of agility in responding to changing business needs, all resulting in millions of dollars in lost revenue and lost business opportunity.
For operations professionals, plant managers and process engineers responsible for monitoring and maintaining critical industrial assets, the key to success is being able to easily combine and correlate data from disparate sources, such as industrial control systems (SCADA), sensors, applications, infrastructure and IT systems, and deliver valuable new analyses into asset health. These new insights are driven by real-time and predictive capabilities to help businesses rapidly identify and diagnose issues such as equipment failure, reduce downtime and optimize operations.
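To make the idea concrete, here is a minimal sketch of that kind of cross-source correlation, assuming two hypothetical feeds (a SCADA temperature tag and a vibration sensor) sampled at shared timestamps; the field names and thresholds are illustrative, not from any particular product:

```python
# Hypothetical readings from two disparate sources, keyed by timestamp (seconds).
scada = {0: 71.0, 60: 73.5, 120: 90.2}       # SCADA temperature tag (degrees C)
vibration = {0: 2.1, 60: 2.3, 120: 7.8}      # vibration sensor (mm/s)

TEMP_LIMIT, VIB_LIMIT = 85.0, 5.0            # assumed health thresholds

def correlate(scada, vibration):
    """Join the two streams on timestamp and flag unhealthy intervals."""
    alerts = []
    for ts in sorted(set(scada) & set(vibration)):
        t, v = scada[ts], vibration[ts]
        # Requiring both limits to be exceeded filters single-sensor noise.
        if t > TEMP_LIMIT and v > VIB_LIMIT:
            alerts.append((ts, t, v))
    return alerts

print(correlate(scada, vibration))  # flags only the 120 s sample
```

A real deployment would have to align streams with differing sample rates and use learned baselines rather than fixed limits, but the join-then-flag pattern is the same.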
In the past, IoT meant retrofitting sensors to existing manufacturing equipment. Today, in a world where new manufacturing equipment comes with IoT sensors preinstalled, the potential for transformation is tremendous. Sensors provide voluminous information on every aspect of industrial assets, while simultaneously enabling people and programs to make adjustments to the operation of these assets for performance optimization.
As IoT continues to become more pervasive, organizations will invest in improving operations and driving predictability in equipment downtime and processes across manufacturing, mining, oil and gas, power generation, utilities and rail. The question they face next is: Are these smart manufacturers prepared to deal with IoT’s data challenges, such as privacy, storage management, hardware and effective data mining?
There is a lot of talk about how IoT will alter the way consumers interact and how businesses will function. But the truth is, most activity to date has been around IoT (consumer) and not EoT (enterprise). EoT is different. Companies deploying EoT must concentrate on devices that are in place for long duration (seven to 10 years) and not consumer-level throwaways. They must also be manageable, like other corporate technology assets, and many devices will have user interfaces of some sort requiring organizations to have an extended endpoint app strategy. All of these issues are important and require careful planning. But perhaps the biggest challenge will be in securing the devices and their interactions with back-end corporate systems.
Many organizations have deployed some level of “things” into their operations. Indeed, I estimate that as many as 75% to 80% of enterprises already have some form of devices deployed. Yet, I also estimate that no more than 10% to 15% of these “things” would meet acceptable standards of corporate security if they were typical enterprise-grade devices (e.g., smartphones, PCs or even server assets). This is a major problem, as with such limited security it’s relatively easy for hacks of devices to be initiated — and with potentially disastrous consequences. If a PC is hacked and crashes, it’s annoying. If an EoT device gets hacked and crashes, physical property and/or people are at risk.
In this short article, it’s impossible to talk about all of the issues regarding security in EoT. It requires a concerted effort to secure the things at the hardware level first (many of which are built on old and potentially insecure technology). But it’s also important to have a network infrastructure for these devices that can monitor and remediate any potential threats. And it’s equally important that proper management tools be implemented that can properly deploy, monitor and control these devices (similar to what’s done today for smartphones and PCs).
A useful model to look at for working with EoT, though not totally the same, would be the mobile environment where in the early days, the proliferation of smartphones was essentially equivalent to anarchy. There was little management and a minimal level of security. In fact, corporate IT was often not even involved. It took several years for the security and management of these devices to catch up to the level required in the enterprise. The good news is that using the smartphone as a model, the amount of time and effort to secure EoT can be much shorter. Indeed, several of the mobile security and management companies are already moving down the EoT path with a unified management approach (e.g., BlackBerry and Citrix), and this will benefit organizations in the short term.
There is much more to say about this and in subsequent articles, I’ll discuss more specifics. But for now, enterprises should be putting a strategy in place that deals with the inherent insecurity of the many devices now being put in place, and the plethora of devices that will appear in the next two to three years. If you don’t, it’s certain you’ll pay a significant price.
The growing use of IIoT worldwide is in the process of turning the concept of smart factories into reality. The phenomenon has been described by observers as the fourth industrial revolution, or Industry 4.0 for short.
Industry 4.0 promises to combine digital technologies — such as big data, data analytics, artificial intelligence and machine learning — with all-pervasive internet connectivity to produce vast quantities of valuable data. Companies mine, analyze and convert the information into a wealth of insights and then use the knowledge to boost factory productivity, increase supply chain efficiency and make substantial cost savings.
As always, new trends bring about new security challenges. Though connecting industrial machinery to the outside world can be risky, the deployment of virtual private networks (VPNs) can ensure that Industry 4.0’s data treasures stay hidden from unwelcome observers.
Measuring IIoT success
To take their industrial processes to a whole new level, manufacturers everywhere are placing great faith in IIoT’s potential.
In its 2016 Global Industry 4.0 Survey, PwC expected manufacturing organizations to commit $907 billion a year — around 5% of revenue — to developing Industry 4.0 applications. It was also predicted that firms investing in Industry 4.0 would achieve ROI within two years.
In 2017, there were an estimated 9 billion connected IoT devices — one-third of which were industry-related, according to Gartner. The electronics industry leads the way with investments totaling $243 billion a year through to 2020, closely followed by engineering and construction ($195 billion) and industrial manufacturing ($177 billion). Industrial cost savings are anticipated across the board by 2020.
PwC’s study interviewed 2,000 respondents drawn from aerospace and defense, automotive, chemicals, electronics, engineering and construction, paper and packaging, industrial manufacturing, metals, and transportation and logistics. Cost reductions across the nine industries have a weighted average of 3.6%. In financial terms, that’s a savings of $421 billion.
The most effective cost-saving method to emerge from Industry 4.0 is automated inventory management, which enhances both the supplier’s and buyer’s business performance.
Aside from obvious ROI results, investments in Industry 4.0 will be measured by their impact on a range of factors including corporate vision, management strategy, process innovation, productivity and adoption of internationally recognized industry standards. A recent study from Bsquare revealed that manufacturers are most commonly adopting IIoT to handle logistics challenges (95%), followed by machine health (82%) and operating costs (34%).
Despite all its benefits, there’s one aspect of Industry 4.0 that still concerns IT managers. Factory machines connected to the internet may attract unwelcome attention from treasure hunters seeking to intercept the data for their own profit.
It is estimated that manufacturing businesses gather around 2 exabytes of data every day. Sensors built into all kinds of devices, from motors to conveyor systems, alongside other factory assets, pump out data continuously. In some cases, it is needed for feeding into related systems such as ERP, line-level programmable logic controllers and human-machine interfaces. In others, it is needed for analysis by complex data analytics and AI systems that ultimately lead to better-informed decision-making.
The security dilemma can be handled in a number of ways, although almost all of them constrain ease of use. Furthermore, devices are under constant threat from new malware variants and must be capable of secure remote management so that they may be kept up to date with the latest patches and software upgrades.
Centrally managed VPNs
Essential machine-to-machine communications security is achievable by utilizing remote VPN services. Centrally managed VPN connectivity is a positive first step in shielding valuable IIoT data from prying eyes. It allows device software to be patched and upgraded remotely in a secure and cost-efficient manner. Other qualities include device authentication for verification purposes along with logging and alerting capabilities to notify IT admins of unauthorized access attempts or tampering.
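The logging-and-alerting piece mentioned above can be sketched in a few lines. This is an illustrative sketch only — the device registry, tokens and alert threshold are assumptions, not any VPN vendor's actual API:

```python
from collections import Counter

# Assumed registry of devices authorized to join the VPN.
authorized = {"pump-01": "token-abc", "valve-07": "token-xyz"}

failed = Counter()        # consecutive failed attempts per device
ALERT_THRESHOLD = 3       # assumed: notify admins after this many failures

def authenticate(device_id, token):
    """Verify a device before admitting it; alert IT admins on repeated failures."""
    if authorized.get(device_id) == token:
        failed[device_id] = 0          # success resets the failure counter
        return True
    failed[device_id] += 1
    if failed[device_id] >= ALERT_THRESHOLD:
        print(f"ALERT: {failed[device_id]} failed attempts from {device_id}")
    return False
```

In practice this check would sit behind certificate-based authentication on the VPN concentrator, with alerts routed to a monitoring system rather than printed.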
In conclusion, Industry 4.0 is turning manufacturing and utilities companies into digitally driven, data-centric powerhouses that are entirely dependent on the constant mining of valuable information to stay efficient and competitive. Both customers and suppliers, however, recognize that more needs to be done to realize the full potential of Industry 4.0.
Device manufacturers must develop products that adhere to industry approved security standards, while customers need to carefully balance their rollout of Industry 4.0 against its implications for security practices. Industry 4.0 is expected to evolve rapidly, starting with localized solutions before expanding outwards to encompass cloud applications and secure connectivity between manufacturers and their supply chains.
VPNs allow device authentication, updating and patching to be managed remotely from a central point — helping manufacturers shield Industry 4.0’s treasures from the unwelcome attention of cybercriminals and industrial spies.
We have roared into 2018 — at least as far as the weather is concerned. Massive snowstorms in the Alps, heat in Australia, ice storms in the south of the United States, bombogenesis in the northeast, and the list goes on. Clearly, 2018 has announced herself — time to buckle in.
One thing about the weather is it is not always easy to predict. Sure, we have an idea of whether it is going to rain or snow in the coming days and can rely on historic information to project what the temperature range will be in March or May. But when it comes to the sudden and unexpected — think blizzards, tornadoes, hurricanes or even the stratospheric effects of a distant volcanic eruption — we really cannot predict when, how, where and to what degree the weather will disrupt our lives. This unpredictability disrupts all of our lives and can have a profound effect on our supply chains and other business processes that depend on long-term planning and orchestration among multiple trading partners.
But guess what? In 2018 and beyond we will be increasingly leaning on the internet of things to improve our ability to sense and respond to outages and issues that may arise due to occurrences such as massive snowstorms or record heatwaves. So, what will IoT hold for us in the new year?
- IoT’s presence continues to increase within our homes. Just look at the sales of voice-activated home assistants such as Google Home or Amazon’s Alexa, the latter of which sold close to 25 million devices through the third quarter of 2017. As an increasing number of homes have these IoT-enabled smart devices, how soon will the number of tasks they carry out lead to automating various aspects of our lives? Will Alexa or Google Home be the controlling hub for our smart kitchen appliances, smart garage or even smart bathrooms? We have already seen IoT-enabled refrigerators from the likes of LG and Samsung, toothbrushes and lighting from Philips, smart mattresses by Seebo and, of course, the IoT-enabled thermostat, Nest. The home will continue to become more connected, and players such as Amazon, Google and Apple will battle to be the controlling hub of this universe. Some of the use cases for these devices might not be immediately evident to the consumer, but as more aspects of the home become connected, vendors will campaign to make consumers wonder how they could ever have lived without their Roomba being able to connect to the web.
- Smarter IoT … how about AI and IoT? Why not. In 2018, the dividing line between IoT and artificial intelligence will become increasingly harder to distinguish. The two technologies are made for one another. All the data produced by IoT devices will overwhelm any traditional data analytics model, let alone human analysis. This will make the usage of AI that much more vital. There is a reason why the likes of Google, Apple, Amazon and Facebook, to name a few, are investing so heavily in artificial intelligence. As the world continues to create mountains of data, much of which is messy and unstructured, these firms realize they need the engines that can turn that oil (data) into fuel (something useful). Look for not only more investments in AI in 2018, but a continued partnering of AI with IoT. A great example of this is the likes of Under Armour buying workout applications such as MapMyRide. This marriage of product with technology brings together meaningful data and a purpose-built product like a running shoe, and by adding in a layer of machine learning to translate how using that product is affecting a person’s life, Under Armour can give its customers new context and an overall better experience. The ability to collect an increased amount of data is not helpful in isolation — there needs to be some layer of intelligence that makes data meaningful.
- Driverless vehicles continue to push our imagination. Notice we said driverless vehicles, not simply cars; the broader category will continue to inch along in 2018. Later this year, we expect to see a fully automated, captain-less sea vessel take to the ocean. The Yara Birkeland is set to sail, transporting fertilizer, and will do so without a crew. We have already seen experiments with driverless trucks in Nevada. Uber has also been looking to use driverless cars and is starting to experiment with them in the Steel City, Pittsburgh. While we do not expect that your next Lyft or Uber ride will be conducted without a human driver, there is a strong chance that 2018 sees more IoT-enabled vehicles in areas such as shipping and trucking. Most likely they will run on routes that are lightly trafficked — we do not expect to see any driverless trucks on I-75 at rush hour trying to cross Atlanta!
- Speaking of driverless trucks, IoT grows within our supply chains. Connected factories and robotics are nothing new in the world of supply chain, and we will only see more IoT-enabled connectivity in 2018. Think of connected inventory, connected distribution, connected infrastructure and connected stores. All these attributes will make supply chains smarter. Moreover, it is not just about adding more connectivity within factories or in transportation; IoT will make the overall supply chain more connected, enhancing visibility and enabling more agility and responsiveness from one end to the other.
- Privacy concerns will not go away. Data privacy and information security will continue to be topics of high concern for the public — and should be for anyone working in the technology space. IoT-enabled devices will continue to be ripe targets for hackers, as IoT-enabled device creation seems to outpace the security layers that need to run in parallel. Expect 2018 to have some major data breaches and hacks that spread short-term panic, whether it is another hack of an IoT-enabled vehicle or a data breach that originated from a connected HVAC system. Companies that are using IoT need to put together contingency plans for this occurrence, constantly test their networks for vulnerabilities and strive to have the proper countermeasures in place. In 2018 and beyond, it is not a matter of “if” but rather “when” some hacker, whether from a foreign country or someone living in their parents’ basement, tests the IoT vulnerabilities.
As a technology, IoT has lost some of its buzz over the past year, thanks to the likes of blockchain and AI. Still, the presence of IoT will only continue to grow in 2018 as we move beyond imagining what is possible and begin to inject this connected technology into more of the products we use every day — whether as consumers or in business.
Many connected industrial devices, like robots, motors, drives, controllers or medical devices, typically generate revenue through a traditional hardware-based monetization model. That means customers buy the device and might also sign up for a services or maintenance package. Yet, that model doesn’t offer many opportunities for growth over the lifecycle of the customer. And it doesn’t reflect where the true value is today — software, data and digital services. Hence, many traditional hardware companies are re-evaluating their monetization strategies. Software-based services enable IoT producers to bundle product offerings, services and feature sets in new and creative ways that generate incremental new revenue streams in the form of subscription or pay-per-use models.
Survey says: IoT producers rapidly embrace service revenue streams
A recent IDC survey (registration required) reported that 38% of IoT producers currently derive half or more of their revenue from hardware, but that proportion will decrease to 33% within the next two years. In contrast, 32% of respondents derive half or more of their revenue from services, and that figure will increase to 38% within the next two years.
Interestingly, 32% of respondents currently use software to electronically turn features on and off based on one-time purchases or subscriptions. Among those not yet using software to activate features, 19% plan to do so within two years. Other IoT producers use data from IoT device sensors to uncover new service opportunities (31% currently and 29% within two years), such as repairs and sales opportunities (29% currently and 27% within two years). Respondents also indicated they’re making investments to be able to manage the licensing and entitlement of these services-based revenue streams. Sixty percent use licensing and entitlement management systems to develop new offerings that bundle device, services and/or consulting, and 17% more plan on doing so within the next two years.
Subscription and usage-based models pose opportunities and challenges
Applications, features and services for IoT devices are commonly offered as a subscription or usage-based revenue model, and both have benefits and drawbacks. Subscription models work best for defined resource pools and steady workloads. They offer predictable revenue for producers and expenditures for customers. Pay-per-use models work best for elastic workload demands and for applications that need access to an unlimited resource pool. Revenue and expenditures are based on consumption, and can vary widely in each billing period.
For instance, PolySync — a company that offers a platform that manufacturers, researchers and suppliers use to test autonomous vehicle systems — monetizes via a subscription model. Prospects typically enter into a relationship with PolySync with a free, 30-day trial. When the trial is over, users can choose a basic, plus or pro subscription, and PolySync can turn special features on or off based on the level of the subscription. PolySync’s software-based system eliminates the need to upgrade the IoT hardware, which means the company can easily introduce new, premium features, as well as push out critical security patches and fixes.
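A minimal sketch of that kind of software-based feature gating might look like the following; the tier names echo the basic/plus/pro levels described above, but the feature names and mapping are invented for illustration, not PolySync's actual product:

```python
# Assumed mapping from subscription tier to the features it entitles.
ENTITLEMENTS = {
    "basic": {"record", "replay"},
    "plus":  {"record", "replay", "export"},
    "pro":   {"record", "replay", "export", "multi_vehicle"},
}

def is_enabled(feature, tier):
    """Gate a feature purely in software, so a tier upgrade needs no hardware change."""
    return feature in ENTITLEMENTS.get(tier, set())

assert is_enabled("export", "plus")            # unlocked at the plus tier
assert not is_enabled("multi_vehicle", "basic")  # premium feature stays off
```

Because the gate lives in software, introducing a new premium feature is just a new entry in the entitlement table pushed out alongside the next update.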
Define metering standards for transparent pricing
Both subscription and pay-per-use revenue streams utilize meters to authenticate registered users and track usage. While per-user models are prevalent for SaaS offerings, there’s no standard meter in IoT when it comes to monetizing devices and the software on them. Meters depend on the business model and can be based on devices, users, transactions or the actual usage, like miles driven, gigabytes stored, products produced (in a factory environment), energy saved (in a building automation environment) or magnetic resonance imaging scans (in a medical company).
As IoT companies define their digital business models, the best strategy for monetization is to measure multiple factors to define the best meters on which to create transparent pricing models so that customers can see on an automated, self-serve basis how much they are using. Innovative IoT producers are already evaluating many different meters as a basis to make future pricing decisions, or to monitor license compliance and usage.
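As a toy illustration of metering multiple factors, the sketch below accumulates usage against several candidate meters and prices each one transparently; the meter names and per-unit prices are assumptions for the example:

```python
from collections import defaultdict

# Assumed per-unit prices for a few candidate meters.
PRICES = {"miles_driven": 0.02, "gb_stored": 0.10, "scans": 5.00}

usage = defaultdict(float)

def record(meter, amount):
    """Accumulate raw usage per meter as events arrive from devices."""
    usage[meter] += amount

def invoice():
    """Price each meter separately, so the customer can see what drove the bill."""
    return {m: round(usage[m] * PRICES[m], 2) for m in usage}

record("miles_driven", 1200)
record("gb_stored", 50)
print(invoice())  # {'miles_driven': 24.0, 'gb_stored': 5.0}
```

Keeping raw usage and prices separate like this also makes it easy to trial a new meter: record against it for a billing period, inspect the resulting line item, and only then decide whether to charge for it.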
Thinking IoT? You have to think SaaS.
Most people think about devices, connectivity and data coming from edge devices in the field when they think about IoT. Not everyone thinks about SaaS straight away, but the truth is that every true IoT system also includes a SaaS application. SaaS offerings are usually the key to data, analytics and remote monitoring. That’s where the value is in IoT.
However, the world of SaaS offerings, subscription and recurring revenue is somewhat new to many traditional hardware companies. The success of a SaaS application depends on the value customers get out of it. Hence it requires an ongoing customer relationship. Producers that fail to monitor customers’ success and adoption of their SaaS apps put their business at risk because it will become hard to predict renewals and fight churn.
For SaaS offerings, subscription and usage-based models are the monetization standard. When IoT companies think about monetization models, they might include their SaaS offering in a bigger IoT subscription package comprising the device, the SaaS offering and data-driven services, or they might monetize their SaaS offering independently from their hardware business — both options are seen on the market.
‘Everything as a service’ — IoT is transforming objects into services
In the first wave of IoT, device makers were fairly limited in terms of monetization options. Their primary revenues derived from the sale of the hardware device and, perhaps, maintenance revenue associated with the purchase. The next big revenue opportunity lies in IoT services delivered via software, which makes devices capable of providing “everything as a service.” While few standards exist, there are many options to incorporate IoT meters that track usage, time, number of transactions, bytes transferred and other metrics that can monetize appified, task-based usage of software. In addition, IoT services can be monetized via subscription. IoT producers must choose whichever appropriate licensing model enables payment based on how the software is used in performing tasks deemed valuable by users.
The start of a new year always provides companies with a great opportunity to reflect on the challenges they’ve faced in the past 12 months and take stock of the lessons they might have learned from them. While 2017 has seen a large number and variety of cyberthreats hit companies around the globe, this trend is set to continue into 2018 and organizations should re-evaluate their practices now if they want to avoid being caught up in cyberattacks this year.
In 2017, contrary to what many were expecting, it wasn’t the development of sophisticated attacks that troubled Europe, but rather common, known vulnerabilities. For example, in February, bad actors took advantage of Spiral Toys, which had left 800,000 customers’ details and recorded conversations online and unencrypted. The product that left these details so vulnerable was an IoT teddy bear. Given how accessible known vulnerabilities are to both organizations and hackers, it is critical for enterprises to practice good cyber hygiene to prevent hackers from finding a weak link and exploiting a given vulnerability before it can be patched or updated.
Below, we reveal what we foresee could happen next…
… from Ayelet Kutner, vice president of engineering, Innovation Center
OT cyberattack that ruffles calm waters. My crystal ball is telling me that, as some organizations are slow to adopt IoT/OT security solutions, in 2018 we may see an attack on a large operational technology network. It won’t necessarily be sophisticated, but it will be enough to significantly disturb the normal business operations of an electrical company, water facility or similar in the U.S. or EU, or to impair an organization’s ability to provide services or get paid by its customers. It may be brute force or a distributed denial-of-service attack.
… from Jon Connet, senior director of corporate strategy, Business Development
The first IaaS breach, the rise of cyber-physical threats and pervasive encryption. We’ll likely see the first major data breach of an infrastructure-as-a-service vendor, as attackers and threats will continue to follow the data and money. The rise of cyber-physical threats will drive explosive growth in the cybersecurity insurance market. We will see progress on cybersecurity becoming a board-level issue, especially in the industrial space. Additionally, encryption will likely become increasingly pervasive in the corporate and consumer world, but advances in quantum computing may start to erode the effectiveness of traditional encryption approaches at the nation-state level.
… from Bob Reny, principal systems engineer
The intensity level for hacks will continue to rise. We’ve seen a major credit organization hacked, and I expect the intensity of these hacks to only increase moving forward. Sadly, we may see someone harmed or killed because of rogue code on an IoT device, making it the first death by a remote hack. I also expect to see a change in how artificial intelligence starts to interact with people. Chatbots will become aware and start stealing data.
It seems that everybody is talking about artificial intelligence and machine learning these days. But machine learning is not a new concept — it’s actually been around since the late 1950s. So why all the hype? Why do businesses feel like now is the time to adopt? Before businesses start to develop a strategy around machine learning and AI, it’s important to review how machines really learn, and how this can impact your AI and machine learning strategies.
To start, there’s been a lot of discussion around the difference between AI and machine learning, and even the term analytics has become very nebulous. What’s important from a business perspective is not the technical definitions, but the value these technologies bring you. All of them are powered by data, and the goal for a business is simply to do what it takes to move from data to insight to outcomes as efficiently as possible.
One area where there is a clear connection between data, insight and outcomes is in the field of maintenance. People have long relied on routine maintenance as an attempt to ward off potential problems. Until recently, we changed our oil every 5,000 miles. Not because there was an indication of failure, but because we were too reliant on routine. In short, we didn’t have the data to tell us whether we needed to change the oil. In this case, it wasn’t a major problem since it’s fairly inexpensive to get your oil changed, but imagine if you had thousands of machines or extremely expensive machines? Taking a single expensive machine offline, like an aircraft engine, when it’s not necessary or managing thousands of machines using a static maintenance routine are both problematic. Read: You’re wasting money when you’re making maintenance changes based on routine. Instead, businesses need to start using data to move away from routine maintenance, which will help save money in the long run.
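The shift from routine to condition-based maintenance can be reduced to a simple decision rule. The sketch below is illustrative only — the viscosity metric, threshold and mileage backstop are assumed values, not a real maintenance standard:

```python
def needs_oil_change(viscosity_index, miles_since_change,
                     viscosity_floor=0.6, hard_limit_miles=10000):
    """Decide on measured condition (lower viscosity index = more degraded oil),
    with a mileage backstop, instead of a fixed 5,000-mile routine."""
    return viscosity_index < viscosity_floor or miles_since_change > hard_limit_miles

assert not needs_oil_change(0.9, 6000)  # healthy oil past the old 5,000-mile mark
assert needs_oil_change(0.5, 4000)      # degraded oil caught early
```

Even this trivial rule captures the economics: the first case skips an unnecessary service, while the second catches degradation a fixed schedule would have missed; scaled across thousands of machines, that is where the savings come from.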
The large volume of data presents a tremendous opportunity, but it can be hard to take full advantage. After all, you don’t simply need more data to guide your decisions — you’re likely drowning in that already. In fact, one of the biggest questions that you may be facing as a business leader is how to utilize the data you’re already generating on a daily basis.
Perhaps you’re already employing data scientists to build the models you need to make sense of it all, but they can only process a fraction of the data that is flooding in. Hiring more data scientists is not only expensive, but there is such a shortage that hiring enough may be a logistical impossibility. A better solution is to utilize a machine learning platform to remove this burden from the team processing your data, allowing them to focus on making decisions while the platform focuses on analysis at scale.
As we covered above, more data is readily available than ever before. Beyond the IoT sensors that constantly generate machine data, there’s also behavior data from customers and users, and business transaction data in an ERP or CRM itching to be analyzed. Data is in such abundance that the level of contextual insight that can be derived is phenomenal.
Humans, however, have a hard time processing data at this level. The average human is capable of processing several dimensions at once — but when too many variables are introduced, even the most intelligent among us can’t process it all. We start eliminating variables, or we generalize, which reduces the accuracy of our analysis.
Machines, on the other hand, do not have this limitation. Computers boil everything down to binary decisions — yes or no states, 0s and 1s — and eliminate nothing. Mathematical models can then be designed to simulate real-world behaviors, and the computer can run the entirety of the data input against the model to yield a result that is as accurate as possible.
The key then to producing accurate results lies in the model. Ordinarily the creation — and critically, the frequent tuning — of a model is a complex task, requiring the painstaking labor of a team of trained data scientists. To keep the model up to date, it must take in the very latest and most relevant data, and the new analysis must be compared with past analysis and results, leading the data scientist to a series of conclusions that allow them to further increase the accuracy. This can be a difficult and time-consuming process.
Fortunately, it’s all math, and we can use machine learning to automate much of this process. I know it sounds strange, but we are basically using machine learning to automate the machine learning process! Why rely on data scientists to do the manual work of running algorithms and deciphering which set provides the best prediction? A computer can automate the testing of the models without limiting the tests to a subset of the data, and it can do this continuously in order to respond to operational or environmental changes. While it appears as if the machine is learning, it is merely tuning the models until they yield the most accurate result. This approach frees your data scientists to analyze outcomes rather than spend all their time on routine work, and it enables the average business user (like me) to make sense of the predictions even without understanding the math. So while there’s no actual “learning,” the magic is in the math, and that math is now available to the average organization.
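The automated model-selection loop described above can be sketched as follows. This is a toy illustration with synthetic data and two deliberately simple candidate models; real platforms run far more candidates, continuously and at scale.

```python
# A toy sketch of "machine learning automating machine learning":
# fit several candidate models, score each on held-out data, and
# keep whichever predicts best. Data and models are illustrative.

def fit_mean(xs, ys):
    """Baseline model: always predict the training mean."""
    m = sum(ys) / len(ys)
    return lambda x: m

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return lambda x: a * x + b

def select_model(train, holdout, candidates):
    """Fit every candidate, score on held-out data, return the best."""
    xs, ys = zip(*train)
    hx, hy = zip(*holdout)
    def error(model):
        return sum((model(x) - y) ** 2 for x, y in zip(hx, hy))
    fitted = [(name, fit(xs, ys)) for name, fit in candidates]
    return min(fitted, key=lambda nm: error(nm[1]))

# Synthetic sensor data with a clear linear trend:
train = [(x, 2 * x + 1) for x in range(20)]
holdout = [(x, 2 * x + 1) for x in range(20, 25)]
name, model = select_model(train, holdout,
                           [("mean", fit_mean), ("linear", fit_linear)])
print(name)  # the linear model wins on held-out error
```

The point is that choosing the winner is pure arithmetic over held-out error; no human needs to eyeball the candidates, which is exactly the routine work the article suggests automating.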
What do you think of when you think of a computer? Most people envision a laptop or desktop station in an office. Some people might think of a server in a data center. Some people might even recognize that mobile devices, like smartphones and tablets, are actually just small, portable computers. How many people think of a connected car?
The car is emerging as the next personal computing platform. Many people have heard that the processing power in an iPhone is greater than that of NASA’s Apollo computers of the 1960s. But the processing power onboard a car is an equally remarkable growth story. The computing power onboard a typical car today has increased 10 million times over 1990 levels, a pace faster than Moore’s Law. In fact, the amount of computation onboard a modern vehicle is growing at roughly an 80% compound annual growth rate (CAGR).
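Those two figures are consistent with each other, which is easy to check. The arithmetic below assumes a span of roughly 27 years (1990 to the article's present); that span is an assumption, not something the text states.

```python
# Sanity-checking the growth claim: does an ~80% CAGR compound to
# roughly 10 million times over the ~27 years since 1990?
# The 27-year span is an assumption based on the article's timeframe.

years = 27
growth_factor = 10_000_000  # "10 million times" over 1990 levels

# The CAGR implied by that factor over the period:
cagr = growth_factor ** (1 / years) - 1
print(f"implied CAGR: {cagr:.0%}")

# And the other direction: 80% per year, compounded for 27 years:
factor = 1.80 ** years
print(f"1.8^{years} = {factor:,.0f}")  # on the order of 10 million
```

Both directions land in the same place: a ten-million-fold increase over that span implies a CAGR in the low 80s, and 80% compounding for 27 years yields a factor in the millions, so the article's two numbers hang together.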
Connected cars and autonomous driving technologies are driving incredible value for the automotive industry and have attracted significant investments to “auto tech” in both the private and public markets. Concomitant with this growth is a fundamental shift in the data platforms used to support connected car applications because of their unique requirements. If we look at connected cars and autonomous driving as the next killer app, we can learn from the platform shifts that they are precipitating as harbingers of even greater movements to come. In particular, the data platforms used to support connected car applications must support:
- Real-time processing at scale. Increasingly sophisticated advanced driver-assistance systems (ADAS) need to process large volumes of data onboard the vehicle in real time to provide features such as collision avoidance, automatic braking and adaptive cruise control. The need is compounded as the car achieves greater levels of autonomous driving capabilities. An autonomous driving car gains a holistic understanding of the vehicle’s position and circumstances by combining multiple sensor outputs from devices including radar (10-100 KB/s), sonar (10-100 KB/s), GPS (50 KB/s), cameras (20-40 MB/s) and lidar (10-70 MB/s). In total, about 4 TB of data are generated and processed onboard the vehicle for every autonomous driving hour. The data platform, therefore, needs to support true real-time data processing and decision making (e.g., braking or accelerating).
- Machine and deep learning. While some of the systems onboard the vehicle utilize human-curated rules that help the vehicle make decisions quickly while on the road, there is an increasing emphasis on using machine learning and deep learning to make better decisions in real time. For example, pedestrian detection is difficult to implement using a rules-based system; instead, cars use deep learning models that do semantic segmentation of real-time dashboard-mounted camera feeds in order to detect pedestrians. This shift toward using machine learning requires the use of emerging software frameworks — like Caffe2 or TensorFlow — and there will likely be many more new entrants to come. Moreover, the process of training and deploying machine learning models has precipitated new, iterative development processes that require massive volumes of training data and the close collaboration between data scientists, application developers, data engineers and governance professionals. Data platforms supporting these applications need to support an incredibly broad variety of processing engines and data types and need to facilitate a complex application development process with as little friction as possible.
- Distributed computing. With increasing computational capabilities onboard the car itself, coupled with internet connectivity, the modern car is the ultimate edge processing device. In addition to the real-time ADAS functions onboard the vehicle, the car sends relevant summary information to a centralized fleet management application in a data center or cloud, where it is aggregated across many vehicles to analyze fleet performance and anticipate maintenance issues. In many instances, the data movement between the car and the data center/cloud must be bidirectional so that machine learning models can be rescored and improved over time through experiments in the data center and then seamlessly redeployed to the vehicles. Vehicle-to-vehicle functionality will further require that cars communicate in a peer-to-peer network supporting omnidirectional data movement. While the 2000s might be termed the cloud era, the shift exemplified by connected cars is indicative of the rapid growth in total processing being done outside of any physical data center or public cloud environment. Consider that, compared with the 80% CAGR of computation onboard a car, industry estimates from IDC, Gartner and Wikibon put the growth of public cloud computing at a 16% to 19% CAGR. The data platforms that support connected car applications have to be infrastructure agnostic and have to support continuous, coordinated data flows, seamlessly moving data and compute between the data center and/or cloud and the vehicles.
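The per-sensor data rates quoted above can be aggregated to see how quickly the volume adds up. The sensor counts in this sketch are illustrative assumptions (the text gives rates per sensor, not how many of each a vehicle carries), and the rates are taken at the upper end of the quoted ranges.

```python
# Aggregating the per-sensor rates quoted above for a hypothetical
# autonomous-driving sensor suite. Counts per sensor type are
# illustrative assumptions; rates are upper ends of the quoted ranges.

KB, MB = 1_000, 1_000_000

suite = {
    # sensor: (count, per-sensor rate in bytes/sec)
    "radar":  (4, 100 * KB),
    "sonar":  (8, 100 * KB),
    "gps":    (1, 50 * KB),
    "camera": (6, 40 * MB),
    "lidar":  (2, 70 * MB),
}

total_bps = sum(count * rate for count, rate in suite.values())
per_hour_tb = total_bps * 3600 / 1_000_000_000_000

print(f"{total_bps / MB:.0f} MB/s, ~{per_hour_tb:.2f} TB per driving hour")
```

Even before any intermediate processing artifacts, the raw sensor streams for this modest suite already reach terabyte scale per driving hour, which is why onboard real-time processing, rather than shipping everything to the cloud, is the only workable design.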
As the car emerges as the next great consumer platform, this new vanguard of truly disruptive applications is motivating significantly new capabilities at the data platform level. As computation and data processing become ubiquitous, with an increasing proportion taking place on the car itself, the primary challenges will be to manage the massive volume and velocity of data generated by the car’s different sensors, to facilitate real-time processing of that data using emerging computational frameworks, including machine learning, and to connect the car seamlessly with the data center. It’s no surprise, then, that, as with smartphones before, data management platform providers sit squarely at the center of value creation in the connected car ecosystem.
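The bidirectional edge-to-cloud pattern described in the distributed computing point above can be sketched compactly: the vehicle reduces raw readings to a summary, and the cloud aggregates summaries across the fleet and can push a retuned model parameter back down. Field names, the threshold and the retuning rule are all illustrative assumptions.

```python
# A minimal sketch of the edge/cloud split for connected cars: the
# vehicle summarizes raw sensor readings locally and ships only the
# summary upstream; the cloud pushes updated model parameters back.
# Field names, threshold and retuning rule are illustrative.

def summarize_onboard(readings):
    """Edge side: reduce a raw sensor stream to a compact summary."""
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
    }

def fleet_update(summaries, alert_threshold):
    """Cloud side: aggregate summaries across vehicles and decide
    whether to push a revised threshold back to the fleet."""
    fleet_max = max(s["max"] for s in summaries)
    new_threshold = alert_threshold
    if fleet_max > alert_threshold:
        new_threshold = fleet_max * 1.1  # retuned model parameter
    return {"fleet_max": fleet_max, "push_threshold": new_threshold}

car_a = summarize_onboard([70.1, 70.4, 71.0])  # e.g. motor temperatures
car_b = summarize_onboard([69.8, 75.2, 72.3])
print(fleet_update([car_a, car_b], alert_threshold=74.0))
```

Only a few numbers cross the network per vehicle instead of the full sensor stream, and the model update flows the other way, which is the coordinated, bidirectional data movement the platform has to support.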
Many consumer IoT companies have struggled to monetize their offerings as effectively as their industrial IoT peers, but there are exceptions. Take connected cars, which account for nearly half of global consumer IoT revenue. That revenue isn’t flowing just from in-vehicle infotainment features that improve the driver experience. Automakers are now personalizing the driving experience by giving drivers access to premium services, apps, subscriptions and more, both in the car and on the web.
Here are three lessons from leading automotive OEMs that have successfully used connected devices to drive new revenue streams from IoT:
Turning data into actionable insights
Every connected car generates a treasure trove of data, giving auto manufacturers unprecedented access to user behavior, telematics feedback and environmental data. By collecting and storing all possible usage and sensor-based data for future processing, auto manufacturers can apply analytics to build better products.
Imagine a dashboard that helps product engineers understand driver usage behavior, or that gives a component manufacturer insight into wear-tracking data. With big data analytics, manufacturers can glean insights that provide tangible value to both operations and customer service. Based on this knowledge, new products will be created and existing ones improved. Products and services that enable rich experiences for drivers and passengers through wireless updates, or that streamline the OEM’s operations, will gain market traction.
Bringing together customers and suppliers
Creating a dedicated place for customers and suppliers to come together allows auto manufacturers to retain customers and establish new revenue streams long after the initial vehicle purchase. For OEMs, this place is an ecosystem, made up of hardware, software, firmware, content and service providers. However, OEMs need to provide a wireless, secure, branded user experience to monetize this ecosystem.
Deploying a trusted commerce platform lets OEMs test a range of flexible business models from one-time purchase, fractional ownership and pay-per-use (such as Car2Go) to bundles or subscriptions (such as satellite radio or Audi on Demand). With this platform, OEMs can easily offer trials, premium upgrades and feature add-ons to customers, who can discover products and services and have them delivered remotely to their cars. Additionally, the OEM can share sales revenue and reconcile cost sharing with partners.
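Two of those business models are easy to compare side by side. The prices below are illustrative assumptions, not real Car2Go or Audi on Demand rates.

```python
# Comparing two of the business models mentioned above for a single
# in-car service. All prices are illustrative assumptions.

def pay_per_use_cost(uses, price_per_use=3.50):
    """Total cost under a pay-per-use model."""
    return uses * price_per_use

def subscription_cost(months, monthly_fee=14.99):
    """Total cost under a flat monthly subscription."""
    return months * monthly_fee

# A driver who uses the service 3 times a month for a year:
uses_per_month, months = 3, 12
ppu = pay_per_use_cost(uses_per_month * months)
sub = subscription_cost(months)
print(f"pay-per-use: ${ppu:.2f}, subscription: ${sub:.2f}")
# With light usage, pay-per-use stays cheaper; heavier usage flips it.
```

The break-even point between the two models is exactly the kind of experiment a flexible commerce platform lets an OEM run per service and per customer segment, rather than committing the whole catalog to one pricing scheme.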
Providing real-life use cases
Tesla recently used an over-the-air update to avert a costly dealership repair of nearly 30,000 vehicles affected by a charging plug software bug. Using predictive maintenance data, OEMs can proactively replace “bad” car parts, a win for the OEM, the car dealer and the driver alike.
Predictive maintenance uses sensor data generated by the vehicle to detect a defect. The OEM can then deactivate the affected component and replace the part as needed. When this happens, as in Tesla’s case, the information can be forwarded to a cloud service that locates a dealer, reaches out to the driver through an app on the vehicle head unit, a mobile device or a web portal, schedules the service and orders the required parts.
This scenario ultimately ties the traditional servicing model of vehicles to IoT, enabling intuitive servicing and customer retention.
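The detection-to-scheduling flow just described can be sketched end to end. Every name, threshold and field here is an illustrative assumption, not a real OEM API.

```python
# A sketch of the predictive-maintenance flow described above:
# sensor reading -> defect detection -> service ticket with dealer
# and parts. All names and thresholds are illustrative assumptions.

def detect_defect(sensor, reading, limit):
    """Onboard: flag a part when its sensor reading exceeds its limit."""
    return reading > limit

def create_service_ticket(vin, part, dealer):
    """Cloud side: record the dealer, schedule service, order the part.
    A real system would also notify the driver via head unit or app."""
    return {
        "vin": vin,
        "part": part,
        "dealer": dealer,
        "status": "scheduled",
        "parts_ordered": [part],
    }

# A charge-plug temperature sensor running hot:
if detect_defect("charge_plug_temp", reading=92.0, limit=85.0):
    ticket = create_service_ticket("VIN-DEMO-001", "charge plug",
                                   "Dealer #42")
    print(ticket["status"])
```

The ticket that comes out the other end is what ties the traditional servicing model to IoT: the appointment and the parts order exist before the driver ever notices a problem.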
Given the capabilities of a connected car, the market will drive IoT monetization. Improving customer service, driver experience and vehicle performance and adaptability through IoT adds ongoing value to connected cars, provided the OEM harnesses the power of data analytics, a central platform for drivers and developers, and a commerce solution to drive new business models and enhance traditional ones.