IoT Agenda


March 26, 2019  1:53 PM

With IoT, patients can get continuous health monitoring anywhere, anytime

Christopher McCann
#eHealth #Healthcare IOT #Wearables #wireless medical devices, Connected Health, connected healthcare, healthcare IoT, Internet of Things, iot, IoT device, IoT devices, IoT in healthcare, remote patient monitoring, Wearable devices, Wearables

Remote patient monitoring is gaining momentum as healthcare providers strive to deliver more services on tighter budgets.

With the population aging and facing greater health issues, the pressure on healthcare services is rising and will continue to do so. Providers are grappling with lower reimbursements and a significant trend toward payments for value rather than volume.

To cope with these challenges, the industry is embracing new healthcare delivery models that emphasize more outpatient and at-home services. To enable these new models, providers are increasingly embracing remote patient monitoring. Traditionally, RPM has meant taking existing medical monitoring devices and putting them into the home. A finger pulse oximeter. A scale. All requiring manual entry by the patient or even wired into some kind of central hub before being transmitted through a phone line.

This presents numerous operational and patient experience challenges, but is also of questionable clinical utility. Numerous studies have found that spot measurements, once or twice a day, are of limited clinical efficacy and adherence is poor.

By moving to wearable devices for passive RPM, the amount of clinical data captured massively increases. By then taking that data, applying machine learning and generating actionable insights that enable care to be delivered earlier, a truly valuable RPM model can be delivered. The government has recognized the potential: The Centers for Medicare & Medicaid Services (CMS) recently approved incentives for healthcare providers using RPM and other telehealth systems.

RPM-enabled models allow patients to be treated in hospitals for a shorter period of time and discharged to continue their recovery at home — the care is still managed and readmission less likely. Without RPM, that transitional stage can be a vulnerable one for many people. Patients often have little or no medical support once they’re home on their own and don’t always know when to call a doctor. Connected devices take that burden off the patient. These wearables continuously communicate vital data to healthcare staff, enabling them to detect and address problems as soon as they arise.

It’s also important to recognize that health systems manage patients with a wide range of pathologies, each with a different suite of vital signs and metrics. Until now, we’ve often expected the patient to manage many different devices and self-report, but the ubiquity of Bluetooth Low Energy and near field communication technologies allows a single device to act as an on-body hub, integrating with a range of devices around the patient, such as spirometers or glucose meters.
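To make the on-body hub idea concrete, here is a minimal sketch, with hypothetical device and field names, of how a wearable might batch readings from several nearby peripherals into a single upstream payload:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, List

@dataclass
class Reading:
    device_type: str   # e.g., "spirometer", "glucose_meter", "pulse_oximeter"
    metric: str        # e.g., "fev1_l", "glucose_mmol_l", "spo2_pct"
    value: float
    timestamp: str

@dataclass
class OnBodyHub:
    """Wearable acting as the hub: buffers peripheral readings, then uploads."""
    patient_id: str
    buffer: List[Reading] = field(default_factory=list)

    def ingest(self, device_type: str, metric: str, value: float) -> None:
        # In a real device this would be called from a BLE or NFC callback.
        self.buffer.append(Reading(device_type, metric, value,
                                   datetime.now(timezone.utc).isoformat()))

    def flush(self) -> Dict:
        """Build one upstream payload and clear the local buffer."""
        payload = {"patient_id": self.patient_id,
                   "readings": [vars(r) for r in self.buffer]}
        self.buffer.clear()
        return payload

hub = OnBodyHub(patient_id="patient-001")          # hypothetical identifier
hub.ingest("pulse_oximeter", "spo2_pct", 96.0)
hub.ingest("glucose_meter", "glucose_mmol_l", 5.8)
print(hub.flush())
```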

Armed with this information, doctors can get an early warning of potential problems and provide earlier attention and care to patients in a remote setting. The ability to intervene quickly addresses a huge problem: costly hospital readmissions. If patients don’t report problems and get prompt treatment, they can wind up back in the hospital.

Currently, hospital readmissions within 30 days of discharge are associated with more than $40 billion a year in healthcare costs. Besides placing a burden on the system, this imposes a financial and emotional strain on patients and their families and often leads to poor patient outcomes. Government and healthcare providers alike have made it a top priority to reduce readmissions, imposing financial penalties on hospitals with higher-than-average readmission rates.

RPM can help in this effort, and it’s poised to take off thanks to advances on a number of fronts.

The populations most in need of an effective alternative healthcare service often lie in rural, underserved communities. Internet access and Wi-Fi penetration are low, but with the spread of 4G and the impending rollout of 5G, connectivity is a far lower barrier than it historically was.

Next, the availability of low-cost, high-volume cloud storage and computational resources has significantly increased thanks to companies like Amazon and Microsoft. This allows machine learning to be used at scale to generate early warnings. Presenting large volumes of data to healthcare staff is not valuable; presenting salient data that allows earlier action is.
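To illustrate the difference between raw data and salient data, here is a toy sketch of an early-warning rule over a vitals stream; the thresholds and field names are assumptions for the example, not a validated clinical model:

```python
def early_warning_score(heart_rate: float, resp_rate: float, spo2: float) -> int:
    """Toy scoring rule loosely modeled on clinical early-warning scores."""
    score = 0
    if heart_rate > 110 or heart_rate < 50:
        score += 2
    if resp_rate > 24 or resp_rate < 10:
        score += 2
    if spo2 < 92:
        score += 3
    return score

def triage(vitals_stream):
    """Yield only the salient events instead of the raw data stream."""
    for v in vitals_stream:
        score = early_warning_score(v["hr"], v["rr"], v["spo2"])
        if score >= 3:
            yield {"time": v["time"], "score": score, "action": "notify care team"}

stream = [
    {"time": "08:00", "hr": 78, "rr": 16, "spo2": 97},   # unremarkable
    {"time": "08:05", "hr": 121, "rr": 26, "spo2": 90},  # deteriorating
]
print(list(triage(stream)))
```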

Lastly, the shifts of payers to begin reimbursing for RPM, largely driven by CMS, will likely lead to an explosion in implementations as healthcare providers and home health agencies vie to deliver the best and most competitive healthcare in their local markets.

Though RPM relies heavily on technology, it doesn’t cut out the human element. Far from it. With this real-time, accurate data on patient health, care managers can prioritize patients and their needs. Apps allow patients to report symptoms, receive medication reminders, review educational material and even visit with their doctor via video. Studies have shown that this approach has a positive impact, empowering patients and leading to better outcomes.

RPM is already moving to the forefront of the industry, and given current and projected future trends, it clearly has significant upside potential for the coming years.


March 26, 2019  12:43 PM

Prioritizing the IoT device’s security posture in 2019

Stephen Douglas
Enterprise IoT, Internet of Things, iot, IoT attacks, IoT cybersecurity, IoT devices, iot security, IoT testing, securing IoT

When was the last time literally billions of something showed up in global enterprises, seemingly overnight? The projections are changing all the time, but Gartner has estimated that upwards of 7 billion IoT devices will make their way into businesses by next year. If IT teams thought they had their hands full with BYOD security, the sheer ubiquity of IoT devices promises to be an exponentially larger burden to secure. This is compounded by the many variations in type, purpose, manufacturer, build quality and country of origin of these devices, which are poised to become instant security headaches. The accelerated growth of IoT cannot be stopped, but the way device manufacturers and enterprises approach the security posture of their IoT devices can change. This is the only path to safe, expedited rollouts that ensure IoT can deliver on its many promises.

Getting an IoT device quickly to market and deployed has always been the primary objective of the vendor, developer and the enterprise. Security has often been an afterthought during this quest for speed, but now is the time to reverse that psychology. As more IoT devices come online, an enterprise’s attack surface grows. And with a growing eagerness to support more 5G devices, an adjustment in how security is accounted for in the testing phase of devices has never been more important.

Putting the ‘test’ in IoT

Simply put, security needs to be embedded in every part of the testing process, from the developer testing for software vulnerabilities early on to the enterprise’s infosec and DevOps teams conducting testing that emulates realistic application traffic while validating security coverage from enterprise to carrier-grade network capacity. Developers and infosec teams can’t work in silos. They should be collaborating from day one and be aligned on the requirements for securing an IoT device. Unlike a network switch or web portal that uses standardized interfaces and protocols, IoT devices can run on completely different protocols that are not addressed by traditional enterprise security and management measures. When that’s not taken into consideration, the potential for an IoT breach runs high.

To truly improve risk mitigation of IoT devices, comprehensive testing requirements are needed, but it could be several more years before that happens. In the interim, parties that have a stake in ensuring the device’s security should follow the most credible industry guidelines available — including GSMA’s fairly new IoT security guidelines. While it’s unrealistic to abide by all guidelines, developers and infosec should curate and reach agreement on the checklist and risk assessment items in the guidelines document that both groups will abide by throughout the testing lifecycle.

A few security ‘things’ to consider right now

As IoT devices become more complex and operate in more complex environments, hackers will continue to evolve their tactics to exploit them. With that in mind, there are steps that IoT device manufacturers, enterprises, developers and the infosec teams that deploy devices can take right now to make significant strides toward better security practices. From Spirent’s work with the expanding IoT ecosystem, here are three simple rules we often offer anyone exploring where to start with IoT device testing:

  1. Eliminate vulnerability management complacency. If a vulnerability is found early on, act with urgency to patch and mitigate. Compiling known vulnerabilities and waiting to patch until the end of development, or until just after a device is deployed, is not acceptable.
  2. Stress test, and stress test some more. One of the more common tactics among hackers is executing a brute-force attack, which tries numerous combinations of usernames and passwords to gain access to the network and follow through with a distributed denial-of-service attack. Given the number of usernames and passwords accessible to hackers via the dark web, stress testing — encompassing network penetration testing and simulated cyberattacks on the network — should be a given for enterprises (a minimal sketch follows this list).
  3. Get multiple reads on vendor security systems and software. IoT device ecosystems continue to grow, and vendor systems and software need to keep pace with their level of protection. But sometimes that isn’t happening, or what the systems and software can do no longer meets the ecosystem’s requirements. While vendors may offer their own testing and assessments, enterprises need to make the investment to do more testing themselves.
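For item 2 above, here is a minimal sketch of a brute-force probe against a hypothetical device login endpoint; the URL, wordlists and lockout behavior are assumptions, and a test like this should only ever be run against devices you own or are authorized to test:

```python
import itertools
import requests  # third-party HTTP client: pip install requests

LOGIN_URL = "https://device.example.local/api/login"  # hypothetical test endpoint

# Tiny leaked-credential style wordlists; real stress tests use far larger sets.
USERNAMES = ["admin", "root", "user"]
PASSWORDS = ["admin", "123456", "password"]

def brute_force_probe(url: str) -> int:
    """Count how many credential attempts the device accepts before throttling."""
    accepted_attempts = 0
    for username, password in itertools.product(USERNAMES, PASSWORDS):
        resp = requests.post(url, json={"user": username, "pass": password},
                             timeout=5)
        if resp.status_code in (429, 403):
            break  # device throttled or blocked further attempts: desired behavior
        accepted_attempts += 1
    return accepted_attempts

# Uncomment only when pointed at a device you own or are authorized to test.
# print(brute_force_probe(LOGIN_URL))
```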

Testing brings an invaluable level of assurance to IoT devices, but enterprises need as much confirmation as possible that the security designs and capabilities of these devices will hold up when deployed. Just last summer, CTIA introduced a new IoT cybersecurity certification program, developed in collaboration with wireless providers, technology companies and security professionals, to provide that confirmation. The program enables IoT device vendors to submit soon-to-be-deployed devices to authorized test laboratories for aggressive security testing that accounts for the device’s security features, complexity, sophistication and manageability. In recent months, I’ve had the opportunity to take part in the initial certification testing in our authorized test laboratory and have come away with some interesting lessons. Numerous devices included low-cost CPUs and open source operating systems from suppliers, and either or both tended to have more weaknesses, largely because they hadn’t been updated to address newer threats such as Spectre and Meltdown. The takeaway: IoT device developers cannot simply trust that a CPU and OS are secure out of the box.

As IoT devices become more pervasive in the enterprise, use cases and functionality will continue to expand. This means devices will interact with more networks, infrastructure and components, creating an even larger attack surface that provides a constantly moving target and will need to be continuously accounted for when working to manage overall security posture. It’s why IoT device security testing can’t be conducted in silos.

A holistic approach doesn’t require an overly complex testing strategy. By standardizing IoT device testing and following a consistent set of testing activities, little room is left for confusion or gaps, creating the ideal environment for device security posture to improve.



March 25, 2019  2:50 PM

Super friends: IoT, AI and cloud create a powerful team

Helena Lisachuk
ai, AI and IoT, cloud, Internet of Things, iot, IoT and AI, IoT applications, IoT cloud, IoT data, IoT management, IoT strategy, utilities

What do blockbuster movies have to do with IoT? More than you might think.

Superhero movies are the big box office draw right now, with superhero team movies leading the way. Even if you’ve never seen one, you probably know the formula: A group of superheroes with different powers and vulnerabilities band together to defeat the biggest, baddest villains imaginable. It’s not a new formula, but it works. And I’ll suggest it also works to tell the future story of IoT. Here’s how.

Think of IoT like a superhero team: The trick to unlocking its greatest potential is to bring together a diversity of technologies — the strengths of each perfectly compensating for the vulnerabilities of others — then deploy them against the biggest, most intractable business problems. The super team of IoT, AI and cloud is a great candidate for such a story.

Tech superpowers

Everyone knows superheroes have certain vulnerabilities, like kryptonite, and dependencies, like a magic ring. Tech superheroes are no different. No single technology can do everything, and every technology has its dependency: lightbulbs need electricity, cars need gasoline and so on.

So, while we know that IoT can find powerful new insights in almost every aspect of daily life, there’s an implied dependency: the ability of human analysts to find those insights among mountains of data. According to some estimates, IoT-connected devices produce almost 1,000 times more data each day than the entire internet in 2005. And while there will be ever greater, more useful insights in all that data, no human could hope to uncover them in a single lifetime. This is our cue for the next technology super friend: AI.

Where IoT’s superpower is to sense and generate tons of data, AI supports by sorting through that data, making it actionable, and getting smarter as it does so. With AI, we can now process human speech in real time, determine where time-sensitive shipments need to be routed in a supply chain and many other applications that would have seemed like the stuff of comic books just a decade ago. But AI’s superpower has its dependency too — it often needs more processing power than most edge devices can provide. That’s where the call goes out for the cloud.

The cloud’s superpower is ubiquity, expansiveness and on-demand availability — which means it doesn’t just store all the data coming from IoT’s sensors, it can also host AI and machine learning tools. With this new team member in the mix, IoT users can rely on the processing power and sophistication of cloud providers and even select from a variety of AI tools that suit their particular needs in the moment. In fact, as edge devices become more sophisticated, users can increasingly run simple AI tools locally at the edge, without the need for costly backhaul communications from edge devices back to central servers.

IoT, AI and cloud on a mission

IoT, AI and the cloud’s strengths and dependencies fit together nicely on paper, but their real power comes into play when they work together to tackle real-world challenges.

Like managing a power grid. Traditionally, power grids were centrally managed by utilities that controlled both the production and transmission of electrical power. The only variable was customer demand. But today’s modern grid has many more variables: power production from renewable energy sources varies depending on conditions, while small-scale solar and other generation technologies add to the complexity as utility customers both consume their own power and sell their surplus back to the grid. And with electric vehicles of all kinds proliferating, this complexity — and the challenges it brings — will likely grow.

But together, IoT, AI and cloud can help utility managers not only address all of these new variables, but make use of them to create a more efficient, greener grid.

In the United Kingdom, the independent energy supplier OVO provides one example of how IoT, AI and cloud can come together to create an entirely new business model based on data.

OVO is trialing a service for residential consumers with solar panels that combines a variety of sensors and data sources with in-home (or in-car!) batteries. The IoT sensors track the home’s energy usage and battery power levels, communicating that data to the cloud where it’s combined — by AI — with data like public grid load and current electricity prices. With this enhanced data, the AI can then direct the customer’s batteries to store excess energy when grid demand is low and release it to the grid when demand is high.
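As a rough sketch of the kind of decision the AI makes, consider the rule below; the thresholds, prices and field names are illustrative assumptions, not OVO’s actual logic:

```python
def dispatch_battery(grid_demand_mw: float, price_per_kwh: float,
                     battery_soc: float) -> str:
    """Decide whether a home battery should charge, discharge or hold.

    Thresholds are illustrative; a production system would learn them from
    grid-load forecasts and tariff data.
    """
    HIGH_DEMAND_MW, LOW_DEMAND_MW = 35_000, 25_000   # hypothetical national load levels
    if grid_demand_mw >= HIGH_DEMAND_MW and battery_soc > 0.2:
        return "discharge_to_grid"                   # release stored energy at peak
    if grid_demand_mw <= LOW_DEMAND_MW and price_per_kwh < 0.10 and battery_soc < 0.9:
        return "charge"                              # store cheap or surplus energy
    return "hold"

print(dispatch_battery(grid_demand_mw=37_500, price_per_kwh=0.18, battery_soc=0.8))
```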

This business model gives homeowners stable, fixed prices on their energy, while the grid (and the public it serves) gets access to more energy in distributed storage. This kind of smart demand management paves the way for a transition to greener sources of energy. (See our article on disruptive innovation in the power sector to learn more.)

Not the only solution

But just as a blockbuster may not be a good fit for an art house festival, the combination of IoT, AI and cloud may not be right for every problem. Communicating data collected back to a central cloud may not be cost-effective for simple IoT applications in austere environments, like pipelines. Similarly, the time required for communications and analysis may make this combination difficult to use in applications — like monitoring pipeline valves — that are always-on and require minimal delay. Such analytically simple tasks don’t benefit much from AI, and the valves are often in geographically isolated locations making communications costly.

So while IoT, AI and cloud can accomplish incredible things together, the first step for any business should be to think through the following three criteria:

  1. Does the volume of data collected from a potential IoT system require more than simple human analysis or could it benefit from deeper analysis?
  2. Can backhaul communications be established from the edge back to central servers? If that communication cannot be constant, can it be established at regular intervals or paired with simple at-the-edge AI?
  3. Can reliable training data be found for the AI?

If the answers to these questions are yes, you may want to reach for whatever serves as your bat-signal and call in the super team of IoT, AI and cloud.



March 25, 2019  11:26 AM

We can see clearly now: The value of supply chain visibility

Michael Burdiek
IoT data, IoT supply chain, Logistics, Smart sensors, smart supply chain, Supply chain, Supply Chain Management, supply chain visibility, telematics

Few parts of enterprise operations have transformed more radically over the years than supply chain — as companies struggle to navigate globalized sourcing, the rise of just-in-time delivery models for manufacturing and, most importantly, unprecedented customer expectations for product availability and speed of delivery.

Ironically, while companies are using extended supply chain networks and heavy outsourcing to meet these challenges, the solutions they come up with create a big new problem of their own in the form of lost visibility. As we’ll see in this post, lost visibility into your supply chain means lost value. The good news is that the right telematics and related technologies can help maintain powerful visibility and control over even the most complex supply chain networks in service today.

Visibility = value

Among the many impacts of the Amazon effect is the heightened demand for consumer visibility into availability and shipping logistics, from the first mile to the last. Not surprisingly, businesses see value in getting the right supply chain visibility to meet those customer expectations and ensure efficiencies in their own operations.

The hurdles come in trying to actually implement that visibility in some tough supply chain use cases. For instance, think back to the Amazon effect and the prototypical example of a book. Supply chain visibility into that book is one thing, but what if your supply chain involves highly volatile, highly valuable pharmaceutical cargo that’s environmentally sensitive? You’d want nuanced, real-time and real-world data and analytics to keep that kind of sensitive cargo on track. Smart sensors and granular data mean we can provide real-time operational intelligence to not only reduce the risk of cargo theft and spoilage, but also document compliance with regulations and drive operational efficiencies that benefit everyone, from shippers to the end consumer.
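As a simple illustration of that kind of monitoring, the sketch below flags temperature excursions against an assumed 2-8 degrees Celsius range; the range, field names and alert action are assumptions, not any particular vendor’s implementation:

```python
from typing import Iterable, List, Tuple

# Assumed cold-chain range for the example; real limits come from the product label.
TEMP_MIN_C, TEMP_MAX_C = 2.0, 8.0

def find_excursions(samples: Iterable[Tuple[str, float]]) -> List[dict]:
    """Flag readings outside the allowed range.

    `samples` is an iterable of (iso_timestamp, temperature_celsius) pairs,
    for example as reported by a trailer's cargo sensor.
    """
    excursions = []
    for ts, temp_c in samples:
        if not (TEMP_MIN_C <= temp_c <= TEMP_MAX_C):
            excursions.append({"time": ts, "temp_c": temp_c,
                               "action": "alert shipper and log for compliance"})
    return excursions

readings = [("2019-03-25T10:00:00Z", 4.8),
            ("2019-03-25T10:15:00Z", 9.3),   # excursion, e.g. a door left open
            ("2019-03-25T10:30:00Z", 5.1)]
print(find_excursions(readings))
```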

Connecting systems for stronger solutions

For many, additional possibilities come to life when you realize that, rather than scrap all your existing, limited capabilities in favor of some brand new system, you can create new visibility by integrating and strategically augmenting many of the capabilities you already have.

For instance, look at any tractor-trailer on the highway today and you’ll likely see a telematics device in the cab to track location, a separate system that’s monitoring cargo temperature and a third system for security to protect the valuable cargo from theft or tampering. That’s a lot of duplication of systems and services for what remain silos of uncoordinated insight.


What if we instead fold point solutions and targeted capabilities into larger “system of systems” integrations that create more sophisticated outcomes — like the difference between assisted braking and full autonomous drive?

That last auto analogy is not random. Tesla spent years building the sensors, telematics, actuators and other infrastructure into its vehicles — the cars were preemptively designed to operate and engage sophisticated systems that wouldn’t be enabled until more than a year into the future. That point came when Tesla finally delivered its first Autopilot update to owners, who — literally overnight — were suddenly able to unlock new capabilities from latent technology.

Making it reality

So, now that we realize we’re not starting from scratch — that we can use some of the same capabilities we might already have, but in more integrated and orchestrated ways — how do we make this transition while keeping our business running? In other words, once we realize we don’t need to reinvent the wheel, we’re still faced with having to change the wheels for supply chain visibility while our business is moving at full speed.

The best way to go about it is to focus on the infrastructure first and then introduce the capabilities when the time is right. Think back to our tractor-trailer example: We’re faced with having to pick the best system — perhaps see whose contract is up for renewal or whose existing infrastructure is most interoperable — and weigh everything as a business decision. One way or another, try to introduce some coordinated technical standards and interoperable infrastructure for the telematics throughout your supply chain.

Conclusion

When you think about it, these ideas make total sense. So, why haven’t we seen more people connect the dots? My sense is that sometimes you don’t see an advanced solution until you’ve made enough progress in the industry to uncover the advanced business problem behind that solution to begin with.

At least, that’s why supply chain visibility is on our minds so much here at CalAmp. We’re in a position as a market leader in telematics not only to see what connectivity and telematics can do today for supply chain visibility, but also connect the dots on what those capabilities can be tomorrow if we bring the right vision, strategies and tools to the market.



March 22, 2019  11:32 AM

Emerging frameworks for cross-silo IoT data models

Chonggang Wang
data framework, Data Management, data model, data modeling, Internet of Things, iot, IoT analytics, IoT data, IoT data management, IoT data model, IoT framework, IoT interoperability, IoT standards, Standardization

As far back as 2015, in one of the earliest macro-assessments of IoT, the consultancy McKinsey quantified the market opportunity in nine categories. They ranged from consumer homes to wide open spaces and included home automation, factory, personal monitoring — health and fitness — and smart city applications. Each category, of course, can contain many hundreds and thousands of individual applications.

When the IoT market began to take shape, the initial focus of solution providers was on connectivity. There is a raft of market studies on the billions of potentially new connected devices for technology providers and connectivity service businesses to target. However, the work involved in connecting physical and virtual things to the internet is just a start. An increasingly important objective is to handle the data that comes from connected assets. That, in turn, means finding ways to connect, manage, analyze and, eventually, share data among different organizations.

In addition to the novelty of large-scale and distributed data management, the characteristics of IoT as a heterogeneous system of systems introduce an additional layer of complexity. Industry attention is now shifting from connectivity to new approaches for understanding and interpreting data. Heterogeneity introduces new challenges of its own, chief among them making data interoperable. This may be across application silos, between partners in a supply chain or between vendors of interchangeable devices and sensors.

The value of data models

A good data model can solve these issues. In other words, IoT data modeling offers an approach which could more efficiently describe, interpret, analyze and share data among heterogeneous IoT applications and devices.

Fortunately, several data models already exist. Many of them have been developed by different standards development organizations. Some of them are for specific IoT vertical applications or domains. For example, the Smart Appliances REFerence (SAREF) ontology provides a shared model for home appliances. Data models from the Open Geospatial Consortium are more for the geosciences and environment domains. The Open Connectivity Foundation specifies data models based on vertical industries such as automotive, healthcare, industrial and the smart home. The World Wide Web Consortium’s Thing Description provides some vocabularies to describe physical things, but does not have much focus on data.

In contrast, Schema.org operates as a collaborative community and it aims to provide more general and broad vocabularies and schemas for structured data on the internet. A collaborative approach to integrate and unify various data models is critical and necessary to work across IoT application and organizational boundaries. It requires cooperative efforts among different industry bodies.
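To show what a shared, self-describing data model buys you, here is an illustrative device reading expressed as structured data; the vocabulary URI and field names are hypothetical rather than terms taken from SAREF, OCF, Schema.org or the W3C Thing Description:

```python
import json

# Illustrative only: the vocabulary URI and field names below are hypothetical.
# They mimic the role a shared vocabulary plays in making device data
# self-describing across vendors and application silos.
thermostat_reading = {
    "@context": "https://example.org/iot-vocabulary",   # hypothetical vocabulary
    "@type": "TemperatureMeasurement",
    "device": {
        "@type": "Thermostat",
        "manufacturer": "AcmeHome",                      # hypothetical vendor
        "serialNumber": "TH-0042",
    },
    "value": 21.5,
    "unit": "Cel",                                       # UCUM-style unit code
    "observedAt": "2019-03-22T09:30:00Z",
}

# Any consumer that understands the shared vocabulary can interpret the reading
# without vendor-specific integration work.
print(json.dumps(thermostat_reading, indent=2))
```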

OneM2M’s role in data model standardization

OneM2M, the global standardization body, was established to create a body of maximally reusable standards to enable IoT applications across different verticals. OneM2M focuses on common services, such as device management and security, that are needed in all IoT applications. The oneM2M standard takes the role of a horizontal IoT service layer between applications and the underlying connectivity networks. In doing so, it masks technology complexity for application developers and hardware providers.

Amongst its various standardization efforts, oneM2M takes a collaborative approach in developing its data model-related specifications. For example, one of its technical specifications, TS-0005 for management data model, is the result of a collaboration between oneM2M and the Open Mobile Alliance.

Another specification, TS-0023, laid the groundwork for a Smart Devices Template that was first applied to create a Home Appliances Information Model and Mapping. In the next release of oneM2M, Release 4, the underlying data model principles will be extended to support information modeling and mapping for vertical industries, such as smart cities.

New ideas for IoT data models

The focus on frameworks to manage data models across application verticals and domains sets the stage for the next phase of the IoT industry’s growth. This topic will be featured on the agenda at the forthcoming Internet Engineering Task Force (IETF) 104 meeting on March 23-29, 2019 in Prague, Czech Republic. Two groups, the Constrained RESTful Environments (CoRE) working group and the Thing-to-Thing Research Group (T2TRG), will explore new ideas around IoT data models as they affect IoT application layer issues.

As one of oneM2M’s representatives at the IETF meeting, I will be interested to hear about the latest progress on the CoRE and T2TRG activities. The meeting will also be an opportunity to explore new developments and possible collaboration opportunities around the use of data models for IoT management and security.



March 21, 2019  2:45 PM

Evolution of consumer IoT: 2019 forecast from WEF’s annual meeting

Niall Murphy
ai, Consumer IoT, Internet of Things, iot, IoT data, IoT device makers, IoT devices, IoT hardware, IoT products, IoT software, IoT supply chain, smart manufacturing, smart supply chain

I had the opportunity to take an active role in this year’s discussions at the World Economic Forum’s (WEF) annual meeting at Davos. Having been selected as a WEF Technology Pioneer, I found it both exciting and validating to see the themes we evangelize come into focus through live debates and energetic conversations in the snowy, sunny Swiss Alps.

Consumer diversity in 2019

A lot of the discussion at Davos this year focused on the diversity of the environments in which consumer product brands and manufacturers are operating — for instance, balancing the need to respond to Gen Z’s values-driven purchasing behavior in contrast to the experiential needs of aging populations in developed markets. Or the tapestry of data management regulations across the world — Europe’s GDPR in one corner, China’s regime in another, and the relatively loose framework in the U.S. in another.

The fundamental message is we now live in a world where:

  • consumers expect localized and personalized experiences;
  • the effectiveness of data integration and data flow has to navigate national rules; and
  • the competence of instrumentation and data management has moved from being a cost-center issue to being a value-driven determinant of competitive advantage.

Global impact of IoT

Specifically in 2019, we are seeing a couple of key drivers influence the global impact of IoT in the consumer products industry:

  1. consumer demographics and behavior; and
  2. digital-driven competition.

With this in mind, following are five trends to watch in 2019:

1. Transparency of product provenance and new models of localized and personalized retail and service delivery.
Meeting the divergent needs of Gen Z and Gen X is challenging the consumer products industry. Gen Z consumers are making values-based purchasing decisions, demanding transparency from brands. At the same time, Gen X and the aging population of retiring Baby Boomers are demanding ever more convenience. Brands that can deliver both transparency and personalized consumer experiences are going to rise to the top.

2. Increasing rules and competencies to gather and make use of real-time product data on a global scale — while protecting consumer trust.
When it comes to protecting and managing consumer data, global brands are faced with a tapestry of rules and regulations. For instance, Europe led the way with personal data protection and privacy with the GDPR, which is great if a brand is only operating in Europe. But if a brand is global, it not only has to worry about the GDPR in Europe, but also a completely different set of rules in China, somewhat laissez-faire rules in the U.S. and so on. Despite the global differences, what is clear is that regulations to protect consumer data and trust are inevitable. There simply isn’t going to be one consistent rule set. Global brands must navigate different rules in different geographies.

3. Real application of artificial intelligence to production and delivery processes.
We are increasingly seeing consumer product brands apply artificial intelligence to production and delivery processes to ensure supply chain integrity and to enhance speed to delivery. Not only are machine learning and AI transforming today’s supply chains and manufacturing plants, but the technology is being applied at scale.

For instance, machine learning is helping brands detect counterfeit products and protect against parallel trade at scale. Machine learning is also being applied in manufacturing plants to optimize the sourcing of raw materials and to predict movement in inventory.

We will see continued adoption of AI and machine learning as the technology becomes more accessible. This is important because the competitive advantage is exponential. There is a real sense that the countries, companies and platforms driving AI have a disproportionate amount of power in the emerging world.


4. Scaling up of new operating models for manufacturing to compete with agile and dynamic value chains.
At Davos, there was a lot of talk about the need for supply chains to operate in a much more dynamic fashion to meet real-time consumer demand. Not only do consumers expect real-time delivery of products and services, but they also want them customized to meet individual needs. The traditional model of high-scale manufacturing, churning out gazillions of the same widget, is no longer applicable. Manufacturing needs to adopt new models to close the gap between design and delivery in an era of instant gratification and mass customization.

5. Urgent — and necessary — high-impact response to sustainability and, specifically, to climate change.
We all need to be taking steps in a sustainable direction. Working together as a global economy to build consumer awareness and mandate supply chain transparency is the only way we will achieve a truly circular economy.

The good news is that brands are investing downstream in their sourcing, manufacturing and distribution processes to improve their capacity for sustainability. The challenge is making this information available to consumers. With strong consumer awareness, sustainability within the supply chain becomes a competitive asset for brands, strengthening brand trust and authenticity for consumers and winning market share. Today, consumers prioritize brands that promote sustainability as they seek ways to make an impact with their personal behavior. See my WEF blog post touching on this theme.

The future of IoT for consumer product brands

For my company’s mission to organize the world’s product data, it’s an encouraging picture. There are a lot of issues to work through for digital transformation in the consumer product industry to yield its full value potential, but the framework is coming into clear focus in 2019.

The ability to have traceability of every product item through the supply chain — into the hands of the consumer and beyond — is table stakes. Direct connection with the consumer is the only way to compete in an experience-driven consumer world. The fidelity and real-time application of data gathered during the product’s supply chain journey is the keystone to competitive advantage for consumer product brands — for efficiency, integrity protection, transparency, compliance and sustainability.

In the months ahead I’m looking forward to sharing more thoughts on the future of IoT for consumer product brands including:

  • its impact on sustainability in manufacturing;
  • blockchain in the supply chain; and
  • consumer trends where these innovations are being applied.



March 21, 2019  12:19 PM

IIoT adoption: What’s on the horizon?

Rick Harlow
ai, Blockchain, IIoT, industrial internet of things, Industrial IoT, Internet of Things, iot, IoT applications, IoT data

Working with senior leadership teams across multiple market segments over the past six months, such as automotive, energy, transportation, manufacturing and government, has been nothing short of amazing to experience. It is easy to see why the global industrial IoT market is expected to reach a value of $922.62 billion by 2025, according to a recent Million Insights report. This growth is due to the worldwide rise in IoT technology development and implementation in the past few years.

I have been able to personally witness some of this explosive growth of software development and industrial IoT development over the past six months at Deasil Cognitive. The core team has been together for over 20 years and has extensive experience in collecting, moving, buffering, queueing and processing data, as well as building frameworks around bleeding-edge technologies in artificial intelligence, machine learning and blockchain. That experience lets the team rapidly develop applications, quickly integrate new sensors and data acquisition devices, and implement game-changing systems for these market segments.

I have watched the Deasil team significantly improve productivity and ROI across multiple client projects, using Kubernetes, Docker, Golang, Cassandra, Kafka and Elastic, to name a few. The team is developing productive, stable, clean and fast applications, and the results are beautiful and innovative, including IoT management systems, IoT implementations, mobile applications, business intelligence and data management platforms.

Some of the projects include figuring out how to use the massive amounts of data being collected on vehicles to provide more personalized services to drivers. These types of systems are designed to give customers a more personalized experience while also driving more customer loyalty and new revenue streams for the automotive industry. Applications for the energy sector have included streaming data from numerous sources to provide insights on where to allocate funds for potentially catastrophic risks, reduce product loss, property damage and even environmental risks.

As an authorized government software development partner, we have been able to identify ways to help the Department of Energy analyze synchrophasor data. Synchrophasors are time-synchronized measurements that represent both the magnitude and phase angle of the sine waves found in electricity. They are taken by high-speed monitors called Phasor Measurement Units, or PMUs, which are 100 times faster than SCADA. These types of applications will help greatly mitigate risks, improve quality of service and reduce unforeseen downtime for utility companies.
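For readers curious about the underlying math, the sketch below estimates a phasor’s magnitude and phase angle with a single-bin discrete Fourier transform over one cycle of samples; this is the textbook starting point rather than a full PMU signal chain, which adds filtering, interpolation and GPS time alignment:

```python
import numpy as np

def estimate_phasor(samples: np.ndarray, fs: float, f0: float = 60.0):
    """Estimate magnitude and phase angle of a nominally f0-Hz waveform.

    Single-bin DFT over one window of samples; real PMUs add filtering,
    interpolation and GPS-disciplined time alignment on top of this.
    """
    n = np.arange(len(samples))
    # Correlate the samples with a complex reference at the nominal frequency.
    dft_bin = np.sum(samples * np.exp(-2j * np.pi * f0 * n / fs))
    phasor = np.sqrt(2) / len(samples) * dft_bin
    return abs(phasor), np.degrees(np.angle(phasor))

fs = 1920.0                        # 32 samples per 60 Hz cycle
t = np.arange(0, 1 / 60, 1 / fs)   # exactly one cycle of samples
voltage = 169.7 * np.cos(2 * np.pi * 60 * t + np.radians(-12.0))
magnitude, angle = estimate_phasor(voltage, fs)
print(f"RMS magnitude ~ {magnitude:.1f} V, angle ~ {angle:.1f} degrees")
```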

Industrial IoT and software development technologies don’t always have to be so complicated, though. Figuring out ways to help companies digitize the data being collected across the globe — where there are many mistakes and costly inefficiencies — is a great first step in helping companies begin their industrial IoT journey. In just being able to digitize the data, customers can begin to see the real benefit of reducing waste. The second step is where the magic happens: they can begin to easily understand whether vendors are meeting delivery deadlines on the thousands of components used in the field, or whether they are using their buying power across the many projects and vendors available to get the best pricing for the company. Data automation platforms, like Acuity, help companies greatly improve operations, customer experience — with on-time delivery of projects — and their profits. Combining this kind of application with live operations is nirvana, especially where early warning signals can be tied to vendors to, for example, drop-ship replacement parts, engines or pumps to reduce downtime.

What do I see coming down the pipe for industrial IoT and blockchain? For one, we are merging machine learning with blockchain. Machine learning relies on vast quantities of data to build models for accurate prediction. A great deal of the overhead incurred in securing this data lies in collecting, organizing and auditing it for accuracy. This is an area that can be significantly improved using blockchain. With smart contracts and digital signatures, data can be transferred directly and reliably from its place of origin, improving the whole process significantly.
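As a simple illustration of the digital-signature idea, the sketch below signs each record at its place of origin and verifies it before it enters a training set; a real deployment would use asymmetric keys and record the hashes via a smart contract rather than relying on the shared secret shown here:

```python
import hashlib
import hmac
import json

# Illustration of the signed-provenance idea only: a real deployment would use
# asymmetric keys, with a smart contract recording each hash on-chain, rather
# than the shared secret used here.
SENSOR_SECRET = b"per-device-secret-provisioned-at-manufacture"  # hypothetical

def sign_record(record: dict) -> dict:
    """Attach a signature at the data's place of origin."""
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SENSOR_SECRET, payload, hashlib.sha256).hexdigest()
    return record

def verify_record(record: dict) -> bool:
    """Check integrity before the record is admitted to a training set."""
    claimed = record.pop("signature")
    payload = json.dumps(record, sort_keys=True).encode()
    expected = hmac.new(SENSOR_SECRET, payload, hashlib.sha256).hexdigest()
    record["signature"] = claimed
    return hmac.compare_digest(claimed, expected)

reading = sign_record({"sensor": "pump-7", "vibration_mm_s": 4.2,
                       "ts": "2019-03-21T12:00:00Z"})
print(verify_record(reading))  # True unless the record was tampered with in transit
```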

I am looking forward to sharing more soon.



March 21, 2019  11:09 AM

Three strategies for vendor success in IoT markets

Ann Bosche
end-to-end IoT, end-to-end systems, Internet of Things, iot, IoT adoption, IoT analytics, IoT applications, IoT devices, IoT hardware, IoT scalability, IoT software, IoT use cases

As technology vendors, cloud service providers, analytics firms, network equipment makers, industrial equipment OEMs and telcos compete for leadership in IoT markets, the next few years will be critical. Bain believes the market for IoT hardware, software, systems integration, and data and telecom services could grow to $520 billion in 2021, more than double the $235 billion spent in 2017.

To reach that scale, however, market leaders will need to continue to make gains while expanding their industry-specific offers. Incumbents across categories who cannot move quickly enough to address customers’ needs are likely to get leapfrogged by more nimble competitors. Device makers, in particular, run the risk of seeing software and analytics competitors capture the value of solutions, leaving them to deliver lower-profitability hardware components.

The right actions vary, of course, from one company’s situation to another. But three themes are nearly universal for IoT vendors.

Focus on getting a few industries right. Customizing use cases for each industry and packaging systems intelligently are emerging as keys for success. Leading vendors are targeting their technologies on fewer industries than before — a welcome change that will allow them to deliver offerings better suited to customers’ needs. They should continue to narrow their focus: More than 80% of vendors still target four to six industries — too many to build depth rapidly. Focusing on two or three verticals allows vendors to gather significant industry expertise, which can help them maintain a competitive edge against the more generic offers by cloud service providers.

Develop end-to-end systems. As vendors gain experience implementing IoT systems in specific industries, they can develop cost-effective end-to-end packages with partners — something that buyers have been clamoring for. Many IoT deployments require customization, usually based on the industry. More than 60% of customers say the technologies they buy are at least 25% customized. When vendors explore the particular use cases of an industry, they learn about the different data sets required, the sensors measuring it and how to process it to glean valuable insights. From this, they discover what’s transferable to the next customer. They can then create standard packages, reducing customization requirements, shortening sales cycles and increasing the likelihood of success.

Several companies are showing how to adapt technology to markets in this way through partnerships and acquisitions. IBM Watson, for example, develops proofs of concept with customers and applies those lessons to develop industry playbooks that include multiple use cases. IBM Watson’s concept work with Samsung, for example, informed later work with elevator company Kone and French railroad operator SNCF. Verizon chose acquisitions as a way to gain deep expertise in telematics, buying Hughes Telematics, Fleetmatics, Telogis and Movildata to boost its fleet management.

Prepare to scale by removing barriers to adoption. Customers and vendors alike believe that progress has been slower than they had expected in overcoming the main barriers to adoption (see the Bain Brief “How providers can succeed in the internet of things”). However, vendors have more experience with operational considerations today than they had two years ago, and customers have a better understanding of the necessary investments. Customers also have more realistic expectations about the returns they can expect.

Leading vendors will translate that experience into repeatable playbooks that address their customers’ concerns about security, integration and returns on investment. Understanding these pain points is the first step; addressing them and baking them into end-to-end offerings will position technology providers to deliver cost-effective IoT systems that can scale.



March 21, 2019  9:24 AM

Making connected cars a successful reality for developers and manufacturers

Kevin Valdek
"Automotive industry", app developers, app development, Connected car, connected car data, connected cars, IoT data

Welcome to the final post in this three-part series where we’ve been looking at the disconnect between developers and car manufacturers, and the roles played by each group. In this post, I’ll discuss the challenges still left in this delicately balanced ecosystem and the future of connected vehicles as a whole.

Despite improvements in the relationship between carmakers and the development community, there are still a number of challenges ahead if the connected car industry is to flourish and meet customer demands on services.

Manufacturers remain cautious toward the open app ecosystem

Despite great advances being made in the areas of data security and user privacy, a major challenge that persists comes from the automakers themselves. Essentially, manufacturers still have privacy concerns about an open app ecosystem. This fear stems from historical security breaches whereby malicious apps have had access to sensitive data. These kinds of security breaches damage the carmaker’s brand and, in turn, erode customer trust, which takes considerable time to rebuild. Approaching an open app ecosystem with caution is certainly advisable; however, this approach can cause delays to the process of bridging the gap between developers and carmakers.

But it’s not just malware or security breaches that carmakers are afraid of. A desire to maintain apps and services of a consistently high quality is also a strong motivator. Manufacturers need their apps to be a true reflection of their brand and to match the existing expectations of their customers. For these reasons, the carmakers themselves like to retain a great deal of power over app development for their connected vehicles.

Finally, and understandably, it is no quick process gaining trust in such a long-established and traditional industry. This is not necessarily a bad thing; it’s simply the way it is. Developers need to ensure they work closely with carmakers to protect customers and ensure that their apps are not only brand-compatible, but also of the highest quality.

Payment models for services need more deployment

In many ways, apps created for connected cars are hitting the sweet spot of the market: tech-addicted millennials who are buying their first cars. These young people won’t be satisfied with a run-of-the-mill vehicle that simply takes them from A to B; they will be demanding a car that provides the entire connected package and a driving experience that connects seamlessly to the rest of their online lives. For many of these consumers, driver apps are seen as the obvious way to receive, send and share content such as music, navigation, news and recommendations — exactly the sorts of content and services apps within a connected car could offer on the move.

If they’re willing to build the right apps to target this particular group of young people — a group typically renowned for having disposable income, as well as the time to spend it — app developers with their ear to the ground could build long-lasting and profitable businesses. In fact, a 2014 Deloitte study found that 52% of millennials — those born roughly from 1984 to 1995 — say they want apps on their dashboard.

However, despite the huge potential of this still largely untapped corner of the market, for many developers and third-party services it remains unclear how a connected car app or service charges its customers and turns a profit, not to mention what kind of payment models a third-party service can expect from a carmaker in the first place. With these financial arrangements still largely unclear to everyday app developers, it’s no wonder that some are put off connected car app development and stick with smartphone applications — which are, in comparison, relatively easy to monetize — instead.

Another financial reason that may be holding back regular app developers from getting involved in connected car app development is the length of development and testing time. These processes can be significantly longer for connected car apps than for smartphone apps, and developers working on a budget simply may not have the time to invest.

Carmakers need to make transition smooth for developers

In a more general sense, when we think of how to connect the worlds of development and carmakers, carmakers need to think practically about what they can bring to the table in terms of technology. Discoveries and innovation from third parties which lead to new products and services can only be made if there are ample software tools available for developers, as well as those crucial open APIs.

These resources come directly from individual car manufacturers and their availability is, ultimately, what will dictate the success or failure of their individual connected car enterprise. The more open APIs that are accessible to developers, the more varied services they can offer their customers. The more carmakers that realize this and respond to it with the right tools and software, the quicker we will see changes in connected car app development software.

Conclusion: Bridging the gap

Although we have come a long way in the last few years, there is still some way to go before we have fully bridged the gap between carmakers and third-party services and developers. Despite huge advances in technology, in order to bring these two parties together there needs to be more than simply the technical environment. There has to be a strong desire on both sides for collaboration and compromise, with initiatives coming from developers and carmakers, as well as incentives for all involved. For optimum collaborations and for companies to invest the time to start working with car data, there has to be minimal friction.

As we have seen, data privacy and security are still huge concerns for carmakers. Gaining the trust of carmakers and consumers by proving the technology is fully secure will make collaboration easier going forward. This is already happening, and the case for an open ecosystem is getting stronger by the day.

As things currently stand, the future for connected cars and the services that work in them is looking extremely positive. With the additional support of accelerators, incubators and hackathons, small businesses are getting the support — financially and in terms of technology — that they need to build and test their applications, plus the exposure to major car companies to get their projects off the ground and into cars. Increasing customer expectations are also putting pressure on the carmakers to shorten development and testing time, which again prompts them to promote and support connected car app development.



March 20, 2019  12:53 PM

Future of the automotive industry: IoT-enabled cars

Ankit Patel
Connected car, connected car data, connected cars, infotainment, Internet of Things, iot, IoT sensors, IoT software, Sensors, Traffic management

Imagine a day when a driverless car roams the road with happy, contented passengers enjoying the seamless integration of entertainment, music, road safety, live traffic information and updates, and connectivity to other vehicles. It sounds too good to be true, but this once seemingly unrealistic dream of yesterday is soon going to be the everyday fact of tomorrow.

In an era when connecting people via digital and social media has proved to be a boon, who would not be amazed if their car were connected to them as well as to the other vehicles on the road? Imagine a safe ride with none of the hassles of travel, like getting stuck for hours in traffic and arriving late to the office, and a health alert from the car before the chaos actually happens — motorists will be more than enthralled to incorporate this technical revolution into their vehicles.

Automotive giants across the globe are continuously striving to provide the above-mentioned amenities at affordable prices. IoT in the automotive industry will follow its own course of action, with schemes implemented at both consumer and commercial levels. But the basic attributes of these new ideas address the seven most eye-catching needs of consumers, which the automobile industry should bank on.

1. Safety at its best

Whenever a car embarks on a drive, the first wish of the travelers on board is safety. Connected cars using IoT technology will be intelligent enough to detect probable geographical spots of collision by combining the mechanics of wireless mesh-networked cars with other technologies.

Rearview cameras and proximity sensors also aid safe and judicious driving and parking. A composite technology comprising crash avoidance, like forward-collision warning or lane-departure warning, combined with the necessary backing of communication and damage-control alerts, will be vital in the worst-case scenario of a crash.
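As an illustration of how a crash-avoidance feature like forward-collision warning can be derived from sensor data, the sketch below computes a time-to-collision from the measured gap and closing speed; the threshold and speeds are assumptions for the example:

```python
def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither vehicle changes speed."""
    if closing_speed_mps <= 0:
        return float("inf")   # gap is constant or growing: no collision course
    return distance_m / closing_speed_mps

def forward_collision_warning(distance_m: float, own_speed_mps: float,
                              lead_speed_mps: float, threshold_s: float = 2.5) -> bool:
    """Warn when time-to-collision drops below a reaction-time budget.

    The 2.5 s threshold is an illustrative figure, not a regulatory value.
    """
    ttc = time_to_collision(distance_m, own_speed_mps - lead_speed_mps)
    return ttc < threshold_s

# 25 m gap, own car at 27.8 m/s (~100 km/h), lead vehicle at 16.7 m/s (~60 km/h)
print(forward_collision_warning(25.0, 27.8, 16.7))  # True: TTC is about 2.25 s
```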

2. Crisis identification

How wonderful would it be if we could sense problems before they happen? Though life does not offer such a provision, today’s IoT connectivity gives insights into vehicle health by providing informative inputs before a road journey begins. Be it the exposure of principal parts to wear and tear or the potential breakdown of a crucial component, sensors fitted in the most vital elements will report elementary parameters, like engine temperature and position, well ahead of time to make the owner aware of potential issues. It is financially and practically advantageous to have forewarning of likely events so that they can be avoided.

3. Receptiveness and serviceability

In the unfortunate event of an accident, IoT-enabled cars with collision-recognition abilities can act as senders of life-saving messages, using cellular networks to alert emergency medical services. Moreover, manufacturers can improve customer service by providing relevant apps that are best operated on smartphones. Drivers can get updates and alerts, such as service due dates, the nearest fuel stations, updates on general maintenance of the vehicle and so forth.

Apart from end-user convenience, manufacturers can collect consumer data to improve manufacturing processes. This data, blended with dashboard tools, delivers results like longer lifecycles, lower maintenance costs, faster response times in an emergency and more.

4. Structuring traffic movement

By implementing IoT to monitor traffic control, movement on the roads can be modulated by getting live updates of traffic and area-wide congestion. Destination and in-transit parking spots can be monitored, thus making the driving experience less cumbersome in major cities.

5. Entertainment with information

The present-day technique of selling information with a hint of entertainment has taken shape in mid-range and high-end vehicles by means of a third-party device, like a smartphone connected to a cellular network. Google, Apple and others are providing services by pairing with many known giants of the automotive industry.

Features like maps, music and more are played or operated via integration with an external device, but in the years to come, external connections will become extinct as vehicles offer their own seamless software that will stream in-vehicle packages.

6. Rising need to track

Tracking is a need of every individual, whether a commercial or consumer end user. Knowing where vehicles and their contents are will be critical to track fleets and goods, recover stolen vehicles, monitor parking lots, and even for parents to track the speeding trends of reckless teenage drivers.

GPS-equipped devices can satisfy vehicle safety needs, as well as the regulations and legalities of traffic authorities on state roads, national highways and other major connecting roads. A self-driving car agency based in Delhi can locate its vehicle even in the uneven and rugged terrain of the Leh Ladakh region, and confirm it is abiding by speed regulations. The plugging in and unplugging of such devices can also be reported.

7. Information and technical know-how

“Connected car” is a phrase that is selling like hotcakes nowadays. We have to understand that the concept is not based on a single parameter, but is the outcome of many technological factors intertwined with one another. However, the basic idea is to connect a vehicle to everything possible. The modus operandi is based on two pillars: device to everything and device to device.

In device to device, vehicles are connected to other vehicles, just like two friends chit-chatting in the workplace. Vehicles can also be connected to infrastructure and to pedestrians. Synchronizing traffic signals and pedestrian crossing timings to ensure as much safety as possible can minimize the chances of head-on collisions.

Major players in the automotive industry will try to revolutionize the world with the aforementioned technologies, but it will take at least another two decades to fully back them up with the necessary infrastructure and to gain end-user acceptance of the fact that driverless cars can one day really roam the roads.


