IoT Agenda


December 6, 2018  1:31 PM

How rapid experimentation can improve IoT in insurance

Nick Ford Profile: Nick Ford
"car insurance", Insurance, insurance industry, insurers, Internet of Things, iot, IoT applications, IoT in insurance, IoT services, IoT verticals, rapid experimentation, telematics

As one of the world’s oldest industries, insurance is often seen as traditional or even old-fashioned. Yet it is one of the many industries being redefined by IoT, and insurers need a shift in mindset if they want to advance into the modern era and stay relevant to their customers. Historically, insurance companies have focused on making their core business as strong as possible to mitigate risk; insurance at its core is risk-averse. But the landscape is changing, and IoT in insurance is becoming more widespread.

More insurance technology startups using IoT are popping up and attracting investment from large companies. Last year, insurance technology companies raised $2.2 billion across 202 deals — a record-breaking number. This shows increasing interest in new technologies in the insurance sector, and with money behind these new technologies, they are gaining momentum.

IoT and new opportunities for insurers

New technologies like IoT provide a huge opportunity for insurers to improve their products and the way they interact with their customers. Many insurance companies have already started experimenting with how to implement IoT in their businesses. Car insurance companies are starting to use telematics, the integrated use of communications and IT, to track how much and how well you drive. This allows insurers to reward customers for good driving behavior and to charge less if they drive less. For example, Metromile offers usage-based, pay-per-mile car insurance in the U.S.
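
To make the pricing model concrete, here is a minimal sketch of how a usage-based, pay-per-mile premium might be computed. The rates, the safe-driving discount and the function name are illustrative assumptions, not Metromile's actual pricing.

```python
def monthly_premium(miles_driven: float,
                    hard_brake_events: int,
                    base_rate: float = 29.0,
                    per_mile_rate: float = 0.06) -> float:
    """Illustrative usage-based premium: a fixed base rate plus a
    per-mile charge, with a small discount for smooth driving."""
    premium = base_rate + per_mile_rate * miles_driven
    # Hypothetical telematics incentive: reward drivers with few
    # hard-braking events reported by the in-car device.
    if hard_brake_events == 0:
        premium *= 0.95  # 5% safe-driving discount (assumed)
    return round(premium, 2)

# A customer who drove 420 miles with no hard braking:
print(monthly_premium(420, 0))  # 51.49
```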

Life insurance companies have experimented with incorporating health tracking devices to integrate wellness benefits into their customers’ insurance plans. In fact, the connected health market is predicted to be worth $61 billion by 2020. John Hancock was the first insurance company to offer policyholders discounts of up to 15% for wearing internet-connected Fitbit wristbands, plus rewards for various healthy activities. This smart life insurance is an example of how companies are starting to make insurance more immediate and relevant in the daily lives of their customers.

Another example of IoT in insurance comes from insurers experimenting with data from connected buildings, offering reduced premiums by monitoring utilities to understand water leakage, fire risk or occupancy trends. Many building systems are designed to be purely reactive, like the smoke detector that raises an alarm when it detects smoke. But with this data, insurers are finding that they can perform predictive maintenance, detecting potential problems before they occur and reducing claims. Aviva is integrating Leakbot, a smart connected water leak detector, to tackle water damage by spotting leaks in a home before they have a chance to become big problems.
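
As an illustration of the predictive idea, the sketch below flags a slow water leak by comparing the latest overnight flow reading against a learned baseline. The readings, window and threshold are hypothetical; a product like Leakbot uses its own, more sophisticated detection logic.

```python
from statistics import mean, stdev

def leak_suspected(overnight_flows: list[float], latest: float,
                   sigma: float = 3.0) -> bool:
    """Flag a possible leak when the latest overnight flow reading
    sits well above the home's historical baseline.

    Overnight flow should be near zero in a healthy home, so a
    persistent elevated reading is an early warning sign."""
    baseline = mean(overnight_flows)
    spread = stdev(overnight_flows)
    return latest > baseline + sigma * spread

# Two weeks of normal overnight flow (liters/hour), then a drip starts:
history = [0.1, 0.0, 0.2, 0.1, 0.0, 0.1, 0.2,
           0.1, 0.0, 0.1, 0.2, 0.0, 0.1, 0.1]
print(leak_suspected(history, 1.4))  # True -> notify before damage occurs
```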

In the not so distant future, IoT-connected products will become the norm across all segments of the insurance industry, as providers look to attract and retain digitally savvy audiences. Insurers are starting to take notice, with three-quarters of them seeing financial technology innovations as a challenge for their industry. To adapt to the shifting landscape, insurers need to create connected products rapidly.

As the examples above show, the internet of things is new and shiny and creates opportunities that are vast and valuable. But it is important to emphasize the word “new” here. There is no shortage of impressive, potentially groundbreaking ideas out there, but the technologies are still very novel and therefore the systems are not well-defined, requirements are loose and changing, and ideas are hard to prove.

It is challenging for insurers to understand the value of the data and for their customers to even know what they want. That’s why so many insurance companies are frozen with fear at the idea of taking such a risk without knowing the payoff. Therefore, it is important to constantly experiment and get something into the market quickly without spending a lot of time and money.

Rapid experimentation to the rescue

Insurance companies need to approach IoT applications with a willingness to fail often in order to succeed sooner, which means adopting a development process that allows for rapid, low-cost experimentation. The lightbulb wasn’t invented in a single eureka moment. Instead, Thomas Edison experimented with thousands of different ideas. As he put it, “I have not failed; I’ve just found 10,000 ways that won’t work.”

Budget also plays an important role in insurers’ new mindset around risk. Insurance companies need to fund these projects in different ways and make investments with the understanding that the business value may well be zero and that they may lose money at first. However, if insurers can test these ideas rapidly and at low cost, the risk is alleviated.

To foster this low-cost, high-value experimentation, companies need the right set of tools and processes. Some best practices for fostering rapid experimentation in your organization include:

  • Designate time and resources for this type of experimentation to prove to your organization that this new approach can work, and then scale it widely as a new mindset.
  • Form cross-functional teams that include the business and IT. Bring together a person with a big idea and someone with the technical skills to bring it to life.
  • Use visual, model-driven development to create a common language and allow for faster experimentation and greater collaboration.
  • Create a feedback loop to continuously capture feedback from users that you can take back into the process for continuous innovation.
  • Test a minimum viable product early in the process to ensure the ability to change direction with minimal risk based on what you learn.

It is imperative to get these ideas out there quickly in order to validate or invalidate them. Avoid the “it must be perfect” mindset. Instead, build and deploy a minimum viable product and continue to iterate with feedback from customers to create the best experiences.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

December 6, 2018  10:46 AM

5 questions to ask before piloting AI for IIoT analytics

Prateek Joshi Profile: Prateek Joshi
ai, APM, Artificial intelligence, asset performance management, Data Science, Data scientist, IIoT, IIoT analytics, IIoT data, Industrial IoT, Internet of Things, iot, IoT data

A new industrial age is being propelled by companies wanting their assets to generate more revenue without further investment or infrastructure upgrades. Artificial intelligence and IIoT can make this a reality. With a system intelligently assessing the conditions that affect manufacturing processes, humans can learn and make decisions based on that information.

This allows operations to improve with little or no manual analysis from personnel, leading to lower costs and downtime, and the ability to produce faster, as well as a slew of other benefits.

Sounds good, but there’s more to this.

Large data sets are too time-consuming to process, especially if attempted manually. AI is used to find correlations and the root causes of specific events. Add in a capable asset performance management system, and AI algorithms can offer advanced analytics that deliver a clear view of business outcomes, and even what the future may hold.
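
As a toy illustration of the correlation-finding step, the pandas sketch below ranks sensor channels by how strongly they correlate with a recorded fault flag. The column names and data are invented for the example; real root-cause analysis layers far more on top of this.

```python
import pandas as pd

# Hypothetical historian export: sensor channels plus a fault flag.
df = pd.DataFrame({
    "vibration_mm_s": [1.1, 1.0, 1.2, 3.9, 4.2, 1.1, 4.0, 1.2],
    "bearing_temp_c": [55, 54, 56, 81, 84, 55, 82, 56],
    "line_speed_rpm": [1200, 1190, 1210, 1205, 1195, 1200, 1198, 1202],
    "fault":          [0, 0, 0, 1, 1, 0, 1, 0],
})

# Rank channels by absolute correlation with the fault flag; the
# strongest candidates become the starting point for root-cause work.
corr = df.corr()["fault"].drop("fault").abs().sort_values(ascending=False)
print(corr)
```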

It’s an exciting time and a lot of companies are ready to rush right in. But if AI was simple and success guaranteed, everybody would already be on board. It’s an evolving field, and if not done right, people might walk away with the impression that AI can’t do much or that it’s not effective. Before turning these new technologies loose, consider the following questions.

What are we trying to solve?

Failing to identify the key business pain points to solve is a reason many AI pilots flounder. Even when these initiatives appear successful, they will stall at some point without that grounding. You have to know what you’re trying to achieve and, most importantly, make sure that leadership is fully aware of it. This will enable you to keep working on it despite encountering obstacles. Here are a few examples of goals that grab executives’ attention and commitment:

  • Minimize unplanned downtime: Forecast performance metrics and schedule maintenance to keep operations up and running.
  • Reduce energy costs: Get to the root cause of energy spikes faster. Take advantage of off-peak energy prices.
  • Reduce chemical costs: Purchase and use resources more cost-effectively, such as lowering chemical dosing amounts.
  • Increase efficiency of work orders: Manage dispatches optimally by predicting the performance of assets in advance.

What improvements will be reached?

When pilots succeed but don’t progress, it’s often because results weren’t as powerful as anticipated. The fact is that results can still be positive even when performance improvements aren’t obtained, as long as a clear reason is determined for why they didn’t materialize.

The challenge is to find a project with which everyone feels comfortable. Getting some kind of pilot off the ground just to get an evaluation started is actually reasonable. This is where concrete, meaningful improvement goals become important. Your solution provider should lead this charge since they know what’s possible.

What access to data will you have?

When it comes to data, three key aspects make up the backbone of an AI project: quantity, quality and access. AI projects use historical data to train algorithms to predict future outcomes. The more data, the better. It may not all come into play, but data scientists will want to tease out any and all correlations and look for causal effects, so having access to this data is crucial.
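
Here is a minimal sketch of that train-on-history, predict-the-future loop, using scikit-learn with invented sensor features; the columns, labels and model choice are placeholders for whatever a real project would use.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical historical records: [vibration, temperature, load],
# labeled 1 when the asset failed within the following week.
X = [[1.1, 55, 0.7], [1.2, 56, 0.8], [3.9, 81, 0.9], [4.1, 83, 0.9],
     [1.0, 54, 0.6], [4.0, 82, 0.8], [1.2, 57, 0.7], [3.8, 80, 0.9]]
y = [0, 0, 1, 1, 0, 1, 0, 1]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(model.score(X_test, y_test))      # accuracy on held-out history
print(model.predict([[4.2, 85, 0.9]]))  # risk call on new readings
```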

Even so, while less data poses challenges, project goals can still be met. Even gaps in data — such as a lack of one or more sensor inputs — can be overcome. It’s important to know what you have to work with, so bring in a data science team to conduct an investigation before beginning.

Do we have data scientists and subject matter experts?

It’s important to have strong collaboration between data scientists and subject matter experts (SMEs) who understand the process to be optimized. Without this, the project will likely fail. Some solution providers have good AI expertise; others have SMEs. These types of projects require a combination of both.

How do we proceed?

There are a lot of approaches you can take to evaluate and execute a plan. Do you involve an analytics company if you have your own SME? Should a consulting engineering firm organize the project? Do you get a one-stop solution provider to do the whole thing?

All of these are viable options. The key is to know that the analysis can be done, and access to historical and near real-time data is crucial.

Data analysis should be completed and vetted up front. Your team or provider must be able to tell you, within certain limits, that you’ll get the prescriptive recommendations necessary to meet your project goals. If a significant payment is needed before any analysis occurs, you could be funding someone else’s learning curve.

Improving your workflows is a process. The key is to be realistic, patient and persistent.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


December 5, 2018  3:25 PM

The (secure) modern office connection in a mobile world

Christoph Ruef Profile: Christoph Ruef
connected devices, connected office, Internet of Things, iot, IoT devices, iot security, smart office, smart workplace, Workplace

It’s clear that our lives are getting smarter — just ask Alexa.

From predictive mapping to intuitive temperature control to refrigerators that automatically order your grocery list, we’ve become accustomed to connectivity in our personal lives. The internet of things has been a revolutionary driving force in the way we consume entertainment, participate in the consumer economy and simply exist on a day-to-day basis.

This new dawn of interconnectivity does not stop when we leave the house in the morning, however. Increasingly, IoT is presenting companies with chances to make unparalleled strides in creativity, productivity and collaboration by connecting employees to their surroundings, and one another, in ways unlike ever before.

The modern office, after all, is no longer built around four walls. The rise of mobile computing means that critical elements of the workday happen curled up on the couch at home, huddled in the corner of a local coffee shop or co-working center, or even in the back of an Uber or on an airplane. For these reasons, it is vital for companies to understand the ways IoT is already affecting the modern workplace, and how it will continue to impact the future of work.

Mobility: Not just a millennial trend

It is important to put this trend in perspective — while we often associate the rise of smartphones and connected gadgets with millennials, in reality, workers across generations have been adopting these devices as hubs for generating and consuming content for almost two decades.

In fact, a study done by Ryerson University estimated that more than 70% of Canadian workers are mobile in some capacity. As advancements in the design of operating systems encourage even greater mobility, we now recognize the modern worker as someone constantly on the move — unfettered by age or office walls.

To truly reap the benefits of IoT in the workplace, it’s vital to understand the importance of the cloud and how it can serve us at work. An estimated 23 billion connected devices already exist in the world (a staggering number that is steadily increasing). Investing in cloud infrastructure allows businesses to offer more innovative capabilities to their customers through an unprecedented level of insight into their needs. As a result, they are more competitive than peers who do not invest in such customer insight and service levels.

For example, we’ve enabled IoT capabilities in printers that ensure customers never run out of ink, as the printer is cloud-connected and will reorder supplies before they run empty. We are able to perform sophisticated remote diagnostics on cloud-connected office printers, auto-deploy self-healing capabilities and inform service partners about parts to be replaced before they actually wear out. This increases uptime for customers and cuts service costs for partners.
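
The reorder logic behind such a printer can be surprisingly small. Below is a hedged sketch of the idea; the telemetry fields, threshold and ordering function are hypothetical stand-ins for a vendor's actual supply service.

```python
# Hypothetical telemetry payload reported by a cloud-connected printer.
telemetry = {"device_id": "prn-0042", "ink_pct": 11, "pages_per_day": 60}

REORDER_THRESHOLD_PCT = 15  # assumed: order while ~a week of ink remains

def place_supply_order(device_id: str) -> None:
    # Stand-in for the vendor's real ordering API.
    print(f"Ordering ink for {device_id} before it runs out.")

def maybe_reorder(reading: dict) -> None:
    """Place a supply order before the cartridge runs empty."""
    if reading["ink_pct"] <= REORDER_THRESHOLD_PCT:
        place_supply_order(reading["device_id"])

maybe_reorder(telemetry)
```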

Securing an expanding network of connected devices

It is important to note that while we have access to more information than ever, utilizing IoT in this way requires a thoughtful approach to data management and security.

With more and more connected work devices (smartphones, PCs, tablets), security is a growing concern — one which, so far, has been underestimated. Security should absolutely be at the forefront of any company increasing their reliance on the cloud. Any device that connects to a corporate infrastructure poses a potential security threat, putting both companies and clients at risk, which is why cloud and endpoint device security is of the utmost importance.

Considering that most companies simply don’t have this kind of security thinking in their DNA at the moment, the most prudent course of action is for dedicated IT workers to team up with professionals who specialize in device-as-a-service models. This can provide valuable insight into what kinds of devices to use, how to secure them and how to optimize the infrastructure from a cost perspective, all while staying ahead of threats in an ever-growing hacking landscape. A holistic mindset is important here, and senior management should work with IT professionals to make technology purchase and deployment decisions. After all, IT security threats put the whole brand at risk.

Looking to the future, employers who want to be on the frontline of the connected workplace will need to compete for talent and appeal to a new generation of workers already used to a mobile-heavy environment. That effort, however, needs to be balanced with a thoughtful and realistic approach to integrating IoT in a productive, secure and creative fashion. Embracing the latest technology is important, but only when it can be integrated safely and effectively. That sense of authenticity and attention to detail will serve businesses in the long run and appeal to the growing talent pool of millennials and Generation Z.

As the workplace of the future evolves, making a mindful shift toward practices that support this new technology and mobile work environment is imperative. Being at the forefront of the evolving office of the future will reap great benefits for your employee base and your brand reputation down the road.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


December 5, 2018  10:23 AM

Building the predictive maintenance 4.0 era with cognitive-first models

Ruban Phukan Profile: Ruban Phukan
anomaly detection, Artificial intelligence, Cognitive computing, cognitive machine learning, Data Analytics, Data Science, Enterprise IoT, Internet of Things, iot, IoT analytics, IoT applications, IoT verticals, Machine learning, Predictive maintenance

Predictive maintenance has come a long way since the ’90s. Over the past decade, IIoT and the evolution of analyzing sensor information has pushed organizations to look at new ways to use data to understand machine health.

A new wave of data sets is being generated by and collected through a new generation of connected machines and sensors. With them, organizations are now able to make better and informed operational decisions; perform timely maintenance to minimize failures, unplanned downtime and associated risks; cut costs; reduce scraps and rework; improve productivity and quality; manage inventory and resources better; and deliver an overall superior customer experience.

Recent advancements in artificial intelligence and machine learning applied to IIoT enable organizations to track multiple components of heavy machines and identify highly localized, contextual signals that pinpoint potential causes for concern ahead of time, preventing catastrophic events. With deeper insight into machine performance, minute fluctuations, changes and errors that were once undetectable by the human eye can now be observed, recorded and analyzed in real time to detect potential business threats.

We see more organizations tapping into IIoT’s potential to predict equipment failure. In fact, IIoT-driven digital transformation is no longer considered just a strategic advantage, but an essential survival tactic for the highly competitive business environment of the future. But there are still many hurdles organizations have to overcome to truly enjoy the promise of predictive maintenance.

Industrial IoT: The data challenges

Despite enterprises viewing IIoT as a game-changer, many still struggle to harness it effectively to get the optimal performance out of their machines.

First, legacy machines can hinder businesses from capitalizing on the IIoT boom. Intelligent sensors can be added to bridge the gap, but it is important to have the key components properly instrumented to capture critical signals and anomalies that indicate future problems.

The second challenge lies in how organizations analyze the information they collect. IIoT gives businesses all sorts of unique information about how their machines function. This is great in theory, but that information has to be mined out of the raw data. Applying old, traditional approaches to data analysis in this new world of IIoT won’t tap its true potential.

Many teams take a machine learning approach called supervised learning, which builds models trained on historical failure or fault data. Other teams simply write rules to generate alerts and notifications when past failures or faults are primed to recur. But these approaches only accentuate what many businesses are already good at: learning from historical problems and putting processes in place to prevent them from repeating. However, only about 20% of the problems that occur in the field are repeat problems. The other 80% are new and unknown to most industries. Unless the signs of these problems are detected early on, predictive maintenance won’t yield the desired results.
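
One way to catch some of those unknown, never-before-seen problems is unsupervised anomaly detection, which needs no failure labels at all. Here is a minimal scikit-learn sketch using an isolation forest; the sensor data is invented, and a production system would add far more context.

```python
from sklearn.ensemble import IsolationForest

# Unlabeled operating data: [vibration, temperature]. No failure
# history is required -- the model learns what "normal" looks like.
readings = [[1.1, 55], [1.0, 54], [1.2, 56], [1.1, 55],
            [1.0, 55], [1.2, 54], [1.1, 56], [1.0, 55]]

detector = IsolationForest(contamination=0.1, random_state=0).fit(readings)

# A pattern never seen before is scored as an outlier (-1),
# even though no rule or label for it ever existed.
print(detector.predict([[1.1, 55], [6.3, 97]]))  # [ 1 -1]
```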

Cognitive-first strategies to service the predictive maintenance 4.0 era

Predictive maintenance consists of collecting data on the condition of machinery so that engineers can foresee possible outages or failures before they happen. Predictive models require large amounts of performance and condition history to learn from before they can make accurate forecasts about equipment.

Time may also be needed to analyze live data in order to establish normal performance baselines for each asset individually, based on its operating and environmental conditions. To make predictive maintenance work as a practical tool, months of data analysis and investment in data scientists and a production data engineering team are often required to build and update models. More often than not, the scale of these analyses makes it impractical to do manually, especially in a production environment.

To make the most of predictive maintenance, companies can combine connected technologies with cognitive machine learning. The cognitive approach will allow them to automatically find data patterns or groupings at scale that otherwise would be hidden and would have been humanly impossible to unearth in time to prevent major issues.

By running IIoT data through cognitive anomaly detection and prediction, businesses are better equipped to identify the unknown variables responsible for many of the defects that lead to recalls or failures among that 80% of new events. This meta-learning approach increases prediction quality and accuracy by about 300%, and requires 3% of the time and resources to get up and running compared to other approaches to data model development.

Promising use cases

Organizations in the manufacturing, oil and gas, automotive, aviation, transport and logistics, and energy and utilities spaces are starting to lead the way in this field.

The work we are doing with leading global organizations in these industries shows that the new generation of predictive maintenance can boost overall equipment effectiveness by up to 90%, reduce unplanned downtime by 20%, optimize inventory costs by 35% and extend asset life by 25%.

We recently helped a Fortune 50 company identify 66 times more anomalies in its commercial HVAC systems using the cognitive approach compared to its traditional approaches, thereby preventing downtime. This translated into increased customer satisfaction.

Gartner has forecast that there will be 20.8 billion connected things worldwide by 2020. Organizations that stick to an old preventive data methodology won’t be able to uncover the potential of all these connected things and will likely be left behind. Using cognitive-first models is what will make the difference.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


December 4, 2018  3:53 PM

Data cognition engines can save millions in big data costs

Guy Levy Yurista Profile: Guy Levy Yurista
ai, Artificial intelligence, Big Data, Big Data analytics, Data Analytics, data cognition engine, data set, internet of thing, iot, IoT analytics, IoT data, IoT devices, Neural network

In business, as in life, a pursuit of perfection often results in exponential increases in effort to achieve only minimal increases in quality.

With this in mind, we determine which tasks require the highest level of quality and which do not. Raising my children is an area where I strive for excellence, and it requires my continual focus. The emotional reward obviously justifies the significant investment, and I am grateful for this opportunity daily. On the other hand, keeping the lawn mowed takes up only as much thought as is required to get the lawnmower running and push it around the yard every other week while my mind wanders aimlessly.

Most of what we do falls somewhere between those two extremes. And part of being a mature, effective professional, and human being, is to maximize our effectiveness by figuring out the minimum quality of output required for each task, achieving it and moving on to the next task.

The same applies to big data: costs increase exponentially to achieve minimal increases in accuracy.

A new class of high-performance analytics based on data cognition engines takes a different approach to big data, focusing on answers that are “near perfect” and reducing costs by hundreds of thousands of dollars.

This new AI-driven approach allows the neural network at the heart of the data cognition engine to “learn” data sets and provide results that are more than 99% accurate, without any ongoing access to the data itself, while reducing the data set size by many orders of magnitude.

Creating a new ‘brain’

A good analogy to how this new category of data cognition engines operates is that of a young child who speaks no English. Now take that child, teach her the language and then give her only one book to learn, say Harry Potter and the Philosopher’s Stone, and ask her to read it several times. At the end of this process, the child will become an expert in the narrative of the book and will be able to answer questions relating to the book with great accuracy even if she cannot quote it verbatim.

Similarly, data cognition engines teach a neural network a new language — in this case SQL, the language used to retrieve information stored in databases. Next, this new SQL-speaking “brain” is asked to read only one book — a specific data set — many times and study it well. At the end of the process, we are left with an extremely light brain that can answer any question relating to the data set without the need to retain the data set or consult it ever again.
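
To make the “brain that answers without the book” idea concrete, here is a heavily simplified sketch: a tiny neural network is trained to reproduce the answers to a parameterized aggregate query, then answers from its weights alone. This illustrates the general concept only; it is not the vendor's actual technique.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Pretend "data set": daily sales for 365 days (the book to be studied).
rng = np.random.default_rng(0)
days = np.arange(365)
sales = 100 + 0.5 * days + rng.normal(0, 5, 365)

# Training pairs: query parameter (window start day) -> true 30-day
# average, computed while we still have the raw data.
starts = np.arange(0, 335)
answers = np.array([sales[s:s + 30].mean() for s in starts])

brain = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                     random_state=0).fit(starts.reshape(-1, 1), answers)

# The raw data can now be discarded; the model answers from memory.
print(brain.predict([[100]]))  # approximate 30-day average from day 100
print(answers[100])            # ground truth for comparison
```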

Now, here is the fascinating part. A child would likely need to spend a few years in school learning English to the level of proficiency needed to read, understand and absorb Harry Potter’s richly imaginative magical universe. In contrast, the data cognition engine’s brain can achieve the equivalent of this task in just a few hours!

Next, this new brain no longer needs to have the book next to it. Moreover, since it is software-based, it can easily be cloned and spread around the edge of the network, running on IoT devices at incredible speed and extreme levels of accuracy.

This technology isn’t science fiction — it already exists and is being used in real life.

An electronics manufacturer that deployed a data cognition engine managed to replace a 6 gigabyte data set (2 billion records with hundreds of columns, which required 30 minutes to query) with a 2 megabyte data cognition engine. This data cognition brain can answer questions and provide its “hunches” at 99% accuracy or better within 0.1 millisecond. That represents a 3,000x reduction in storage needs and an 18,000,000x speed improvement.

Returning to our Harry Potter analogy, imagine replacing the 1.2-pound book with a 0.02-gram brain — equivalent to the weight of a small snowflake. And instead of taking, say, a month to read the full 422 pages, the new brain can provide an answer about the narrative in a fraction of a second — literally, the blink of an eye.

Bringing secure supercomputing to the edge

Data cognition engines have a number of disruptive implications for IoT. They put the insights of tens of billions of rows of data into a sensor, a phone or a wearable device, offering the power of big data in a small, portable, cost-effective and secure IoT package.

The applications for manufacturing — where a sensor can be making quality control decisions formerly handled by an experienced human, based on the power of gigabytes and terabytes of data — are already manifesting. However, this is just the tip of the iceberg.

Imagine that your wearable fitness device with an integrated EKG reader — something that exists now — can look at your readings and compare them to the readings of everyone wearing the device with a similar health profile (age, weight, etc.) across the country or the globe. It can then compare those readings against actuarial charts and health data. Using those two massive, complex data sets — which will reside securely in the small amount of memory in the device through a data cognition model — it could alert you to a serious, undiagnosed health issue.

Because a data cognition engine doesn’t need ongoing access to data after it learns a massive data set, it can offer major benefits around data privacy and security as well. Most use cases do not require regurgitating row-level data from massive data sets in order to provide insights. For the majority of use cases, data cognition engines also present a secure system that can’t be hacked or reverse engineered to reveal sensitive personal information — because the sensitive information just isn’t there to steal.

No replacement for perfection

In data, as in life, there are times when only perfection will do.

The vast majority of big data use is focused on extracting insights from massive data sets, answering questions like “By what percentage are sales up in APAC regions across the last three quarters?” and “Who are the people most likely to purchase a dog during the month of February?”

However, there are times when row-level accuracy is a must. To be sure, when you need to know one of your customer’s credit card numbers or their exact bank balance, cognition engines are not the right tool for the job. This is when you’ll need to go with big iron and crunch the really big data for that one specific piece of information.

In contrast, data cognition engines will help us focus on the tasks where accuracy — delivered securely, inexpensively and quickly — is better than slow perfection. And, with 99% accuracy, the line between those two classifications is blurring.

Many business cultures push for perfection in everything we do. In my experience, however, the people who are most effective are those who understand when near-perfect is perfectly acceptable. For those who reach this realization in the world of data, there are now major rewards to be gained and advances that will change how human beings access, and benefit from, data insights.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


December 4, 2018  12:50 PM

The nightmare of selecting an IoT ecosystem

Francisco Maroto Profile: Francisco Maroto
Cloud platform, Cloud vendors, Digital ecosystem, ecosystem, industrial internet of things, Internet of Things, iot, IoT connectivity, IoT ecosystem, IoT partners, IoT partnerships, IoT platform, Professional services

In recent years, I have heard my fill from analysts, who have flooded me with optimistic predictions, about the importance of ecosystems to making the promises of IoT come true. All, or at least most, of those who read my articles know that no company in the world, no matter how great, can do everything in IoT — ecosystems are the key to success in this business. An ecosystem allows organizations to achieve a multiplier effect and a trusted environment.

An IoT ecosystem can be either horizontal (technology) or vertical (industry), and it requires a lot of talented alliance managers to maintain win-win transactions over time. But selecting an IoT ecosystem is not an easy task. In IoT ecosystems there are fights between partners, and sometimes abuses of little partners by big ones. There are also conflicts with companies that belong to several ecosystems, sometimes with contradictory interests. It is very common for partners to collide with the objectives of the ecosystem — and you can imagine the betrayals and backstabbing. For instance, with IBM’s acquisition of Red Hat, will the open source IoT architecture designed by the Red Hat, Eurotech and Cloudera ecosystem still be a good decision?

In a 2015 post, I presented several successful cases of collaboration among members of IoT ecosystems. But let’s be honest, there have been few references and examples since. The fragility of alliances in IoT is a challenge to accelerating IoT ecosystem adoption.

IoT ecosystems

In a 2013 Harbor Research article, the analyst firm wrote that no significant ecosystem or network of collaborators had emerged in the IoT arena, despite early and interesting efforts by several players. In 2013, these ecosystems were emerging alliance developments that had not attained the scale, scope and momentum required to drive the opportunity to its intended and expected state. Most attempts thus far to build an ecosystem advantage have failed to scale and reach critical mass, which underscores how challenging building a high-velocity network of partners can be.

In this article, I will focus my analysis on four examples of IoT ecosystems that represent a big portion of the value chain in the multiple IoT submarkets: IoT connectivity providers, IoT cloud platform vendors, IoT professional services and IoT solution aggregators.

Telefonica: IoT connectivity ecosystem

One of my first attempts to monetize IoT services was through the Telefonica IoT solution partners program four years ago. At the beginning, I received a couple of calls from the operator to help me create my account and describe my services. There were many partners, and although the partner search portal left a lot to be desired, I did not see much competition for my services and thought it would be the perfect accelerator. I was wrong. Since I registered, I have not received a single invitation to participate in a partner event, nor has anyone contacted me to request my services, nor have I needed the portal to contact any partners.

How are you going to find me as an IoT solution partner if Telefonica’s IoT webpage does not offer a link to the partner search page? The page itself is outdated and frustrating to use, with duplicate names, defunct companies and so forth.

Telefonica identifies three types of partners:

  1. Operator alliances: Telefonica partners with other Tier-1 telecom operators, including the IoT World Alliance, China Unicom, Sunrise and Avea, in order to provide IoT customers with seamless services worldwide and lower costs.
  2. Channel partners: Telefonica allows partners to drive growth and differentiate their businesses by reselling its global managed IoT services. It helps increase their capabilities, enabling deployment on a global scale, particularly in regions such as Europe and Latin America.
  3. Solutions partners: Telefonica’s solutions partners ecosystem consists of a global network of IoT providers with functional or industrial expertise, including IoT device providers, IoT system integrators and IoT industrial experts.

I never liked Telefonica’s orientation toward quantity in partners (around 1,000, including duplicate names on an outdated list) instead of quality, and I think the results have been, and remain, very poor — clearly a point to improve if IoT is to take off within the operator.

Talking with Telefonica IoT, you will quickly recognize that if you are not Microsoft, AWS or a similar company and unless you bring business to it, you will never get business from it. Telefonica does not lead any IoT ecosystem, neither geographically, industrially nor technologically. It is just one more logo (important of course) in many presentations of IoT vendors.

I cannot understand its win-win strategy and go-to-market regarding IoT platforms. In addition to its own platform, Telefonica appears as a partner of Microsoft Azure, PTC ThingWorx, Software AG Cumulocity, AWS IoT, Cisco Jasper, Libelium and more. Maybe it should select partners based on share of outcome rather than share of investment if it wants to lead an ecosystem. Pecking is good for the birds. Telefonica needs an open-minded company culture to become comfortable with an ecosystem structure.

Microsoft: IoT cloud platform vendor ecosystem

Having worked at Microsoft, I have been tempted to become an IoT partner. But because my business model is based on vendor independence, I chose instead to help other companies enter the Microsoft IoT ecosystem. This year, I became convinced that I needed to change my approach: instead of becoming a partner, I decided to convince two other Microsoft partners, strong in the complementary disciplines of business intelligence and cloud, to create a specific area for IoT. I have not succeeded, which makes me think that despite the efforts and investments planned by Microsoft, partners do not yet see the IoT business clearly.

The list of IoT trusted partners certified on Microsoft Azure is impressive, and I recognize Microsoft’s effort in building an IoT ecosystem that fuels business transformation. Its largest partnership is with GE Predix, and its partnership with PTC will help industrial customers accelerate their digital transformations by adopting IoT.

In this case, finding a Microsoft Azure IoT partner is easier than with Telefonica. Microsoft’s IoT partner categories include devices, gateways, security, independent software vendors, networks, telecommunications and system integrators. By the way, there are no partners in Spain according to the site — maybe it’s the right time to invest.

Microsoft is an expert in identifying, nurturing and managing partners, and Azure IoT is a great opportunity to lead IoT ecosystems.

EY: IoT professional services ecosystem

EY is a leader in the IoT space. Not involved in the construction of devices themselves, EY instead helps organizations navigate the largely uncharted waters of IoT.

While working on an engagement with EY IoT, I read a Forrester Research report that segmented the landscape of IoT professional services firms based on their functional capabilities to help enterprises deploy IoT-enabled processes, their vertical market focus and their geographic reach. Based on the service offerings, vertical capabilities and characteristics of a broad array of professional services firms, Forrester identified eight categories. The major players in the consulting firm segment included Deloitte, EY, KPMG and PricewaterhouseCoopers. These strategic consulting firms combine strong business strategy capabilities with the ability to execute on digital transformation initiatives. The report clearly showed EY had strong IoT capabilities across the globe. EY was also recognized as an IoT services leader by HFS Research.

Initiatives like launching a global IoT/OT Security Lab to help clients stay ahead of emerging threats, and launching EY wavespace, a global network of growth and innovation centers to help clients achieve radical breakthroughs, have helped EY demonstrate its strategic alliances with SAP, GE Digital, Microsoft, IBM and Cisco. These technology vendors rely on EY to implement IoT systems at large customers with business-driven approaches.

Do not expect EY or any of the consulting firms to lead an IoT ecosystem. Their role is to use their business strengths and client relationships to empower the ecosystems to which they belong.

Tech Data: IoT solution aggregator ecosystem

Perhaps the most complex task I have undertaken while advising on IoT ecosystems was with one of the largest IT distributors in the world, Tech Data. The challenge of balancing players like Microsoft, Dell, Cisco, IBM, Schneider and Vodafone with innovative startups across several industry verticals and different use cases, without anyone feeling neglected, was very exciting. Finding the right place for Tech Data in the IoT value proposition schema was another challenge. It was great helping the company define its role as an IoT solution aggregator and decide which partners should be included.

Tech Data has been able to demonstrate how to become useful to IT and OT vendors, and how to provide value to existing and new IIoT system integrators worldwide. I have always believed this approach could make it easy for small and medium-sized clients to adopt IoT quickly.

I did not have time during my engagement with Tech Data to analyze and support the launch of its new business model, but I am sure it will use its position to offer new services based on data aggregation.

Education, the latest products, support services and a firm footing in the B2B world put IoT solution integrators at the center of the IoT craze.

Key takeaways

The IoT market is still in early stages. Enterprises face many different options for IoT partners and suppliers. Choosing the right ecosystem is critical not only for a successful IoT project implementation, but for the digital transformation journey. IoT ecosystems need to understand that most industries thrive on coopetition — it’s important to become cognizant and respectful of competitors, as they may also be your potential partners.

Just like with people, when it comes to IoT businesses, no two ecosystems are alike. I have been helping different types of companies build or enter the most suitable ecosystem, and I have no doubt that only the best ecosystems will survive; the challenge is to rank among so many. It is really a nightmare.

The topic of ecosystems is hardly new, but is rapidly evolving. If ecosystems are able to use shared data and information from intelligent sensors, machines and assets, radical new models of value creation will emerge.

Thanks in advance for your likes and shares.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


December 4, 2018  10:11 AM

New IoT security laws and proposals are a good start, but we need to do more

Nisarg Desai Profile: Nisarg Desai
Cybersecurity legislation, encrypted communications, Internet of Things, iot, IoT botnet, iot security, Mirai, securing IoT, security laws, Security legislation

The internet of things has kicked into high gear. For someone like me who deals with IoT security on a daily basis, it is truly exciting to both witness and be a part of such a transformative force in our lives. But IoT’s powerful and dynamic nature notwithstanding, its still-nascent existence creates a powerful lure for hackers eager to exploit its weaknesses and wreak havoc.

Proven vulnerability

Some of the most infamous attacks include a 2013 hack on a baby monitor, the 2015 remote takeover of a Jeep Cherokee (by ethical hackers, in that case), the Mirai botnet webcam attack in 2016, the first serious ransomware attacks on medical devices last year and, of course, the Equifax breach.

The financial toll of these incidents has been significant:

  • In 2016, the Ponemon Institute estimated the average cost of a data breach was more than $7 million.
  • The massive Equifax breach may ultimately cost the company $439 million.
  • This year, Facebook experienced two spectacular security incidents. The stunning revelations sent Facebook’s stock price tumbling, and CEO Mark Zuckerberg himself lost nearly $11 billion (he also had to testify in front of a Congressional panel — a must-watch if you haven’t already).

And these are just the consumer-facing cyberattacks; hackers have also been busy executing campaigns with potentially higher stakes. Among the most infamous are the attacks on the Ukrainian energy grid, which could have devastating implications if someone is cunning enough to execute a full-on attack. Large populations could wind up literally powerless for days, weeks or even longer.

Earlier this year, the U.S. government accused Russia of attacks on our power plants and water and electric systems in the period leading up to our recent presidential elections. Evidence traces activity as far back as 2011, but the strikes intensified in late 2015. A Department of Homeland Security report suggested that Russian hackers accessed critical control systems, though no actual sabotage or shutdowns of plant operations were recorded. Evidence suggests, however, that the hackers retain the ability to shut down operations if they so choose. Whether their decision to let operations continue uninterrupted was intentional is unknown. Either way, it was a clear message about the dangers hackers pose to critical infrastructure.

The demonstrated vulnerability of IoT has many rattled, especially lawmakers, who are increasingly pushing bills to prevent hacks, or at least limit them.

Governance

California was the first state to pass a cybersecurity law (SB 327) that covers smart devices. According to the law, as of January 1, 2020, any manufacturer of a device that connects “directly or indirectly” to the internet must equip it with “reasonable” security features, designed to prevent unauthorized access, modification or information disclosure. It also mandates that connected devices come with unique passwords that users can change, which isn’t the case for many IoT products today.

However, the new law doesn’t address the ensuing communication that takes place between the device and the gateway and/or the cloud. That communication must be kept private to ensure data integrity and information security, which is typically implemented via the use of encrypted communications.
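
For a sense of what encrypted device-to-gateway or device-to-cloud communication looks like in practice, here is a minimal sketch using MQTT over TLS with the Eclipse paho-mqtt Python client (1.x-style API). The broker hostname, topic, credentials and certificate path are placeholders.

```python
import json
import paho.mqtt.client as mqtt  # pip install paho-mqtt (1.x API shown)

client = mqtt.Client(client_id="leak-sensor-01")

# TLS protects the hop from device to gateway/cloud; the CA file
# pins which broker certificates the device will trust (path assumed).
client.tls_set(ca_certs="/etc/iot/ca.pem")
client.username_pw_set("device01", "per-device-secret")  # placeholders

client.connect("broker.example.com", port=8883)  # 8883 = MQTT over TLS
client.publish("home/sensors/leak", json.dumps({"flow_lph": 0.1}))
client.disconnect()
```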

Still, the law is a positive defensive step aimed at providing fundamental security. If other states follow California’s lead, we could see more legislation obligating manufacturers to adopt device identity at the point of manufacture, offering a basic level of security that is absent in current IoT environments.

In addition to the California law, the Internet of Things Cybersecurity Improvement Act, introduced by Sens. Mark R. Warner (D-Va.) and Cory Gardner (R-Colo.), would use the federal government’s buying power to boost IoT security. Under the bill, any companies that do business with the federal government would have to ensure that their connected devices are patchable, come with passwords that can be changed and are otherwise free of known security vulnerabilities. Another bill, the Securing IoT Act, would require the Federal Communications Commission to create cybersecurity standards for certifying wireless equipment.

Do more

It is encouraging that U.S. legislators are considering bills to improve the security of IoT; it’s a good start. Still, we have to do more. I have found that, on the whole, most government agencies and the individuals who operate them are woefully ill-informed when it comes to technology. This was painfully evident during the April 2018 Senate hearings on a major Facebook security breach, where the personal account information of over 50 million Facebook users was exposed. Lawmakers responsible for drafting legislation to protect personal data privacy had little to no understanding of the social media platform’s basic functioning, let alone the technology behind it.

IoT professionals, myself included, recognize that IoT technology can be equally, if not more, confusing than social media. Efforts are underway to help legislators and the public gain a better understanding of IoT issues and technology so they can better manage their own data privacy and the data/device privacy and integrity of those they serve. Groups like the Industrial Internet Consortium, where GlobalSign is an active member, are working on communications aimed at clarifying the technology and issues, as well as offering potential solutions to address IoT security. One example is the Industrial Internet Security Framework (IISF). In addition, I strongly encourage interested parties to comment on a great initiative from NIST, “Considerations for Managing IoT Cybersecurity and Privacy Risks.”

It is our hope that by publishing information that is easily digestible, and by working with legislators so they more fully understand the ramifications of their legislative actions, we can all contribute to establishing and improving governance around IoT security. Close collaboration will help produce laws that are informed and effective without overreaching, and that can effectively secure IoT from a potentially crippling attack.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


December 3, 2018  3:17 PM

New business opportunities for using WebRTC and IoT

Serge Koba Profile: Serge Koba
augmented reality, Authentication, Consumer IoT, Enterprise IoT, IIoT security, Industrial IoT, Internet of Things, Interoperability, iot, IoT applications, IoT data, iot security, WebRTC, WebRTC applications

Gartner projects that the number of devices connected to the internet of things will surpass 20 billion by 2020. IoT devices are increasingly used as smart household devices, beacons and wearables, and in industrial applications. Businesses are discovering new opportunities to engage their customers, and the ability to add communication features is opening doors for new marketing opportunities.

Data gathering and machine-to-machine communication are only the first phase of IoT integration, however. Web Real-Time Communication (WebRTC) is the next trend in IoT functionality, focusing on two-way communication between users. This communication can be video, audio or a combination of the two, and it is encrypted for maximum security.

What is WebRTC?

WebRTC debuted as an open source offering from Google in 2011. It lets developers build video and audio communication between users, connecting IoT devices via a web browser rather than through special plugins.
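
As a plugin-free illustration in Python, the sketch below uses the third-party aiortc library (an assumption for this example; browsers expose the same concepts through JavaScript) to create a peer connection and generate the SDP offer that starts a WebRTC session.

```python
# pip install aiortc -- a Python implementation of WebRTC.
import asyncio
from aiortc import RTCPeerConnection

async def make_offer() -> str:
    pc = RTCPeerConnection()
    # A data channel for device telemetry; media tracks work similarly.
    pc.createDataChannel("telemetry")
    offer = await pc.createOffer()
    await pc.setLocalDescription(offer)
    # The SDP is sent to the remote peer over any signaling channel.
    sdp = pc.localDescription.sdp
    await pc.close()
    return sdp

print(asyncio.run(make_offer())[:80], "...")
```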

How does WebRTC work together with IoT?

WebRTC can offer multiple features toward the development and enhancement of IoT applications:

  • Mobile-to-mobile connections: WebRTC isn’t restricted to being used with laptop or desktop browsers, as it possesses native libraries compatible with iOS and Android.
  • Machine-to-machine connections: WebRTC can connect a wide variety of IoT devices, providing communication functionality.
  • HD video communication: WebRTC allows encrypted streaming of both audio and video data across a wide range of supported web browsers.
  • Phone-to-browser connections: WebRTC is able to facilitate connections between browsers and public switched telephone networks, allowing for calls.
  • Messaging and file sharing: WebRTC removes the need to use data centers or the cloud as temporary data stores, instead transmitting data directly from one source to the other.

Business use cases of WebRTC and IoT

As of 2016, WebRTC-enabled products had created more than $10 billion in value worldwide, and projections have the market reaching $81.52 billion by 2025. In terms of WebRTC’s global evolution, North America holds around 40% of the total market share due to mobile device proliferation and access to fast and reliable internet connections.

Prospective use cases at the intersection of WebRTC and IoT include smart home, security and industrial IoT applications.

WebRTC enhancing smart home devices with video and audio connections

Smart home applications are the area where WebRTC can potentially make the most lucrative inroads in the near future. Two prominent examples are smart mailboxes and door intercom devices. With smart mailboxes, homeowners who work during the day can unlock their mailboxes from anywhere while communicating with couriers to receive or send deliveries. With door intercom devices, residents can speak with anyone who comes to their door from the privacy and security of inside their homes — or from virtually any place, which provides additional convenience.

Recent examples include Amazon’s Motion Sensor API, Doorbell Event Source and Alexa.RTCSessionController, which connect smart cameras and doorbells to Echo devices. Smart doorbells can send messages to Echo devices, and the Alexa.RTCSessionController can set up a two-way connection with smart displays like the Echo Show and Echo Spot. Integrating WebRTC with home communication devices lets users stick with their favorite smart products while enhancing them with video and audio streaming features.

WebRTC for surveillance and authentication

WebRTC can also be used in video surveillance systems, which typically fall into two categories: cloud and point to point. The latter has a number of issues, including unstable video transmission, the inability to take real-time action and reliance on local data storage, which makes loss of recorded video a more acute risk. This is where the cloud becomes the answer, storing well-structured sets of video recordings from all monitoring devices, while WebRTC handles video streaming in real time.

Since we’re speaking about security, WebRTC can also come in handy for smart biometric authentication. The essentials here are facial capture and voice recognition, further enhanced with data science for higher precision. The role of WebRTC is to stream data to the server for recognition, so as not to overload the front end.

Industrial IoT systems enhanced by WebRTC

Industrial enterprises tend to be conservative when it comes to innovation and new technology. However, smart factory and connected worker concepts are on the rise, and WebRTC technology adds valuable features such as remote support and collaboration via audio and video communication.

Manufacturing processes controlled with sensors and IoT devices become smarter with voice alerts or video sessions enabled via WebRTC. The combination of WebRTC and augmented reality opens new possibilities for teamwork and remote assistance applications.

In all of these use cases, WebRTC uses mandatory encryption for security purposes, and its JavaScript APIs run over HTTPS connections. As a result, developers working with WebRTC get an additional layer of protection for IoT devices.

Developing WebRTC modules from scratch can be expensive, which is why platform-as-a-service offerings are emerging on the market. Products like OpenTok and Twilio build on WebRTC, adding a cross-platform protocol for audio and video calls that can be conveniently integrated with existing software.

WebRTC satisfies one of the main necessities of IoT users — communication — so the use cases for this technology will keep expanding.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


December 3, 2018  12:42 PM

Commercial IoT environments require shared edge infrastructure

Justin Rigling Profile: Justin Rigling
Edge computing, Internet of Things, iot, IoT applications, IoT data, IoT design, IoT edge, IoT hardware, IoT infrastructure, iot security, IoT software, IoT strategy, Silo, siloes

As enterprises continue to implement IoT for customer experiences, smart building systems, automation and more, we see that the workflow from sensor to cloud often exists within its own individual silo. The result is that many businesses today are operating multiple siloed IoT apps — each with its own devices, sensors, IoT gateways and connectivity protocol.

This is a major cause for concern. A siloed approach results in excessive up-front costs, a higher total cost of ownership, system integration issues and increased risk associated with trying to maintain security practices across disparate siloed systems. Simply put, this status quo is a mess. If we don’t do something to fix it, this setup could significantly hold back the commercial IoT industry.

As an industry, we need a solution that can serve commercial IoT applications more efficiently, less expensively and more securely. And because commercial IoT features potentially thousands of devices running across a geographically diverse area, we must provide technologies that are more robust than consumer-grade IoT, less costly and less complex than industrial IoT, and more scalable than both.

The solution lies in stopping the construction of a silo before it is built: rethinking our project designs and working from the data down, rather than from the device up.

How a silo gets built — and how to stop it

This fix begins at the IoT project design level, where teams often start at the bottom of the IoT stack — with the devices — and work their way up to the cloud without considering other devices or “things” that might need to be tracked or monitored. Thus, a silo is born.

This is often the direct result of prototyping a single-use case system. A lone app is created with a specific goal and budget. Then “pilot-itis” ensues and each new app — for lighting, automation, security, tracking and so forth — is developed in its own silo, walled off from other apps and development teams.

This is not uncommon. Within commercial IoT teams, there are often several prototyping efforts taking place concurrently. Each effort has its own version of a bottom-up infrastructure in support of its unique use case. This bottom-up approach to IoT infrastructure makes coordination, information sharing and business integration difficult at best. Further, this silo mentality is an expensive and unsustainable proposition, presenting security risks and fragmented data.

It is at this design level that we must put a stop to the construction of these silos. We need to introduce a process that reduces complexity, eases integration and maximizes economies of scale. Shifting away from the device-up view and toward a data-down approach is imperative.

The data-down approach

Using the data-down approach to develop a commercial IoT infrastructure can break these silos down — or stop them before they are constructed.

During development, treat the IoT workflow itself as a design component that is just as important as the data. Instead of asking what devices and sensors are needed and how the data will be gathered, ask what kinds of data types and elements are needed and why. What will be done with the data? This new direction in thinking begins in the cloud and naturally trickles down to the edge devices.
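As a purely illustrative sketch, a data-down design writes down the data contract first and only then asks which devices can produce it. The TypeScript shapes below are hypothetical, not taken from any particular product.

```typescript
// Data-down design: define the data contract first; device selection follows.
// These interface names are illustrative assumptions, not a real product API.

interface OccupancyReading {
  zoneId: string;        // which room or zone the reading describes
  occupied: boolean;
  headcount?: number;    // optional: not every sensor class can report this
  observedAt: string;    // ISO 8601 timestamp
}

// Any device or gateway that satisfies the contract is an acceptable source,
// which keeps the infrastructure shared rather than siloed per device vendor.
interface OccupancySource {
  read(): Promise<OccupancyReading>;
}
```

Lighting, security and building management apps can then share the same reading instead of each fielding its own sensor.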

Once at the edge, the gateways can be considered. Ask what type of edge computing is needed to support the collection of that data and possibly future data requests. Gateways are the hub of commercial IoT activity, so consider the types of connectivity required for devices (downstream) and cloud integration (upstream). Make sure the gateway can handle the specific requirements, both now and in the future, for shared infrastructure and interoperability. Does the gateway:

  • Have flexible connectivity options such as the latest version of Bluetooth, Zigbee, Thread, Wi-Fi and so forth? Flexibility here is key so you are not playing catch-up with device changes down the line.
  • Use containers in a way that allows multiple apps to run simultaneously without interfering with neighboring apps? Containers also play an important role in securing the edge-app infrastructure by encrypting app storage and verifying software signatures from boot to application launch. End-to-end security is paramount in an IoT environment.
  • Monitor performance with intuitive tools and analytics? Does it orchestrate updates? Without these remote management abilities, support teams will struggle to effectively manage edge computing, connectivity, updates and security patches.
  • Allow for edge computing? When data can be shared between systems at the edge (because it is not siloed), applications can work together efficiently before the data is even pushed to the cloud. This allows for distributed workloads and faster response times using local command and control. For example, as someone enters a room, the occupancy system could drive the lighting and building management systems, and possibly the security system, all within the gateway based on local rules and without a round trip to the cloud (see the sketch after this list).
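
A minimal sketch of such a local rule, assuming hypothetical lighting and HVAC clients exposed by the gateway’s container runtime:

```typescript
// Sketch of a local gateway rule. The LightingClient and HvacClient interfaces
// are hypothetical stand-ins for whatever APIs the gateway's apps expose.
// The decision is made at the edge; no round trip to the cloud is needed.

interface OccupancyEvent {
  zoneId: string;
  occupied: boolean;
}

interface LightingClient {
  setOn(zoneId: string, on: boolean): Promise<void>;
}

interface HvacClient {
  setOccupied(zoneId: string, occupied: boolean): Promise<void>;
}

async function onOccupancyChange(
  event: OccupancyEvent,
  lighting: LightingClient,
  hvac: HvacClient
): Promise<void> {
  // Local rule: occupancy in a zone drives lighting and HVAC in that zone.
  await Promise.all([
    lighting.setOn(event.zoneId, event.occupied),
    hvac.setOccupied(event.zoneId, event.occupied),
  ]);
}
```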

A practical example

How does this approach affect the end user? Consider a smart commercial building — be it retail, offices or quick-service restaurants — where there are multiple systems in place to keep the building open and running on a daily basis. Security, lighting and sensing technology are just a few of the IoT applications to be considered. When built from the ground up, there will be a silo for each of those applications. This is a management nightmare when the time comes for upgrades or patches to those applications.

By designing the IoT infrastructure from the data down, complexity and cost decrease significantly, allowing for more efficient building support. This approach minimizes management chaos and eases the integration of IoT systems. In turn, enterprises can maximize business value while reducing time to market, risk and cost.

Ultimately, this single infrastructure framework will allow enterprises to install, run, manage and update multiple edge applications as needed. It will reduce the complexity, cost and risk associated with these actions as well.

The correct solution

Typically, it is best to seek out a manufacturing partner that treats the infrastructure as its core competency. A firm designing an IoT application should focus its time on its expertise — that is, the application itself. Meanwhile, a firm focusing on hardware and infrastructure is uniquely positioned to handle the complexity, risk and interoperability these networks demand.

Baking an IoT application into the right hardware and partnering with the right solutions provider will not only solve the silo issue for your business, but will also allow for rapid and streamlined deployment of your system.

When seeking out such a partner, ensure its product addresses the previously mentioned requirements. It must be interoperable in terms of connectivity, it must prioritize security both in its system architecture and its long-term support, and it must allow for monitoring of performance and orchestration of updates.

Designing IoT applications the right way, and utilizing the right hardware, is key to avoiding the trouble with silos. The future of commercial IoT demands we all take interoperability and security seriously while also allowing for a shared infrastructure that can run multiple edge applications. The technology exists to achieve this outcome. It’s up to us to build the appropriate ecosystems with that technology and truly empower commercial IoT.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


December 3, 2018  10:35 AM

The 5 C’s of making virtual healthcare the new normal

Jiang Li Profile: Jiang Li
#eHealth #Healthcare IOT #Wearables #wireless medical devices, Connected Health, connected healthcare, home telehealth, Internet of Things, iot, patient experience, telehealth, telehealth services, User experience, virtual healthcare, Wearables

Breaking a concept down into five C’s in order to further explain an idea, practice or innovation is nothing new. Over the course of the last century, we’ve seen scholars from all over the world support arguments across various professions with the same formula. As industries and innovations have advanced, so have their respective five C’s. The five C’s of business leadership, the five C’s of credit, the five C’s of marketing — all of these breakdowns have evolved as their industries have progressed and new innovations come along and create new norms.

One industry in which we are seeing a swift and incredible uptick in change is telemedicine powered by IoT. As IoT becomes more reliable and data becomes increasingly essential to patient health, the industry is seeing an increase in telehealth adoption and utilization. Medical providers, hospitals, payers and patients are all experiencing and appreciating the benefits. In fact, such a concept is no longer a luxury; it has quickly become a necessity. Global recognition of telemedicine proves that it is well on its way to becoming the new normal. Health professional mindsets and consumer behaviors are changing, and along with them, the healthcare industry’s five C’s are evolving.

Virtual healthcare goes far beyond the typical in-office experiences we are used to. Telehealth provides real-time communication, widened capabilities, both patient and provider-driven convenience, flexible channels and, overall, improved care.

1. Communication

In a world where consumers are constantly communicating with friends, family, brands and even strangers thanks to improved technology, it’s shocking to realize that the healthcare industry has stayed relatively old-fashioned in its communications. Telehealth is changing that in a big way. Where telemedicine used to mean video conferencing, medical-grade wearables and real-time data have supplemented remote communications between patient and doctor. This makes it possible to handle emergency and urgent care situations or, better yet, preventive situations more effectively and in a more timely manner.

2. Capabilities

Smaller and rural hospitals are often the last to receive access to new technology and treatments. This information deficit can lead to poor treatments and outcomes that could have been avoided if healthcare professionals had access to the right information at the right time. Virtual healthcare is making it possible for non-metro-area hospitals to broaden and deepen their capabilities. It gives staff access to specialists who might be hours away, or even located in another country. These virtual consults are becoming the most effective way to give smaller, remote hospitals access to an extended specialized staff, greatly improving patient care and ensuring that patients are “seen” by specialists, regardless of location.

3. Convenience

While this C may seem self-explanatory — and in many ways it is — the convenience level provided by telemedicine has effects that bubble far beyond making it easier for a patient to be treated from the comfort of their own home. While, yes, that certainly is a perk provided by virtual healthcare, the convenience factor:

  • Helps cut down on driving time for both patients and clinicians;
  • Clears out waiting rooms, lessening the possibility of a patient contracting another patient’s ailment while waiting for an appointment; and
  • Allows healthcare professionals to treat more patients each day in a timely manner.

4. Channels

The ability of a patient to access healthcare from any device, anywhere, at any time is key to staving off further ailments. Telehealth makes routine rechecks much easier for both the patient and the physician, and helps ensure the patient’s ongoing adherence to the prescribed treatment plan. Virtual check-ins, progress reports and improved general communication are all the result of being able to conduct impactful healthcare conversations whenever and wherever. This keeps patients and physicians engaged in their relationship.

5. Care

Enhanced patient engagement, improved overall satisfaction and better outcomes may be the most impactful aspects of virtual healthcare. In the long run, telemedicine will continue to positively impact the timing and delivery of patient care. Providers, specialists and patients will all see tangible benefits. As technology improves, virtual healthcare will become more widely used than traditional in-office treatment.

The numbers already support the growth of telemedicine — Zion Market Research expects the global telehealth market to be worth $38 billion by 2022. But the conceptual improvements are where we are already seeing incredible growth. Telemedicine stands ready to meet the healthcare challenges of a quickly aging senior population, the underserved and those with chronic diseases. Virtual healthcare ensures that all necessary parties on the healthcare spectrum are connected, communicating and effectively collaborating across the evolving continuum.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

