IoT Agenda


June 20, 2019  10:57 AM

Security concerns remain a drag on the internet of things

Syed Ali
built-in security, embedded security, Internet of Things, iot, IoT adoption, IoT cybersecurity, IoT devices, iot security, securing IoT

Concerns about security continue to hinder the adoption of IoT devices. Enterprise customers indeed are interested in buying more IoT devices, but only if vendors can provide better security for them.

Bain & Company conducted research into the attitudes of enterprise buyers about cybersecurity and the internet of things, and we found that executives would buy, on average, 70% more IoT devices for their systems if cybersecurity concerns were addressed, compared with what they would buy if the status quo remains. Additionally, 93% of the executives we surveyed said they would pay an average of 22% more for devices with better security. Bain estimates that improving security for these devices could grow the IoT cybersecurity market by $9 billion to $11 billion in 2020.

For IoT device vendors — companies that make IoT devices as well as those that provide related solutions — the message is clear: Improve security to gain a competitive edge and expand your market.

Most executives we surveyed (60%) said they are very concerned about the risks IoT devices pose to their companies — not surprising, given the damage that an IoT security breach can cause to operations, revenue and safety. When poorly protected, IoT devices can allow access to enterprise systems, resulting in large data breaches. For example, in January 2018, a Mirai malware variant called Okiru targeted ARC processors embedded in billions of IoT products.

Executives who manage security say they want technologies that are highly effective, easy to integrate and flexible to deploy. Companies take a range of approaches to meet their security needs based on their capabilities and the availability of marketplace mechanisms from vendors. Only about a third of IoT cybersecurity systems used today are from IoT device vendors, indicating that vendors either are not offering holistic, high-quality technologies that meet consumer needs or are not promoting them well enough. Our research found that companies with the most advanced cybersecurity capabilities rely more on internally developed security mechanisms, not only because they may have more complex needs, but also because they are more likely to have the resources to develop their own technologies. As might be expected, companies with ad-hoc security capabilities have the most gaps across all IoT layers that we tested, including access interface, applications, data, hardware and operating system, network and operations.

We also looked at how companies deploy technologies by layer of security, and found ample opportunity for IoT device vendors at every layer of the stack. Our survey shows that the access interface layer has the greatest level of protection, whether internally developed or provided by a manufacturer or third party. Other layers of the stack are protected by more internal systems — or, in some cases, none at all.

IoT device vendors and ecosystem players that move quickly to improve the security around IoT devices are likely to reap rewards, both from their ability to earn a premium and from an expanded market.

First, manufacturers need to understand how customers are using their devices. Refreshing their understanding of customer use cases every 12 to 18 months will allow them to stay on top of evolving security requirements and identify unmet needs. Ascertaining the average cybersecurity maturity level of their customers will help manufacturers invest in the appropriate out-of-the-box and add-on systems.

Second, manufacturers should provide cybersecurity capabilities on the device and, when possible, partner with trusted cybersecurity vendors to offer additional systems. Engineering teams should embed secure development practices into the software and hardware components of the device, and provide inherent technologies for the access interface, apps, data and device layers.

Third, manufacturers need to meet quality assurance thresholds and be able to certify that their IoT devices are free from known vulnerabilities. This would mitigate a major pain point for customers, who sometimes install new devices without realizing they contain vulnerabilities. Deploying a more methodical process to identify and remove vulnerabilities across layers, or engaging third-party vulnerability scanning and penetration testing firms, can help manufacturers meet this bar.

Finally, manufacturers can fulfill their obligations during the warranty period by continuously testing for new vulnerabilities and by providing software and firmware updates, as well as feature and functionality upgrades for out-of-the-box and aftermarket systems. Delivering updates to firmware, operating systems and applications in response to newly discovered security vulnerabilities should remain a top priority throughout the warranty period.

These four steps are a start, though by no means all it will take to begin addressing the security concerns that are holding back IoT device adoption. While growth in IoT markets seems destined to continue its inexorable march, many enterprise customers will continue to move cautiously until they can gain some reasonable assurance of security — not only of their data, but also of the operations that increasingly rely on devices, sensors and IoT.

This article was co-written by Ann Bosche, a partner with Bain’s Global Technology practice, and Frank Ford, a partner with Bain’s IT practice. Ann is based in San Francisco and Frank is based in London.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

June 19, 2019  3:44 PM

Secure healthcare IoT devices with network traffic analytics

Gavin Hill
#eHealth #Healthcare IOT #Wearables #wireless medical devices, connected healthcare, healthcare IoT, Internet of Things, IoHT, IoMT, iot, IoT devices, iot security, medical IoT, network traffic analytics

A mere buzzword a few years ago, IoT has come to define modern technology: digital, smart, connected. From watches to vehicles and from homes to entire cities, our world is becoming smarter and more connected by the day.

However, IoT’s promise of a more convenient, more efficient future comes with drawbacks. Smart devices don’t always live up to their name. While they are smart at doing what they were designed to do, most are lacking when it comes to peripheral areas — security in particular.

In recent years, the IoT ecosystem has become a hot target for bad actors, affecting everyone from consumers to critical infrastructure. The healthcare sector in particular has become a lucrative target, not only because it’s one of the most IoT-centric industries, but also because it handles the most sensitive data: personally identifiable information and health data. Protecting medical IoT gear is tough, because embedded devices don’t support individual security agents. So how, then, can we protect medical IoT products?

Lack of security puts lives at risk

Frost & Sullivan estimated IoMT devices will number between 20 and 30 billion by 2020, and will be used for anything from remote patient care to hospital operations to interoperability and data management.

These devices have embedded operating systems, which means they usually don’t allow third-party software into the OS or, even worse, can’t be patched. As IoMT devices proliferate beyond hospital grounds, connected medical equipment used in homes and even in human bodies has become vulnerable to attacks.

Medical IoT security incidents are on the rise, according to the 2018 HIMSS Cybersecurity Survey. A study by Netherlands-based Irdeto goes even further, showing how organizations in transportation, manufacturing and healthcare have suffered substantial losses due to IoT-related incidents. According to the report, such incidents cost on average more than $330,000. Of the 700 enterprises surveyed across China, Germany, Japan, UK and the U.S., 80% admitted to suffering an IoT-related cyberattack in the past year. And almost half of respondents said they need additional expertise within the organization to address all aspects of cybersecurity. More worrying is the fact that 82% of organizations that manufacture IoT devices are themselves concerned that what they put on the market is not adequately secured against potential cyberattacks.

The U.S. Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency recently issued an urgent notice that researchers found a potentially deadly flaw in cardio defibrillators equipped with wireless functions.

Patient data is a hot commodity

The dangers don’t stop at the hardware level. IoMT devices frequently access healthcare networks, expanding the attack surface for criminals to steal electronic medical records and other patient data. Cybercriminals then monetize that data, which is especially lucrative in fraud and extortion campaigns.

As IoMT devices continue to proliferate, and with them the potential for attacks and network breaches, healthcare organizations must be prepared to monitor and detect threats across thousands of endpoints. That adds the challenge of maintaining a strong security posture while meeting stringent compliance requirements.

Catching attackers in transit

The inability to install security reporting agents on individual IoT devices has brought to light a serious issue: Attacks are typically detected when it’s too late. This challenge has given birth to a new category of security mechanisms expressly designed for individual and networked IoT devices. These systems use network traffic analytics (NTA), a technology that lets IT admins detect anomalous network traffic behavior they would normally have missed without the need to install an agent.

The technology is well suited to healthcare environments where IT staff is limited and the specialized skill set of a cybersecurity analyst may not be available in-house.

The value of NTA is twofold. First, it identifies and reports what looks like anomalous network traffic without any agents installed, by non-intrusively taking a copy of the network traffic for analysis. Second, it focuses on network traffic metadata without the need for deep packet inspection, thus providing insight into all traffic, whether or not it’s encrypted. This also helps NTA meet the compliance requirements of GDPR, HIPAA and the like, while allowing logs to be stored for future forensic analysis.
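
To make the idea concrete, here is a minimal sketch of metadata-only anomaly detection in Python. It works on synthetic flow records rather than any particular NTA product's API, and the device names, fields and three-sigma threshold are illustrative assumptions.

```python
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class FlowRecord:
    """Metadata only: no payloads, so encrypted traffic is still covered."""
    device_id: str
    dest_port: int
    bytes_out: int

def baseline_stats(history):
    """Mean and standard deviation of outbound bytes per flow for one device."""
    volumes = [f.bytes_out for f in history]
    return mean(volumes), stdev(volumes)

def is_anomalous(flow, history, threshold=3.0):
    """Flag a flow whose outbound volume sits far outside the device's baseline."""
    mu, sigma = baseline_stats(history)
    if sigma == 0:
        return flow.bytes_out != mu
    return abs(flow.bytes_out - mu) / sigma > threshold

# Example: an infusion pump that normally sends small telemetry bursts.
history = [FlowRecord("pump-01", 443, b) for b in (1200, 1100, 1350, 1250, 1180)]
suspect = FlowRecord("pump-01", 443, 250_000)   # sudden large upload
print(is_anomalous(suspect, history))           # True -> raise an alert for triage
```

A production NTA tool layers far richer features and trained models on top of this, but the principle is the same: baseline normal behavior per device, then flag deviations without touching payloads.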

Perhaps most importantly, NTA automates the process of security incident triage to accelerate investigations and reduce the number of trivial alerts, addressing the ongoing issues associated with alert fatigue that so many IT personnel face. It uses machine learning models trained in complex scenarios to correlate thousands of events and report anomalous traffic with high accuracy. Additionally, NTA provides detailed explanations for the incident severity score and recommends remedial actions to speed up incident response.

Whether you’re a small medical practice or a state-level healthcare institution, an NTA-based security tool dramatically reduces the risk of exposure to your IT infrastructure, sensitive medical equipment, patient data and even patient lives from the increasingly sophisticated online threats.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


June 19, 2019  1:08 PM

Getting started in enterprise IoT: Six tricks for solving the puzzle

Lou Lutostanski
Enterprise IoT, Internet of Things, iot, IoT management, IoT partners, IoT pilot, IoT project, IoT projects, IoT strategy, KPI, ROI

IoT is the Rubik’s Cube of business technology. Every company’s playing with it, a few are starting to nail it and many are stuck rotating through possible solutions that aren’t really panning out.

As with the confounding cube, learning the experts’ secret tricks will increase your chances of succeeding time after time.

The foundation of solving the IoT puzzle is this: Both the starting point and ultimate objective are the use case. A use case is a list of actions spelling out how a person and system(s) accomplish a particular goal.

At the outset of any IoT assignment, we invest ample time working through the use case with clients using this tripartite focus: people, systems and goals. Then we dig into our bag of tricks:

First, convene all the players. One of the biggest mistakes businesses make when they launch IoT initiatives is focusing too little on the people part of the use case. Every initiative should start with people. When you’ve identified the broad outlines of a use case — say, your company wants to better manage its truck fleet — get everyone involved in the room and ask them to share their dream scenarios for connectivity, insights and actions.

The fleet manager will want to know where the trucks are at a moment’s notice. The CEO will want to be able to tell how much money the fleet is making or losing. Maintenance will want to know what trucks need to be fixed when and where the parts are. Customers will want to know where their delivery is and when it will arrive. Drivers will want to know their schedule, performance versus their peers and how they’re faring on incentives. The CFO will want to know about extra capacity in the fleet. Customer service will want to know about breakdowns and ETAs for getting back on the road. The list gets long, which is a good thing: Potential uses equal potential business gains. You can prioritize later.

[Figure omitted. Source: Avnet]

Second, create personas. The brainstorm is just the start. As you refine your IoT plan with your core team, create personas — fictional identities — for all the stakeholders, including those outside your organization, who would conceivably touch the things, data and insights in your IoT system, and who will take or feel the actions. Visually map personas to every use case you put on the drawing board. In the corporate truck fleet, that group would probably be the people mentioned above plus truck manufacturers, insurers, lawyers and logistics experts, for starters.

Third, be sure to span departments. Let’s say the truck fleet is owned by a city, and its fire department needs a big vehicle to block Elm Avenue commuter traffic during a fire. The best-placed available truck with an on-shift driver might be in public works, sanitation or parks and rec, or it might even be a school bus. Make sure your IoT use case spans all the departments, because an asset is an asset no matter whose name is on it.

Fourth, think of your IoT endeavors in terms of a platform first. You may not see it yet, but your business will have as many use cases as it has roles and functions inside the organization. That means you’re better off with an approach that is inclusive and flexible instead of narrow and tech-specific. Ideally, the goal is to create platforms for your things that can consume data in any form and share it via any device. Too often, companies handcuff themselves by building a project around one use case and one proprietary technology. The organization can’t extend, scale, modify or sustain the app, and not surprisingly, there’s negligible return on their investment.

Fifth, keep your eye on the KPI and the ROI. The ideal first IoT project is one with a high potential return on investment, but one that's simple enough to get done. Once you ace a simple project, you can go bolder and more complex. When you do choose a concept, apply numbers to everything, including expected cost, savings, revenue, operating efficiency, customer experience value and so forth. Every use case implies a set of quantifiable key performance indicators that will yield actionable insights. It’s critical to identify those KPIs.
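
As a quick illustration of applying numbers to a candidate use case, here is a back-of-the-envelope ROI calculation for a hypothetical fleet pilot; every figure below is an assumption standing in for the estimates your own team would supply.

```python
# Hypothetical first-project numbers for a truck-fleet pilot (illustrative only).
upfront_cost = 120_000      # sensors, connectivity, integration
annual_cost = 30_000        # data plans, platform fees, support
annual_savings = 95_000     # fuel, maintenance, fewer missed deliveries

def simple_roi(years):
    """Cumulative return on investment after a given number of years."""
    gain = annual_savings * years
    cost = upfront_cost + annual_cost * years
    return (gain - cost) / cost

for years in (1, 2, 3):
    print(f"Year {years}: ROI = {simple_roi(years):.0%}")
```

Even a rough model like this shows which pilot pays back first and which KPIs you must track to prove it.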

[Figure omitted. Source: Avnet]

Finally, plow your insights back into the business. In your early planning stages, your concept must show the data you capture being re-ingested by people inside the organization, either to drive the actions of a user or do something on the back end, for example, initiate a repair ticket. If you’re capturing data and not using it to make your people smarter and better at what they do, then it’s not a viable IoT use case.

These tricks — which admittedly smack of common sense — will help anyone more quickly solve the puzzle of IoT, whether you’re managing a smart truck fleet, factory, store chain, health care operation, workforce, energy grid or city.

And soon, as with the multicolored cube that used to be so perplexing, everyone will be getting it right.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


June 19, 2019  11:21 AM

Capitalizing on the world’s most valuable intangible asset: Data

Joy King
Data governance, Data lake, Data Management, Data monetization, data swamp, Internet of Things, iot, IoT data, IoT monetization, unicorn

All entrepreneurs dream of the moment when their startup becomes a unicorn. Every startup craves the unexplainable valuation that is gifted to companies that burn through cash but continue to attract and retain investors. Every employee imagines the moment their company rings the bell at the New York Stock Exchange and their equity stake triples in value in a single day.

The climb to unicorn valuation is shrouded in mystery — and some would say covered in clouds. But one component of unicorn valuation that should be the obsession of every leadership team is the contribution from one of the financial world’s most underreported intangible assets.

The growth of intangible assets on corporate balance sheets

Intangible assets are those that are nonphysical, but identifiable. Examples include patents, copyrights, licensing agreements and even website domains. Intangible assets, with the exception of goodwill, can be bought and sold independently of core business revenue products. Historically, tangible assets, including cash, inventory, investments and even real estate, were the majority of the components of any company’s balance sheet. But I recently discovered a Forbes article that made a dramatic claim: “Intangibles have grown from filling 20% of corporate balance sheets to 80%.” This is a trend that will continue because most companies do not yet include what should be, and what already is for many, one of the most highly valued intangible assets: data.

Warning! Data swamps ahead

In every industry, the value of data is skyrocketing, and it will soon become a common practice to include the value of this intangible asset on every company’s balance sheet. The path to unicorn valuation gets a lot shorter if this invaluable intangible asset is monetized correctly. Collecting massive volumes of data spanning customer information, operations and supply chain, product development and delivery, and financial and industry trends is a key first step. But when that first step leads to a data swamp, that asset becomes a liability, especially in the face of data privacy and regulatory expectations. Architecting a unified and executable strategy — that word executable is very important when companies get distracted by “free” open source software — is the fork in the road between a highly valued intangible asset and high-risk liability. Analyzing the data without limitations on volume, without compromises on speed, and without well-built bridges between data repositories is not easy, but it’s mandatory.

Short-sighted data brokers miss out on major windfalls

Given the challenges of an executable unified analytics strategy, the first thought that many enterprises have when they seek to monetize their data is to sell it. Data brokering is a relatively straightforward way to add a revenue stream and monetize data. But truly data-savvy companies know that selling their most valuable asset is short-term thinking at best. Consider a company like Netflix, which knows so much about what each of us watches, our interests, the timing of our watching, the keywords and genres we search for and so much more. That data is obviously valuable to many other companies, but Netflix knows that using that data to better engage us all as customers, including algorithmic-driven analytics to create personalized recommendations, as well as providing guidance for its own content creation and development decisions, is the most effective way to capitalize on its data.

New revenue-generating services for auto manufacturers

Automobile manufacturers are following the same path. Tesla is leading the way, but companies like GM, Chevrolet, Ford and BMW are not far behind. The volume of data that these manufacturers are collecting today spans every component of their vehicles and every aspect of how and where and when those vehicles are used by customers. This data is easily classified as one of the most highly valued intangible assets when you consider the new services that could — and will! — be created based on this highly personalized data. Imagine a service offered by your automobile manufacturer that provides you, not your insurance provider, with data that shows your safe driving actions, including following speed limits, blinker usage and no heavy braking due to tailgating. This service is now a benefit to customers, not a threat, and it is also a net-new revenue opportunity for the manufacturer.

Increasing shareholder value by monetizing and protecting data

The balance sheet is a very simple concept. Assets minus liabilities equals shareholder value. The bigger the total asset value is, the better — unless the liability is growing as well. Big data can make a big difference on both sides of the balance sheet, but the true industry disruptors know that building the intangible asset valuation of their data puts money in their pockets and drives the greatest market differentiation.

Everyone has patents, licenses, goodwill and the standard intangible assets. But not everyone is taking action to build up what is soon to be the most valuable intangible asset: data. When I consider my own personal investment strategy, I carefully consider the indications that the company whose shares I am evaluating is focusing on both monetizing and protecting their most valuable intangible asset. Smart investors like Warren Buffett, Jeff Bezos and Peter Thiel did exactly that over the last couple of decades and their strategy clearly paid off — and paid out.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


June 18, 2019  2:00 PM

Is your organization in IoT for the short or long haul?

Ken Figueredo
Enterprise IoT, horizontal platform, Internet of Things, iot, IoT platform, IoT strategy, IoT verticals, vertical platform

A senior executive from Ericsson recently started a discussion about horizontal IoT platforms. Horizontal IoT platforms are general-purpose platforms: they include reusable functions to support applications in many different industry verticals. Horizontal platforms look appealing in concept. However, a common criticism is that they are a distraction from immediate needs. They might also compromise on the specific needs of vertical use cases, implying a suboptimal solution.

The advice to solution providers from Ericsson’s Rob Tiffany was to start solving specific problems related to connected intelligence that are customer pain points. Customer pain points are an appealing way to tackle IoT applications: They capture management’s attention. They provide organizational focus. Their boundaries are cleaner, making for a straightforward business case. And, successful implementation yields immediate and visible results.

But should that be the end of the debate? No, not in a market where industrial organizations are still learning about IoT.

Keep in mind that IoT technologies cover topics that are outside the core competencies of many industrial organizations. Moreover, while near-term solutions are good, this knowledge deficit means that longer-term considerations are not even on the radar. Think of the typical IoT pilot project or the IoT solution team working on a well-defined use case. How many of them are planning ahead for second-generation requirements? How many are thinking about the need to scale up and support multiple applications? What about interoperability for cross-silo applications or opening solution stacks to partners in an extended value chain? And how about secondary uses and business models for IoT data?

Strategy is more important than IoT technology

A growing school of thought argues that organizations need to take a strategic approach to their IoT deployments, one that emphasizes horizontal capabilities. Rami Avidan, now of Deutsche Telekom’s T-Systems business unit, talked about strategy rather than technology as the critical challenge of enterprise IoT adoption. He explained the choices that businesses face. An organization will have fast results if it is selling a service that gains uptake rapidly. Conversely, if an organization is digitizing a factory, that’s not a quick fix. The work involved in deploying sensors, linking them, optimizing the data and changing the behavior of machines is a long-term process.

He also pointed out that partners, ecosystems and standardization are three critical elements in delivering viable solutions. Partners are essential because there are so many elements in delivering an IoT application; no single organization has mastery over all of the solution elements. Ecosystems represent environments where partners have laid the groundwork to collaborate. This eliminates many of the technical pitfalls. Ecosystems also provide workable commercial models and solution templates. Standardization addresses longer-term benefits by providing clear rules of engagement on technologies, notably in the area of security.

IoT standardization

There is broader recognition of the value of IoT standardization. Here is a recent viewpoint from Enrico Scarrone, who works for Telecom Italia Mobile and is the Steering Committee Chair of the oneM2M standardization initiative. His observations describe the impact of fragmentation and integration costs on the viability of IoT solutions. This may not matter to some companies — they will focus on commercial imperatives and use a quick, off-the-shelf solution that meets their application needs. Other companies will decide that product sustainability is more important. In this case, they will opt for a standards-based system. As Scarrone observed, there is no default decision that works for all companies and all product timeframes.

The consequence of purely vertical thinking, however, is to build downstream switching or systems integration costs into IoT applications. With horizontal thinking, some of those costs are brought forward in time. That probably implies a financial and time-to-market penalty. On the flip side, it encourages designers, operators and managers to consider sustainability issues. This covers the potential to extend their IoT applications and to look for expansion opportunities arising out of cross-silo possibilities.

For companies adopting IoT concepts into their business operations, the question is whether they are in it for the short or the long haul.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


June 18, 2019  10:16 AM

True IoT security in the energy industry requires continuous compliance

Reggie Best
critical infrastructure, cyber-situational awareness, energy industry, energy providers, Internet of Things, iot, IoT compliance, iot security, IT/OT convergence, NERC-CIP

In April, I wrote about the IoT-specific challenges in the government vertical. Today, I’d like to put a spotlight on the energy industry.

In the past, we didn’t really have to worry about attacks targeting our electric grids or utility plants because they weren’t connected to the internet and garden-variety cybercriminals had no way to infiltrate operations technology (OT) environments. OT networks and the devices deployed on them — industrial control systems (ICS), supervisory control and data acquisition (SCADA) systems, programmable controllers, etc. — were isolated from IT networks and, thus, the associated array of cyber-risks. Today, however, the energy industry is operating in a very different environment.

The Industry 4.0 movement, combined with the increased adoption of IP-enabled infrastructure to overcome challenges facing the North American electric power grid, has resulted in the convergence of OT and IT networks. No matter the form — transmission and distribution, electric or water — the energy sector relies on a vast supply chain of IT and OT from third-party providers. ICS, SCADA, controllers and other OT devices are now part of the network ecosystem. This fact alone can undermine the security posture of any environment.

While digital technologies and other network modernization initiatives can have a positive impact on the business side of the house — increased efficiency, enhanced product quality, better decision-making and an improved bottom line — they also introduce new security and compliance risks.

For starters, the convergence of IT and OT networks has dramatically expanded the attack surfaces of energy organizations; each connected device now represents a potential entry point for cybercriminals. There’s also the challenge of securing OT devices and environments that weren’t designed to support conventional security technology — because, as mentioned earlier, cybersecurity never used to be an issue. And then there’s compliance with the North American Electric Reliability Corporation (NERC) Critical Infrastructure Protection (CIP) requirements.

In response to the 1965 blackout in which 30 million customers were impacted across the northeastern U.S. and southeastern portion of Ontario, NERC was formed to promote the reliability and adequacy of bulk power transmission in the electric utility systems of North America. NERC’s CIP plan includes standards and requirements to ensure the bulk electric system is protected from unwanted and destructive effects caused by cyberterrorism and other cyberattacks that could lead to instability or power failure.

Achieving continuous NERC-CIP compliance

The U.S. Department of Energy released an August 2017 report that found seven “capability gaps” in the power sector’s ability to respond to a cyberattack on the electric grid. One of the seven identified gaps was “cyber-situational awareness and incident impact analysis.” This is still an issue today, thanks to increased network complexity and the proliferation of connected OT endpoints.

OT and IT environments are now interconnected, and devices and systems must interoperate for full functionality and value creation. This means that an energy manufacturer may be more inclined to prioritize devices that enable a smart grid, for example, based on interoperability rather than their security profile. So, the process to achieve continuous compliance starts with conducting a gap analysis of compliance management to determine which IT and OT assets the organization needs to protect, and the likely impacts that could result from compromised compliance.

In short, successful security and compliance programs require complete cyber-situational awareness — which is why it’s not surprising that this was raised as a point of concern in the DOE’s report. Achieving cyber-situational awareness can seem like an impossible task with today’s dynamic, hybrid infrastructures. But, the time and energy — no pun intended — spent on making cyber-situational awareness a reality is well worth the effort. Especially when you consider penalties for noncompliance with NERC-CIP requirements can include fines up to $1 million per day.

Avoiding these fines through continuous NERC-CIP compliance requires four key capabilities (a minimal monitoring sketch follows the list):

  1. Comprehensive visibility into all endpoints, assets and connections across all environments — including on-premises, virtual, cloud, OT, ICS and so forth — for an accurate understanding of the state of network infrastructure.
  2. Continuous monitoring of security controls to pinpoint baseline deviations and to ensure that system changes do not significantly weaken security, that security plans remain effective following changes and that security controls continue to perform as intended.
  3. Identification of critical and sensitive infrastructure components.
  4. Detection of events or configurations linked to adversarial or anomalous conditions for faster threat detection and incident response.
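
As a concrete illustration of the second capability, here is a minimal sketch of baseline-deviation monitoring in Python. The asset names, fields and approved values are hypothetical; a real NERC-CIP program would feed this from dedicated discovery and configuration-management tooling.

```python
# Hypothetical approved baseline for a handful of assets (illustrative only).
baseline = {
    "scada-hmi-01": {"firmware": "4.2.1", "open_ports": {443},       "zone": "OT"},
    "historian-01": {"firmware": "9.0.3", "open_ports": {443, 1433}, "zone": "IT-DMZ"},
}

def deviations(observed):
    """Compare the observed state of each asset against its approved baseline."""
    findings = []
    for asset, expected in baseline.items():
        actual = observed.get(asset)
        if actual is None:
            findings.append((asset, "missing from latest scan"))
            continue
        for field, value in expected.items():
            if actual.get(field) != value:
                findings.append((asset, f"{field}: expected {value}, saw {actual.get(field)}"))
    return findings

observed = {
    "scada-hmi-01": {"firmware": "4.2.1", "open_ports": {443, 23}, "zone": "OT"},  # telnet opened
}
for asset, finding in deviations(observed):
    print(asset, "->", finding)
```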

These capabilities give energy companies the agility they need to keep pace with changing NERC-CIP requirements, not to mention the ever-changing threat landscape.

Consistency is key

The benefit of comprehensive cyber-situational awareness can be summarized in one word: consistency. Energy organizations can benefit from consistent real-time network visibility, consistent change monitoring, consistent intelligence on how changes affect security and compliance, and consistent policy controls across environments.

Consistency of this nature helps energy companies maintain continuous security and NERC-CIP compliance, regardless of how endpoints move and environments change. And with these results, energy organizations can move beyond operating with a metaphorical and tactical “keep the lights on” approach to security, to a truly strategic security approach that actually does keep the lights on for millions of customers!

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


June 17, 2019  2:03 PM

IoT, cloud, security and the IT professional: Transforming traditional roles

Andreas Pettersson
Cloud Computing, cybersecurity, Internet of Things, iot, IoT cloud, IoT communications, IoT professionals, iot security, IT teams, Physical security

Today’s IT professionals must have it all: knowledge of scripting, networks, application development, artificial intelligence, architecture and more. But it wasn’t always this way, as we’re seeing the role of IT shift significantly with the proliferation of the latest technological advancements, such as cloud computing, the integration of IoT, incorporation of physical security and an increased emphasis on cybersecurity and privacy.

Incorporating physical security

Though once seen as a siloed department within the enterprise, the IT team has now become a crucial component and resource when it comes to the implementation of physical security infrastructure. The latest technology developments in this market, including networked devices and remote monitoring, can also bring complex technical challenges that are best mitigated with the help of IT specialists. As such, the role of the IT pro must include a working knowledge of security best practices and how technology is used to protect people and assets.

IoT

IT comes into consideration when you think about the various devices and sensors that must be integrated through an organization’s single, unified platform. A wide array of knowledge is required of today’s IT professionals as they work to achieve this in a secure fashion. The role of IT has shifted to where this level of collaboration now focuses more on data integration, with the goal of extracting the most relevant information for intelligent decision-making. IT teams help identify the right methods for achieving this convergence in an efficient and cost-effective manner.

Security and cybersecurity

Because today’s networked services are about more than just physical devices, they’ve become susceptible to cybervulnerabilities that can be best understood by the IT team. From the beginning to the end of implementation, IT personnel are tasked with identifying any holes or gaps that may leave a system open to vulnerabilities. IT leaders must ensure that the proper techniques are followed when it comes to strengthening cybersecurity, such as updating passwords, encrypting data to the cloud, automatically installing software patches and more.

Increased communication

Collaborating across departments requires a new level of communication for organizations, signaling a shift in the soft skills required. More specifically, communication skills are paramount for anyone in IT, as these individuals are required to work not only across many teams and groups, but also with outside departments and contacts to integrate systems across an organization. Collaboration between physical security and IT, alongside additional outside contractors, requires strong leadership and communication skills.

Cloud computing

The introduction of the cloud into today’s modern organization has encouraged IT professionals to evolve their knowledge and skill set to incorporate cloud-based functionality. IT professionals benefit from cloud-based services, as they reduce overhead costs and the amount of physical hardware they would otherwise have to maintain, which also minimizes their security footprint. The IT department must be well-versed in the efficient implementation and use of cloud-based technologies to ensure an organization’s survival in the face of ever-shrinking margins.

The inherent value of an IT team is that these professionals are well-versed in the ins and outs of their companies’ current computer and technology setup. The changing nature of their role within an organization requires them to understand emerging technology and tailor it to their duties and goals while remaining up to date in the latest innovations. As these advancements change the needs of organizations, so too should the knowledge base that IT professionals possess and implement across departments.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


June 17, 2019  11:35 AM

Unraveling the predictive maintenance conundrum in the IoT era

Rajesh Devnani
ai, augmented reality, Enterprise IoT, Internet of Things, iot, IoT analytics, IoT data, IoT strategy, Predictive maintenance

One of the pioneering applications fueling the exponential growth of IoT in the digital era has been predictive maintenance, the poster child of IoT use cases. Estimates from analysts vary widely, but the overall predictive maintenance market is expected to grow at a faster clip than nearly any other IoT use case. With global assets under operation amounting to roughly 2.5 times the world GDP, the economic impact of effective maintenance practices can be truly transformative and extend to trillions of dollars.

Maintenance was traditionally relegated to the status of a support function within the realm of manufacturing and treated as a pure-play cost center. Predictive maintenance flips this on its head, elevating the maintenance function from a cost-centric role to a prime strategic role within the organization.

With the potential that predictive maintenance promises, there is excessive hype surrounding the topic and plenty of content peddling the power of algorithms as a magic wand for reaching maintenance nirvana.

The reality, unfortunately, is more nuanced. Predictive maintenance is not a pure-play technology gig. It can be a distinct competitive differentiator — whether you are competing on cost, customer service or innovation — and can bring immense business value to your organization. It’s imperative, however, to peer under the hood to gain perspective on the practical application of predictive maintenance and resolve the conundrum:

  1. It’s all about the metrics.
  2. Busting the lone ranger attitude.
  3. There is no “one size fits all.”
  4. You may have a hammer, but everything is not a nail.
  5. The great horizontal versus vertical divide.
  6. It’s not just in the asset.
  7. Predictive is not the end-all.
  8. Measurement is key.
  9. Deployment and sustainability is the end game.
  10. The future beckons, so embrace it.

1. It’s all about the metrics
The risk that the technology works just fine but fails to deliver the goods is quite high with a predictive maintenance application. It’s easy to get enamored with a shiny new toy and dive straight in. Instead, begin with a clear understanding of the business goals and, more specifically, of the KPIs you wish to impact and by how much: for example, overall equipment effectiveness (OEE), mean time to repair (MTTR), mean time between failures (MTBF), on time in full, maintenance effectiveness and so forth. An overall maintenance function audit is a wise strategy in this startup phase.
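
For illustration, here is how a few of those KPIs are typically computed, with made-up figures for a single asset over one month; the performance and quality factors in the OEE calculation are simplified assumptions.

```python
# Illustrative maintenance KPI calculations (example figures, one asset, one month).
uptime_hours = 648          # time the asset actually ran
downtime_hours = 72         # total time lost to failures
failures = 6

mtbf = uptime_hours / failures            # mean time between failures
mttr = downtime_hours / failures          # mean time to repair
availability = uptime_hours / (uptime_hours + downtime_hours)

# Overall equipment effectiveness = availability x performance x quality.
performance = 0.92          # actual vs. ideal cycle time
quality = 0.97              # good units / total units
oee = availability * performance * quality

print(f"MTBF {mtbf:.0f} h, MTTR {mttr:.0f} h, "
      f"availability {availability:.1%}, OEE {oee:.1%}")
```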

2. Busting the Lone Ranger attitude
Functional maintenance specialists have a tendency to be enamored with the fruits of their toil and to believe that they alone can deliver the results. This induces a siloed mentality that is counterproductive. It is imperative to bust the silos and apply integrated thinking when considering a predictive maintenance initiative. Predictive maintenance is a team sport. Maintenance should work as an interdependent function and balance considerations related to other functions, including production, inventory, human capital and customer service, to optimize overall performance.

3. There is no “one size fits all”
Do not paint all your assets and asset classes with the same brush. Before diving deep, it is imperative to classify your assets into distinct classes, each with its own maintenance strategy suited to maximize business value. Some situations need reliability-centered maintenance for critical high-value assets, but a reactive approach may be sufficient for some simple noncritical assets. The segmentation should enable classifying assets into distinct classes, each with its own strategy but ultimately delivering the best business outcomes.

4. You may have a hammer but everything is not a nail
It is important to have an in-depth understanding of the physics of each asset and its potential failure modes and root causes. Vibration monitoring may be a great technique, but it may not be the right strategy for what you are trying to monitor, such as loose electrical connections. An effective understanding of failure modes, and of which characteristics to measure in order to proactively preempt them, is essential. Knowing where to apply which sensing technique — thermal imaging, ultrasound, infrared, spectral analysis, vibration analysis and so forth — requires good domain expertise.

5. The great horizontal versus vertical divide
One school of thought contends that maintenance is a horizontal function and that, given enough historic data — hopefully labeled — smart algorithms can figure out all the underlying patterns and correlations, delivering near-perfect insights without an iota of understanding of the vertical or asset class. The other school believes that an in-depth understanding of the asset, its constituent components and its operating modes is essential to a good maintenance strategy. The reality lies somewhere in between. Having domain expertise on the pertinent asset classes and the contextual environment is definitely useful, but there is truly a bit of magic behind the data science algorithms. While they do uncover interesting and often counterintuitive insights that would be humanly impossible to find, in the end the objective should be to balance the two perspectives to derive the most value from your implementation.

6. It’s not just in the asset
Birds of a feather may flock together, but assets of the same make and model don’t always perform similarly. An inordinate focus on just the asset data — based on sensors — can trip us up. The sensor data needs to blend with context data — such as ambient conditions, operational environment, asset operation mode, general asset upkeep, etc. — to deliver the right insights. Context really is king.

7. Predictive is not the end-all
According to Gartner, predictive maintenance is one strategy on a continuum that runs from reactive to financially optimized. In that sense, predictive maintenance is a cog in the wheel that powers higher-level objectives. The insights gleaned through predictive maintenance should be actionable in a sustainable way so they translate into real, tangible value for the organization. The end goal is clearly either financial optimization or driving new, innovative business models. Maintenance should work as an interdependent function and balance considerations related to other functions, including production, inventory and human capital, to optimize overall financial performance.

8. Measurement is key
Very often, strategic transformation initiatives like predictive maintenance get a bad rap for not delivering the goods. A major issue is often the lack of effective communication about the benefits achieved — for example, averting major breakdowns or avoiding truck rolls — and of quantifying the financial impact. An effective baselining and measurement strategy is key to ensuring that the benefits realized through predictive maintenance are adequately measured. Communicating the positive results to gain wider traction within the organization and keep the momentum going is imperative.

9. Deployment and sustainability is the end game
Creating standalone algorithmic models and proving their efficacy may get all the attention, but it is not the end game. The real value is derived once those models have been deployed in a production context and integrated into the business applications landscape. It is imperative to refresh models periodically to avoid model fatigue and to ensure that the model incorporates new contextual information to sustain high levels of accuracy.
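
What a periodic refresh check might look like is sketched below, assuming you track the live model's accuracy against newly labeled outcomes; the threshold and weekly figures are illustrative assumptions, not a recommended policy.

```python
# Retrain when live accuracy drifts too far below the accuracy measured at deployment.
def needs_refresh(live_accuracy, deployed_accuracy, max_drop=0.05):
    """Return True once the production model has degraded enough to retrain."""
    return (deployed_accuracy - live_accuracy) > max_drop

deployed_accuracy = 0.91                      # measured when the model went live
weekly_accuracy = [0.90, 0.89, 0.88, 0.84]    # accuracy on newly labeled failures

for week, acc in enumerate(weekly_accuracy, start=1):
    if needs_refresh(acc, deployed_accuracy):
        print(f"Week {week}: accuracy {acc:.2f} -> schedule retraining with fresh context data")
```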

10. The future beckons so embrace it
Predictive maintenance is a fast-evolving function and benefits greatly from its confluence with technologies like AI and augmented reality. McKinsey has forecast productivity increases of up to 20% and reductions in maintenance costs of more than 10% through the application of AI in predictive maintenance. Computer vision advancements will further enhance traditional predictive maintenance applications. Augmented reality will boost maintenance worker productivity through wearables for applications like guided repairs. The era of self-healing machines is also not too distant. The field will continue to make rapid strides, and it is imperative to stay plugged in and start piloting emerging technologies on an experimental basis.

Delivering true benefits from the implementation of a predictive maintenance program calls for a holistic perspective and requires the right blend of domain, consulting, technology and analytic skills. In summary, it’s the recipe that matters more than any one of the above single ingredients.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


June 14, 2019  10:58 AM

4 ways IoT innovation can transform the utilities industry

Mark Ingerman
electricity providers, Internet of Things, iot, IoT applications, IoT use cases, MQTT, outage response, Predictive maintenance, Smart meter, utilities, utility providers

IoT innovation is turning the utilities world on its head, augmenting smart meters with more advanced monitoring, alerting and data analytics capabilities than ever before. One of the most impactful innovations is a publisher-subscriber messaging protocol called MQTT. In the case of the utilities industry, electrical grids, gas pipelines and water suppliers can use MQTT to more quickly and accurately transmit data between their services and their customers and devices.

This article provides four use cases that show how utilities, especially electricity providers, can improve performance by cutting costs, improving billing and responding faster and more efficiently to service disruptions.

1. Smart metering

The industry has gone from having meters read regularly by an individual walking around town to reading the same meters by driving down the street with a remote device or cell phone. A real revolution is now underway: equipping meters to transmit their data over the power lines. Consider how powerful it will be to have power consumption data not by the month, but by 15-minute or other intervals. Using the MQTT protocol, this data can be sent with guaranteed message delivery to provide accurate meter readings in near real time.
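
For illustration, here is a sketch of a meter publishing 15-minute interval readings over MQTT using the Eclipse Paho Python client (1.x API assumed); the broker host, topic layout and reading helper are hypothetical.

```python
import json
import random
import time

import paho.mqtt.client as mqtt   # Eclipse Paho MQTT client, 1.x API assumed

BROKER = "mqtt.utility.example"   # hypothetical broker host
METER_ID = "meter-0042"

def read_interval_consumption():
    """Stand-in for the meter's own measurement of kWh used this interval."""
    return round(random.uniform(0.2, 1.5), 3)

client = mqtt.Client(client_id=METER_ID)
client.connect(BROKER, 1883)
client.loop_start()               # handle network traffic in a background thread

while True:
    reading = {"meter_id": METER_ID, "kwh": read_interval_consumption(), "ts": int(time.time())}
    # QoS 1: the broker acknowledges delivery, so a reading is retransmitted rather than lost.
    client.publish(f"meters/{METER_ID}/consumption", json.dumps(reading), qos=1)
    time.sleep(15 * 60)           # publish every 15 minutes
```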

For one thing, billing could be fairer and more accurate. Consider two households that consume the same power over the course of a month, but one household shifts much of its electricity use overnight while the other does not. In the current system, the more energy-friendly house is subsidizing the other. With more granular energy reporting, utilities can charge more for peak energy and less for off-peak energy. In this case, the energy-savvy household would see a significant decrease in its monthly cost, while the household using more of the expensive energy would see a sharp increase. In addition to lowering costs for many households and providing better data for peak energy analytics, the new model could encourage all households to reduce peak energy demand, or at least slow its growth.
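
A worked example with illustrative rates shows the effect; the tariffs and the peak/off-peak split are assumptions, not real utility pricing.

```python
# Two households consume the same 900 kWh in a month, but at different times of day.
FLAT_RATE = 0.13                       # $/kWh under the current model
PEAK_RATE, OFF_PEAK_RATE = 0.18, 0.08  # $/kWh under a simple time-of-use tariff

def flat_bill(total_kwh):
    return total_kwh * FLAT_RATE

def tou_bill(peak_kwh, off_peak_kwh):
    return peak_kwh * PEAK_RATE + off_peak_kwh * OFF_PEAK_RATE

print(flat_bill(900))        # 117.0 for both households today
print(tou_bill(200, 700))    # 92.0  -> the overnight shifter pays less
print(tou_bill(700, 200))    # 142.0 -> the peak-heavy household pays more
```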

2. Theft prevention

Industry experts estimate that $6 billion of electricity is stolen in the U.S. each year. By equipping transformers with the same technology as smart meters, transformers could report their load data as well. By aggregating the load from all of the meters and comparing it to the transformer’s reported load, a utility should easily be able to find places where there is theft or some other loss and respond quickly. Even when the theft occurs on the other side of the meter, the utility should be able to use analytics to determine where a spike in usage seems out of line with historic norms, accounting for peak usage and outside influences such as the need for increased cooling.
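
The comparison itself is straightforward, as this sketch shows; the 5% allowance for normal technical line losses and the sample readings are illustrative assumptions.

```python
# Compare what a transformer reports delivering with the sum of its downstream meters.
LINE_LOSS_TOLERANCE = 0.05   # assume up to 5% technical loss is normal (illustrative)

def suspected_loss(transformer_kwh, meter_readings_kwh):
    """Return (flagged, gap): whether the unmetered gap exceeds normal line losses."""
    metered = sum(meter_readings_kwh)
    gap = transformer_kwh - metered
    return gap > transformer_kwh * LINE_LOSS_TOLERANCE, gap

flagged, gap = suspected_loss(1250.0, [310.2, 295.7, 280.1, 240.5])
if flagged:
    print(f"Unaccounted energy: {gap:.1f} kWh -> investigate for theft or a fault")
```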

3. Fast outage response

Utilities no longer have to rely on customers reporting outages, because one feature of MQTT is “last will” message alerting. If a device goes offline unexpectedly, the MQTT broker publishes a predefined message to a predetermined topic. This feature is often used to manage presence; in this case, it would manage the state of a household. If the power were to go out, the MQTT client in the smart meter would disconnect and the MQTT broker would notify the subscribing utility team that power had been lost. At that point, the utility could dispatch a repair crew without waiting for a call. Imagine how pleasantly surprised a homeowner would be to come home and find a note that the power had gone out and the utility company has already repaired the damage — that is customer service at its finest.
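
Here is a sketch of how that last-will flow might be wired up with the same Paho client (1.x API assumed); the broker host, topics and client IDs are hypothetical.

```python
import paho.mqtt.client as mqtt

BROKER = "mqtt.utility.example"   # hypothetical broker host

# On the smart meter: register a last-will message before connecting.
meter = mqtt.Client(client_id="meter-0042")
meter.will_set("meters/meter-0042/status", payload="offline", qos=1, retain=True)
meter.connect(BROKER, 1883)
meter.publish("meters/meter-0042/status", "online", qos=1, retain=True)

# At the utility: subscribe to status topics and react when the broker
# publishes the will because a meter dropped off unexpectedly.
def on_message(client, userdata, msg):
    if msg.payload.decode() == "offline":
        print(f"{msg.topic}: possible outage, create a dispatch ticket")

ops = mqtt.Client(client_id="outage-watcher")
ops.on_message = on_message
ops.connect(BROKER, 1883)
ops.subscribe("meters/+/status", qos=1)
ops.loop_forever()
```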

4. Smarter outage response

Consider the above scenario, but this time all of the households on the same block, served by the same transformer, lose power together. Now the problem could be the power line feeding the transformer from the grid, the power line connecting the first house to the transformer or the transformer itself. If the smart transformer is still online and functioning correctly, the problem is likely the line coming out of the transformer. But if the transformer is offline and earlier messages showed overload or overheating, then the loss of power is likely a blown transformer. Armed with this knowledge, the utility can dispatch a response crew with the proper replacement transformer and equipment, eliminating duplicate effort and wasted resources, such as sending one crew to bypass the transformer and another to replace it.

Moreover, instead of waiting for the failure, the utility can be proactive. By analyzing the load on the transformer and its diagnostic information, and taking into consideration the transformer’s age and load rating, utility teams should be able to predict when a transformer is likely to fail. If an external event, such as a heat wave, is approaching, the utility can proactively replace or prioritize any equipment likely to fail during normal operation — reducing inefficiencies by addressing potential issues ahead of a heat wave or storm, and reducing costs, such as paying overtime for teams to replace multiple devices on an urgent or emergency basis.
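
One simple way to express that prioritization is a weighted risk score, as sketched below; the weights, design life and fleet figures are illustrative assumptions rather than a utility's actual reliability model.

```python
# Toy risk score for prioritizing transformer replacement ahead of a heat wave.
def replacement_priority(age_years, avg_load_pct, overheat_events, design_life=35):
    """Blend age, loading and overheating history into a 0-1 priority score."""
    age_factor = min(age_years / design_life, 1.0)
    load_factor = min(avg_load_pct / 100.0, 1.0)
    event_factor = min(overheat_events / 5.0, 1.0)
    return 0.4 * age_factor + 0.4 * load_factor + 0.2 * event_factor

fleet = {
    "xfmr-114": replacement_priority(31, 96, 4),
    "xfmr-207": replacement_priority(12, 60, 0),
}
for unit, score in sorted(fleet.items(), key=lambda kv: kv[1], reverse=True):
    print(unit, round(score, 2))   # service the highest-scoring unit first
```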

The above scenarios provide a brief glimpse of how MQTT and other IoT technologies can help transform a utility from an outdated, inefficient and unresponsive organization into a proactive, efficient, fair and customer-focused business. Of course, this is only the beginning. The smart grid opens the door for more sophisticated analytics and for even more sophisticated billing where customers can negotiate directly with energy providers for both standard and peak energy delivery. Here, energy providers can use smart grid analytics to target the appropriate customers for different pricing mechanisms. The more compelling cases are likely not even known today. The sky’s the limit on how this technology can be used moving forward.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


June 13, 2019  12:52 PM

3 IoT technologies primed to shape government asset management

Tom Amburgey
Asset management, Internet of Things, iot, IoT applications, IoT examples, IoT sensors, IoT use cases, smart city, smart city applications

Imagine the nightmare of receiving a utility bill for more than $25,000. I recently read an article about a couple getting such a bill for their modest 1950s home, and it reminded me of the true impact that IoT can have on citizen engagement.

When homeowner Chris Rose went online to check his water and sewage bill, he was stunned to find he owed $25,787.73. For the Roses, a hidden leak turned into a life-altering bill that left them scrambling not only for answers, but also for the money to pay the staggering charges. I found myself wondering how different the situation could have been if this family had lived in an area equipped with smart utility sensors: utility workers may have been able to identify and address the leak well before the family was hit with this unimaginably high bill, and before nearly a million gallons of water were wasted.

The Roses’ situation shows just one of the many ways IoT can shift the government and citizen experience for the better. With the potential to accurately monitor consumption patterns, traffic flows, street and sidewalk conditions and much more, IoT is poised to redefine government operations by enabling proactive management protocols that can head off many problems before they develop.

With numerous IoT options available to government leaders, here are the three areas that I foresee having the greatest impact in the immediate future.

Benefits afforded by utility, infrastructure and traffic sensors

Utility sensors
IoT makes utilities — specifically electric, water, oil and gas providers — more efficient and helps relieve some of the stress related to energy demand. If the Roses’ home had water sensors or water leak detectors, for instance, anomalies in consumption would have been detected long before the leak racked up more than $25,000 in fees, particularly since the family’s bimonthly water and sewer bill averaged around $110.
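
A check as simple as the following sketch, run against smart-meter interval data, could have flagged a runaway leak within days; the baseline, multiplier and sample readings are illustrative assumptions.

```python
# Flag a likely leak when daily consumption stays far above the home's own baseline
# for several days in a row.
def likely_leak(daily_gallons, baseline_gallons, factor=3.0, run_length=3):
    streak = 0
    for day in daily_gallons:
        streak = streak + 1 if day > baseline_gallons * factor else 0
        if streak >= run_length:
            return True
    return False

baseline = 180                               # typical gallons/day for this household
readings = [175, 190, 620, 640, 655, 630]    # readings during a hidden leak
if likely_leak(readings, baseline):
    print("Notify the homeowner and open a utility work order before the next bill")
```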

Infrastructure sensors
Although infrastructure is designed to last many years, time takes its toll as roads eventually crumble, bridges start to crack and railway tracks fall into disrepair. However, governments across the globe are installing technology in hopes of improving infrastructure, saving money and improving citizens’ lives. Last year, for instance, the Colorado Department of Transportation teamed up with Integrated Roadways to test a half mile of the company’s smart pavement on a dangerous and accident-prone spot for drivers on U.S. Route 285. Once in place, the asphalt, brimming with safety sensors and the latest fiber optic and wireless technology, will be used to detect accidents and provide real-time services, such as road condition and traffic alerts to drivers.

Traffic sensors
We’ve all been caught in traffic jams or have found ourselves in the middle of hazardous road conditions that could have been avoided had technology alerted us. Traffic sensors are being used to identify traffic patterns to reduce congestion and allow for adjustments based on usage. Portland, Ore., is using 200 traffic sensors along three high-traffic corridors that account for more than 50% of the city’s road fatalities. Data from these sensors will be used to connect vehicles’ GPS systems with traffic cameras, providing city personnel the insight to help control traffic patterns and increase safety within these dangerous corridors.

IoT provides a world of possibilities

As IoT technologies continue to grow, so will their influence on our day-to-day lives. Thus, I encourage you to think about your current city operations and how they could benefit from IoT. If your community struggles with hard-to-manage, deteriorating roadways or citizen reports of skyrocketing utility bills, or constituents wish you could better enable them to avoid traffic congestion, perhaps it’s time to take a closer look at the benefits IoT can bring.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

