IoT Agenda


December 3, 2019  4:54 PM

Battery-free smart home adoption? It’s possible

Srinivas Pattamatta Profile: Srinivas Pattamatta
battery life, Bluetooth, Internet of Things, iot, IoT connectivity, IoT device, IoT wireless, smart home

The time of the smart home is now. A staggering 1.15 billion annual shipments of Bluetooth smart home devices are expected by 2023, with connected home devices exceeding home automation by a ratio of three-to-one, according to a 2019 Bluetooth market update. It’s no longer science fiction to control lighting or regulate a home’s temperature with a voice command or the press of a button on a smartphone. Smart home technology has become widely accepted and accessible.

While voice assistants are one of the most common smart home automation and entertainment devices for homeowners and property managers, there are many other emerging technologies that make up the average smart home. Specifically, there are three core categories for smart home technology, each of which features connected devices that require an increasing number of batteries:

  • Home entertainment: Remotes, voice assistants and audio systems
  • Home utilities: Connected refrigerators and washing machines
  • Home automation: Security systems, sprinklers and thermostats

Why is this important? As the number of wireless devices grows, so does the number of batteries needed to power them. And as consumers use more batteries, the financial and environmental costs of replacing them will rise, too.

The majority of IoT devices connect wirelessly to the Internet via technologies like Wi-Fi, Bluetooth or ZigBee. Wi-Fi is most commonly used for high-throughput applications and streaming data on connected devices, while ZigBee is a two-step connection that requires a hub to bridge devices to Bluetooth or Wi-Fi. Bluetooth, with advanced BLE or Bluetooth 5, enables long, Wi-Fi-like range in the home and compatibility with smartphones, laptops, earphones and other devices.

The choice of connectivity technology is vital as we look to examples within the emerging smart home categories. For example, home security devices have portable sensors that run on batteries and connect to security systems via Bluetooth.

The battery dilemma

Because most sensors are constantly online and communicating regularly, their batteries drain continuously, which shortens battery life. This, in turn, means more battery changes, or larger batteries that still need to be replaced far too often. Homeowners and property managers must then consistently monitor for low batteries or risk losing the protection and peace of mind a security device provides.
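
To see why reporting frequency dominates battery life, here is a rough, illustrative calculation in Python; the coin-cell capacity, current draws and airtime figure below are assumptions made for the sake of the arithmetic, not measurements from any particular device:

```python
# Rough battery-life estimate for an always-on wireless sensor.
# All figures are illustrative assumptions, not vendor specs.
CAPACITY_MAH = 230.0   # assumed CR2032 coin-cell capacity
SLEEP_UA = 1.5         # assumed sleep current, microamps
ACTIVE_MA = 8.0        # assumed radio current while transmitting
TX_MS = 3.0            # assumed airtime per transmission

def battery_life_days(report_interval_s: float) -> float:
    """Average current draw -> runtime in days for a given reporting interval."""
    duty = (TX_MS / 1000.0) / report_interval_s        # fraction of time on air
    avg_ma = ACTIVE_MA * duty + SLEEP_UA / 1000.0      # weighted average draw
    return CAPACITY_MAH / avg_ma / 24.0                # mAh / mA = hours -> days

for interval in (0.1, 1.0, 10.0):
    print(f"report every {interval:>4}s -> ~{battery_life_days(interval):,.0f} days")
```

Under these assumptions, reporting every 100 ms yields roughly a month of life, while reporting every 10 seconds stretches the same cell to several years, which is why duty cycling and, ultimately, energy harvesting matter so much.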

This is a prevalent issue across several connected in-home devices, such as automated door locks, automated sprinkler systems, temperature sensors and more. These battery-powered devices cannot use rechargeable batteries due to current U.S. regulations. With the rechargeable option out of the picture, homeowners and property managers are again faced with the dilemma of battery replacement.

Imagine a world in which extended battery life is the norm. A world in which you replace batteries every few years, or never over the entire life of the device, not every few months. Technologies like Atmosic’s M2 and M3 Series solutions can help remedy these challenges.

Harvesting energy from multiple power sources, such as radio frequency, thermal, light and mechanical, can effectively power the increasing number of smart home devices. With the prospect of extending battery life or eliminating batteries altogether, we can greatly reduce the financial burden and environmental impact that come with battery replacements.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

December 2, 2019  12:38 PM

The ten things that should happen in 2020, continued

Mark Troester Profile: Mark Troester
AI and IoT, app development, Artificial intelligence, Internet of Things, iot, IoT analytics, IoT and AI, IoT trends, Machine learning

This article is the second in a two-part series. Read the first piece here.

In part one, we looked at the first five trends I see for 2020: DesignOps collaboration, digital innovation, AppDev and WebDev alignment, and machine learning and AI making it into production. Here are the remaining five:

Modernization will be considered alongside cloud native

Modernization efforts are often thought of — or managed — separately from new application development efforts, except where new development efforts are a complete replacement. This can be due to how organizations structure their maintenance efforts vs. new efforts, or how they separately budget for development.

As technologists, we all know how hard it is to get funding to address technical debt because that effort typically doesn’t result in new application features. But it’s important that we come up with an approach that balances modernization efforts with net new development efforts that are typically cloud native. To accomplish this, organizations might do the following:

  • Organizations that want to continue deriving value from legacy systems can implement an abstraction layer to make legacy capabilities accessible through a common API (a minimal sketch follows this list)
  • They can shift application workloads to the cloud while keeping data in place using new connectivity constructs that aren’t difficult to implement and don’t require risky firewall and network reconfiguration
  • Once the abstraction is in place, organizations can refactor the most critical parts of applications without changing the frontend apps by using a backend redirection to update capabilities
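
As a rough illustration of that first approach, here is a minimal sketch of an abstraction layer using Flask; the endpoint path, legacy field names and stubbed lookup are hypothetical stand-ins, not any particular product’s API:

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for a lookup against the legacy system; in practice this would
# call a mainframe transaction, stored procedure or vendor SDK.
def fetch_legacy_order(order_id: str) -> dict:
    return {"ORD_NO": order_id, "STAT_CD": "SHIPPED", "AMT": 129.50}

@app.route("/api/v1/orders/<order_id>")
def get_order(order_id):
    record = fetch_legacy_order(order_id)
    # Translate legacy field names into a stable, common API shape.
    return jsonify(id=record["ORD_NO"], status=record["STAT_CD"], total=record["AMT"])

if __name__ == "__main__":
    app.run(port=8080)
```

Because frontend apps code against the common route rather than the legacy system itself, the backend behind it can later be refactored or redirected without touching those apps.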

One simple way to ensure that modernization is factored into new AppDev is to think about modernization in the context of integration. Since most new AppDev efforts aren’t standalone, think about what modernization stages need to happen to integrate the new efforts with existing systems.

Organizations will start treating all users like customers

We all know that customer expectations for improvements to user experience are rising, but how does that relate to apps targeted toward employees and partners?

Employees and partners have the same expectations as customers. We’re also seeing new professionals who are not only digital natives, but also digital experts. We’re past the era where only IT experts dictate what apps and technology can be used. With the introduction of cloud and SaaS, people can — and will — bypass IT experts if they aren’t getting what they want, which can lead to another set of issues.

Leading organizations are taking note and are employing different approaches to ensure that employees and partners get consumer-grade experiences, such as:

  • Applying DesignOps to internal initiatives as well as usability and constant optimization to the user experience
  • Using a marketing mentality similar to managing customer conversions to ensure that employees can successfully complete actions that are supported by digital technology
  • Applying personalization to employee and partner experiences based on needs, which helps to better manage a multi-generational workforce as well as cater towards individual preferences that will drive efficiency and productivity
  • Designing customer experiences that are horizontally integrated across different digital touchpoints — omnichannel experiences — that are vertically supported or integrated with employee and partner experiences. For example, customer interactions that take place over an integrated web, mobile and chat experience are integrated appropriately into employee and partner systems to analyze and report on the customer activity and help ensure the right level of human support

Data access and APIs will be used to break down tech silos

Applications still exist that don’t support modern API constructs, leaving useful capabilities inaccessible to new application workloads. It used to be time consuming and expensive to code around these limitations. But data integration patterns that establish standard SQL and REST-style interfaces without writing code greatly increase the ability to access application capabilities as well as important data sources.

This not only makes legacy data more accessible, but it also makes new data types like JSON and XML accessible to business intelligence (BI) and reporting tools that are based on SQL access. Not only can organizations break down silos between operational and transactional applications, they can also break down the silos between those applications and their BI and reporting efforts.
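
As a small illustration of the idea, here is a sketch using Python’s built-in sqlite3 module, assuming a SQLite build with the JSON1 functions enabled; plain SQL reaches directly into JSON payloads, the kind of access SQL-based BI and reporting tools expect:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (payload TEXT)")  # raw JSON from devices
conn.executemany(
    "INSERT INTO events VALUES (?)",
    [('{"device": "pump-1", "temp_c": 71.2}',),
     ('{"device": "pump-2", "temp_c": 68.4}',)],
)

# Standard SQL over JSON payloads, no custom integration code required.
rows = conn.execute(
    "SELECT json_extract(payload, '$.device') AS device, "
    "       json_extract(payload, '$.temp_c') AS temp_c "
    "FROM events ORDER BY temp_c DESC"
).fetchall()

for device, temp in rows:
    print(device, temp)
```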

This becomes interesting when we get to the point where we don’t think of these different application patterns separately. For example, we don’t think of reporting and BI as something that happens after the fact, such as periodic reporting. Rather, we think about how reporting and BI are integrated, using advanced analytics in real time to guide the transactional application experience.

Whether we are using predictive capabilities and providing guidance within the application or using predictive capabilities to act on behalf of the user so that the user interacts in a fluid and exception-based pattern, the elimination of these silos can have a tremendous business impact.

IoT and mobility will force organizations to rethink the edge

We typically think about edge computing from an IoT perspective, where we have use cases that require processing either on the actual device or on the network edge that’s closer to the device.

I’d argue that we’ve long since thought about architecture with regards to proximity, and that edge computing predates technologies such as IoT and smart sensors. Given the limitations of network speed, we are careful to place application functionality close to the data.

We use intelligent caching on the device or on the network edge to eliminate roundtrips and network chatter. We design and manage data to address the vagaries of data residency required for international business. We also design levels of abstraction, virtualize data and use data pipelines to shield frontend developers and applications from data complexity.
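
A minimal sketch of that caching idea, assuming a simple time-to-live policy; a production edge cache would add eviction, size limits and invalidation:

```python
import time

class EdgeCache:
    """Tiny TTL cache: serve local reads, go to the origin only when stale."""

    def __init__(self, fetch, ttl_s=30.0):
        self.fetch = fetch        # callable that performs the expensive roundtrip
        self.ttl_s = ttl_s
        self._store = {}          # key -> (value, expiry time)

    def get(self, key):
        hit = self._store.get(key)
        if hit and hit[1] > time.monotonic():
            return hit[0]                        # fresh: no network chatter
        value = self.fetch(key)                  # stale or missing: one roundtrip
        self._store[key] = (value, time.monotonic() + self.ttl_s)
        return value

cache = EdgeCache(fetch=lambda k: f"value-for-{k}", ttl_s=5.0)
print(cache.get("sensor-42"))   # first call hits the origin; repeats are local
```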

While everyone is racing to the cloud, it’s important to continue to think about the edge; not only in the context of supporting IoT, but in the broader context of supporting distributed computing.

Another interesting factor is 5G. You would think that with network speeds 10x faster than 4G, 5G would negate the need for processing at the edge. But when it comes to the potential life-and-death consequences of data processing delays in the medical field, or the automation of automobiles and smart factories, there will still be a need for processing at the edge. As a result, mini or micro data centers are expected to be added to nearby cell towers.

While there is no single architecture pattern that makes sense for all use cases, it’s clear that organizations that are focused on faster data processing and reducing latency with 5G will have an advantage when it comes to improving customer and user experiences.

Organizations will align business and technology efforts more effectively

Whether it is a reaction to the philosophy that “every company is or should be a software or technology company,” or pressure from the board and executive team to keep up with the pace of technology, there is a vital need to align business and technology efforts in a substantial way.

Businesses and IT can learn from each other in terms of the planning process. Just as many businesses have implemented some form of periodic planning mechanism, such as OKRs and Kanban, IT has become more agile in developing short-term plans and more adept at taking small increments, adopting minimum viable products and moving towards continuous delivery of capabilities and features.

Leading organizations will determine how to marry these different approaches and drive corporate objectives by creating a structured set of goals for each discipline. They will take a holistic view of how technology can help drive those goals by analyzing how digital technology can be used in every business discipline.

This analysis will bring together key business and technical leaders to determine the potential digital projects that are then prioritized by business impact and success feasibility. This periodic planning will be supported by more agile updates and modifications to the overall corporate goals and high-level activities, based on changes to market and business conditions.

Some organizations will identify KPIs that flow up from actual customer and user experience by extending traditional infrastructure and application management to provide a customer- or user-centric viewpoint. These KPIs are then measured and used as part of the overall approach to executing company goals.

So, hang on to your hats, folks. 2020 is going to be an interesting year.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


November 27, 2019  12:04 PM

SMBs and the great cloud video opportunity

Andreas Pettersson Profile: Andreas Pettersson
AI and IoT, cloud benefits, Data strategy, Internet of Things, iot, IoT cloud, IoT data, IoT strategy, Machine learning

When most people discuss the benefits of cloud for video surveillance, they tend to do so through the lens of enterprise end users. Enterprise investments in camera technology can run into tens of millions of dollars depending on the size of the organization, and converting this massive upfront cost into an ongoing operational expense is one of the biggest opportunities for surveillance vendors.

However, SMBs account for the vast majority of organizations globally as well as in the U.S. As of 2018, there were more than 30 million small businesses — defined as companies that have fewer than 500 employees — spread across the nation, accounting for 99.9% of all U.S. businesses and employing 47.5% of all workers, according to the U.S. Small Business Administration (SBA).

Considering the obvious market opportunity, it is somewhat surprising that more consideration isn’t given to the SMB space by the surveillance industry and cloud providers. However, the proliferation of IoT devices, including connected surveillance cameras, means that there is a wealth of data now available to end users of all sizes.

More than 130 million surveillance cameras were expected to ship globally in 2018, a 10-fold increase from the fewer than 10 million that shipped in 2006, according to IHS Markit. Approximately 70% of the cameras shipped last year were IP-enabled, further adding to the prevalence of video in IoT.

Taking advantage of untapped potential and cost savings

You may be thinking: “Why is it important for me, as a small business owner who has already invested in on-premise video hardware and software, to consider migrating my surveillance system to the cloud? What good is all this additional video data going to do for my business?”

These are valid questions and should be asked by anyone prior to making a significant change in the architecture of their security system. But just as enterprises can realize tremendous efficiencies from shifting management of their video network to the cloud, the same can also be said for SMBs. Aside from the shift from a capital expenditure to an operational expenditure, a cloud-based video system would enable SMBs to benefit from the automatic update capabilities of cloud-based services.

Unlike large organizations that enter into maintenance agreements with their systems integrators to take care of things like camera and server failures, which are commonplace throughout the lifecycle of a video system, SMBs often must handle these issues as they arise, either with the help of in-house IT staff or by paying an integrator for a service call. These headaches are eliminated with a cloud solution because software patches and firmware updates are pushed automatically as they are needed, often without any interruption to the end user.

Another area where the cloud can greatly benefit an SMB is video retention and system scalability. It wasn’t long ago that organizations had to purchase racks of DVRs to achieve the right levels of storage, often limited by physical storage space. Today, video can be retained for as long as necessary because the cloud removes the physical storage constraint.

Additionally, the cloud provides a high level of flexibility, giving SMBs the ability to scale as needs change without additional physical storage space. And rather than being forced to purge video due to lack of space — video that could contain valuable insights for business managers or be used as evidence for criminal prosecution or insurance liability claims — it can now be retained indefinitely.
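
For a back-of-the-envelope sense of the storage involved, here is a quick sizing sketch; the camera count, bitrate and retention period are assumptions to swap for your own figures:

```python
# Back-of-the-envelope cloud retention sizing; all inputs are assumptions.
CAMERAS = 8
BITRATE_MBPS = 2.0      # assumed average per-camera stream (e.g., 1080p H.264)
RETENTION_DAYS = 90

gb_per_cam_day = BITRATE_MBPS / 8 * 86_400 / 1_000   # MB/s * s/day -> GB/day
total_tb = gb_per_cam_day * CAMERAS * RETENTION_DAYS / 1_000
print(f"{gb_per_cam_day:.1f} GB/camera/day -> {total_tb:.1f} TB for {RETENTION_DAYS} days")
```

At these assumed rates, even a small eight-camera site accumulates roughly 15 TB over 90 days, exactly the kind of growth that outpaces racks of DVRs but is routine for cloud storage.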

Leveraging next-gen video analytics

SMBs could also see immediate benefits from a cloud video solution when using video analytics to improve business operations. Driven by advances in AI and machine learning technology, video analytics now goes well beyond tripwires and can detect and classify people and objects with accuracy that is continuously improving.

Some SMBs are already exploring how they can use video data to garner additional insights into their organizations and improve efficiencies. For example, restaurants are now leveraging systems that use cameras in lobbies and other areas of a restaurant to track staff and guests, which is generating information that can subsequently provide feedback on things such as host availability, customer wait times and customer bounce rates.

However, leveraging these types of advanced analytics simply isn’t possible for most businesses without the aid of the cloud, as deploying the hardware necessary to run them is cost-prohibitive. As video data continues to grow in importance in the years to come, SMBs will need to look to cloud solutions if they want to keep pace with the ever-evolving surveillance landscape and maintain a competitive advantage.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


November 27, 2019  11:29 AM

Cybersecurity risks affect IIoT fog computing

Julian Weinberger Profile: Julian Weinberger
Cyberattacks, cybersecurity, encrypted communications, Encryption, fog computing, IIoT, IIoT data, IIoT security, remote access vpn, VPN

Cloud computing within IIoT is creating new opportunities for manufacturers and industrial systems. From connected cars and smart cities to real-time analytics and 5G mobile, IIoT sensors are generating data in unprecedented volumes.

Since most essential smart factory services would be ineffective without lightning-fast responses from IIoT systems, many factory systems rely on sensors and actuators with built-in time constraints. Any latency or break in the signal to operational sensors or actuators could have catastrophic consequences.

To overcome this challenge, leading technology providers have developed fog computing: a virtualized platform that runs essential cloud processes locally across a distributed network of IIoT devices. Fog computing enables consistent, two-way cloud communication between local operational components and remote management points via the Internet in milliseconds.

Closer to the edge

Though still in its infancy, fog computing is already being rolled out for a range of IIoT-based applications. For example, smart cities rely on access to data in real time to run public services more efficiently. In the case of connected cars, sensor data pertaining to road conditions, geo-positioning and physical surroundings is analyzed in real time at a local level. Other types of data, such as engine performance, can also be communicated to the manufacturer so they know when to offer maintenance services or repairs.

Sometimes, IIoT devices are located in remote areas where processing data close to edge devices becomes essential. An oil rig is a good example. A typical oil platform may have about 100,000 sensors generating data at the rate of several TBs every day. To relay all this data over the Internet and back for analysis and response is neither practical nor economical. Instead, cloud services must be brought closer to the edge.

Other applications in the cloud, such as mobile 5G, analyze the aggregated data from many thousands of sensors to identify opportunities for productivity improvements or trends over time. For example, in dense antenna deployment areas, a fog computing architecture with a centralized controller may be used to manage local applications and connectivity with remote data centers in the cloud.

Data encryption

It’s widely acknowledged that most IIoT devices do not have built-in security, and energy providers and manufacturers still deploy IIoT systems in remote, exposed locations. As a result, thousands of smart yet vulnerable mechanisms operating in physical isolation are a cause for concern, as data shared across factory ecosystems and the cloud may be readily visible to unauthorized third parties.

The best way to compensate for the lack of built-in security is to apply enterprise-grade privacy and protection measures to fog computing systems. Encryption can prevent confidential industrial data, such as intellectual property or operational information, from being observed by cybercriminals, hackers or spies.
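
To illustrate the principle at the application layer (a VPN applies comparable encryption at the network layer), here is a sketch using AES-GCM from the third-party cryptography package; the key handling and payload are simplified assumptions, since real deployments exchange keys through a proper handshake:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # in practice, agreed via key exchange
aesgcm = AESGCM(key)

reading = b'{"sensor": "reactor-7", "pressure_kpa": 412}'
nonce = os.urandom(12)                      # unique per message, never reused
ciphertext = aesgcm.encrypt(nonce, reading, b"plant-A")  # AAD binds the context

# An eavesdropper sees only ciphertext; any tampering fails authentication.
assert aesgcm.decrypt(nonce, ciphertext, b"plant-A") == reading
```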

Surprisingly, many industrial and manufacturing organizations have yet to introduce encryption into their IIoT environments. More than half — 51% — of organizations still do not use encryption to protect sensitive data in the cloud, according to a Thales study involving more than 3,000 IT and IT security practitioners worldwide.

VPN

The most effective way to ensure communications are encrypted and that connectivity throughout IIoT networks is secure is to implement professional, enterprise-grade VPN software. A VPN can encrypt all digital communications flowing between local systems and the cloud with advanced algorithms, such as Suite B cryptography. Even if a third party were to penetrate a device or application, the information itself would be indecipherable.

A growing number of manufacturers and industrial organizations are pivoting to cloud-based VPN services for secure management of remote IIoT equipment because cloud VPN services offer airtight security as well as additional flexibility, scalability and reduced technical complexity. Cloud-based VPN services create end-to-end encryption between an on-premises central management point and remote IIoT devices. The cloud server conducts authentication checks automatically and establishes appropriate tunnels. Best of all, it does not decrypt or store any data that passes through.

Remote access to IIoT devices may also be on-demand and restricted to times and other parameters specified by the owner. For example, access may be limited to service engineers according to the principle of least privilege, which ensures security remains as airtight as possible.

Final thoughts

Although fog computing can improve productivity, efficiency and revenue, it also can put data at risk. Securing all data processed by these critical ecosystems with VPN software is paramount.

A VPN provides secure and reliable connectivity for remote IIoT machines and cloud-based control hubs by encrypting all digital communications passing over the Internet between innumerable devices and the remote administration center. These encrypted connections allow smart systems to send confidential data over the Internet while being shielded from unauthorized third parties.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


November 26, 2019  5:16 PM

Secure IoT devices and networks and bears, oh my!

Cheryl Ajluni Profile: Cheryl Ajluni
Cyberattacks, cybersecurity, Internet of Things, iot, IoT attacks, IoT cybersecurity, iot security, IoT strategy

There is a popular saying that goes something like this: You don’t have to run faster than the bear to get away. You just have to run faster than the slowest guy running from the bear. As it turns out, that saying is as good a metaphor for life as it is for security in IoT.

It all comes down to the issue of perfection. Success in life doesn’t require someone to be perfect in everything they do. What it does require is for a person to never give up, to keep moving forward, to work to improve even when it seems too hard and to outlast the competition.

Security in IoT devices is no different.

Just like life, appropriate IoT security does not demand perfection. It is not a matter of having to ‘outrun the bear,’ but instead needing to ensure IoT device security is ‘running faster’ or better than that of the competition. Unfortunately, that is often easier said than done.

Preparing for a cyberattack

It is common knowledge that IoT devices are prime targets for hackers. They often lack rudimentary security measures and operate with out-of-date firmware. These vulnerabilities create a backdoor into the network that, when exploited, can be used to launch automated IoT botnet distributed denial-of-service (DDoS) attacks. That backdoor also gives hackers the ability to take control of an IoT device and force it to operate in an unintended way.

Imagine a hacker intentionally draining the batteries of IoT devices in a smart factory, causing an untold loss of revenue to a company. Or worse, what if the hacker takes control of a patient’s medical infusion pump, changing the amount of medication it dispenses? Without appropriate security measures in place, these scenarios could be all too real.

How do IoT device makers and network operators ensure their IoT devices and networks aren’t the lowest-hanging fruit when it comes to security? It all starts with having the right IoT testing, security and visibility infrastructure in place to protect both the IoT devices and the networks that support them. Having a solid security strategy and a plan for mitigating attacks is also critical.

On the network side, some of the best practices to consider when building that plan include:

Know your attacker. Hackers are creatures of habit. If an attack tactic works well, they will likely employ it repeatedly. Understanding the attacker, their patterns of attack and what to expect can prove critical to helping operators identify an attack in progress before it has a chance to get out of control.

Choose your weapons carefully. It’s not if a network attack will happen, but when. Being prepared with a DDoS mitigation tool or service is always a smart choice. But make that choice wisely by first checking the scale of attack the tool or service can stop, the level of service it can provide to critical infrastructure users and how many users are being affected while the attack is ongoing. Also check how often the tool or service falsely flags someone as an attacker.

Test your environment. Knowing how an attack will impact a network is essential to preventing it. This can be done by running attack simulations and defending against them with different solutions. The resulting information can prove especially useful for building a database of defense mechanisms for various scenarios. Plus, the more the network is tested in the lab, the fewer surprises the operator will encounter during a real attack.
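
As a toy example of what such lab testing might exercise, here is a sketch of a sliding-window flood detector; the window and threshold are arbitrary assumptions, and real DDoS mitigation operates at far larger scale with far richer signals:

```python
from collections import defaultdict, deque
import time

WINDOW_S = 10.0
THRESHOLD = 100     # assumed requests per window per source before flagging

recent = defaultdict(deque)   # source IP -> request timestamps in the window

def observe(src_ip, now=None):
    """Record one request; return True if the source now looks like a flood."""
    now = time.monotonic() if now is None else now
    q = recent[src_ip]
    q.append(now)
    while q and now - q[0] > WINDOW_S:   # drop events outside the window
        q.popleft()
    return len(q) > THRESHOLD

# Lab-style simulation: replay a burst and check that the detector trips.
assert any(observe("203.0.113.9", now=t * 0.01) for t in range(200))
```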

Never stop working to improve security. Hackers adapt quickly and that means operators need to as well. The only way to be prepared is by proactively and continually seeking out new weapons and security techniques to plug the gaps in an existing security strategy.

IoT devices and networks alike are increasingly prone to cyberattack. Hardening them to withstand the onslaught is a tedious and ongoing process, one that can be addressed with many different strategies and a range of tools and services. The key to making the right choice is to first thoroughly test your environment and uncover any device or network vulnerabilities. Only then can IoT device makers and network operators begin to make the best choices for ensuring IoT devices and networks are resilient to attack.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


November 25, 2019  3:27 PM

Overcome data overload and prevent IIoT project failure

Matt Schnugg Profile: Matt Schnugg
Data Analytics, Data infrastructure, data modeling, data overload, IIoT, IIoT analytics, IIoT design, IIoT strategy, Industrial IoT

More companies are embracing IIoT to drive better business outcomes, but they still haven’t made the most of the data that they’ve collected. As one might expect, IIoT produces massive amounts of data at record speed. It can be overwhelming, especially to industrial companies, which are often strong in their real-time operations but struggle to translate data produced from those operations into meaningful and systemic process improvements.

While digital technologies could play a key role in unlocking value in the business and enabling growth, 93% of utility executives say they are struggling to deliver the benefits of digital transformation, according to Accenture research. Furthermore, as is frequently the case in large industrial companies, different departments often have access to different data resources with little cross-visibility.

Data overload occurs because the parameters of the IIoT system, the type of data and the rate of change often shift faster than most companies can adapt, which can result in technical debt. There are quick fixes you can add to the system to accommodate the near-term needs of the data ecosystem, but that’s not how to build a system that lasts well into the future. To really make the most of IIoT systems, we need to rethink how to manage the entire ecosystem in a way that’s future-capable.

Converge data for a streamlined approach

Harboring multiple data environments with disparate perspectives is a core challenge to getting companies fully digitally transformed and is often a source of infrastructure cost overruns. The first step towards rectifying these issues is data convergence, a critical component in using data to drive better business decisions. Breadth of data provides much-needed context for key decisions or analyses based on multiple perspectives of an event. Depth of data reinforces those decisions with more examples of similar events.

Because of this, it’s imperative that an organization’s IIoT platform allows the analytical framework to both easily access representative data samples from a wide variety of data sources and be scalable enough to dive deeply once an analysis strategy is identified. However, most industrial organizations have not yet found the solution to making these systems work, and they often experience continual IIoT project failure.

Rethink data management

Despite our best efforts, data coming from millions of nodes can be incomplete and imprecise, and there is often significant transformation that needs to occur prior to this data being consistently used to empower key outcomes for the industrial customer. It’s important for companies to go back to the basics of data governance to ensure they’re not overloading their IIoT platform.

This includes two key components that are commonly associated with root causes of data overload: the data models and the infrastructure that supports them.

Data modeling. The first step to reducing the likelihood of failure is to have a good understanding of the ideal construct for your data. Often, this is called a canonical data model, which serves as the target end state for incoming source systems. This is typically achieved by defining metadata to organize and translate a nebulous system of scattered data into a more systematized construct. In many industrial environments, the data will be organized into multiple canonical models that can be mapped against each other, as opposed to a universal model that envelops all ingested data.

This federated strategy enables more rapid transformation of data drawn from disparate source systems into models that can be cross-referenced against each other with greater ease.
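
A minimal sketch of that federated mapping idea; the source schemas and canonical fields below are invented for illustration:

```python
# Hypothetical field names; the point is the federated mapping pattern.
CANONICAL_FIELDS = ("asset_id", "timestamp", "temp_c")

def from_scada(row):
    return {"asset_id": row["TAG"], "timestamp": row["TS"], "temp_c": row["VAL"]}

def from_building_mgmt(row):
    return {"asset_id": row["deviceId"],
            "timestamp": row["time"],
            "temp_c": (row["tempF"] - 32) / 1.8}   # normalize units on the way in

# Both sources now land in one comparable shape:
a = from_scada({"TAG": "oven-3", "TS": 1700000000, "VAL": 180.0})
b = from_building_mgmt({"deviceId": "hvac-1", "time": 1700000000, "tempF": 72.5})
assert set(a) == set(b) == set(CANONICAL_FIELDS)
```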

Another trick to reduce data overload is to down-sample incoming data for optimal performance. Not all data is created equal, and there is often a cost trade-off — in the form of dollars or time — in processing more data. Understanding the appropriate granularity of data and aligning it with the right storage and orchestration scheme will help reduce compute cycles and cost.
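
A simple sketch of such a down-sampling step, averaging raw readings into fixed buckets before storage; the one-minute bucket size is an assumption to tune per signal:

```python
def downsample(samples, bucket_s=60):
    """Average raw (timestamp, value) points into fixed time buckets."""
    buckets = {}
    for ts, value in samples:
        buckets.setdefault(int(ts // bucket_s), []).append(value)
    return [(b * bucket_s, sum(v) / len(v)) for b, v in sorted(buckets.items())]

raw = [(t, 100 + (t % 7)) for t in range(300)]   # 300 one-second readings
print(downsample(raw))                           # 5 one-minute averages instead
```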

Data infrastructure. From a data management perspective, it’s important to have a solution for storage, management and visualization of data that is not just interconnected and scalable, but also appropriate for the needs of the end user. Mapping an infrastructure that is tailored to the unique qualities of your data — its incoming velocity, the amount and the sophistication of the workloads — as well as what decisions the data will need to support will help you decide how much compute is done in the cloud, at the edge or on premises. Considerations also need to be made for the appropriate level of security and privacy for both your end users and the systems that are collecting your data.

Final thoughts

In both the modeling and infrastructure scenarios, understanding your domain and the key outcomes you want will heavily influence your strategy. These outcomes need to be defined by your end users in a way that positively influences their ability to do their jobs. After all, that’s the primary impetus for digital transformation. Industrial companies whose data is converged, modeled and orchestrated with these end user-driven outcomes in mind are the ones that most rapidly achieve value in IIoT projects.

A successful IIoT project can save significant money by increasing productivity, reducing waste and optimizing key network operations or maintenance processes. But remember, the state of your data is the basis of everything. Deploying advanced analytics and putting your data to work creates an intelligent system that operates on a strong foundation and an agile, repeatable methodology, helping you avoid IIoT project failure.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


November 22, 2019  3:22 PM

The ten things that should happen in 2020

Mark Troester Profile: Mark Troester
app development, Artificial intelligence, Internet of Things, iot, IoT analytics, IoT and AI, IoT trends, Machine learning

This is the first in a two-part series.

DesignOps, AppDev, digital innovation, AI, machine learning, IoT, bots and mobility are leading technologies that have dominated the IT industry, and it’s that time of year to predict what will happen in the year to come with these technologies. Or, at the very least, what I think should happen. These predictions might not come to pass as quickly as I’d like, but I’m certain that forward-thinking organizations will make considerable inroads into many of these areas of expertise.

DesignOps collaboration will gain traction

The Nielsen Norman Group defines DesignOps as “the orchestration and optimization of people, processes, and craft in order to amplify design’s value and impact at scale.”

DesignOps is more important than ever given the heightened user expectations set by the world’s digital giants and social media behemoths. Consumers now routinely include the quality of the digital experience in their buying criteria, and one could argue that user expectations are approaching the point where they’re looking to buy an experience versus buying a product or service.

DesignOps has historically managed the design process, but it must evolve to accommodate those heightened user expectations in a way that facilitates designer and developer collaboration; even the most creative developer isn’t skilled at design, and poorly designed experiences will fail. The DesignOps evolution should include:

  • Inserting the designer earlier in the development process and keeping them there throughout testing and production feedback
  • Implementing technology that allows the designer to use their tool of choice
  • Taking a design-to-code approach that generates the design automatically with round trip collaboration for developer changes

Digital transformation will continue to evolve into digital innovation

Digital transformation isn’t going away, and never will as long as there are ongoing opportunities for transformation. But it’s important to ignore market noise and ensure that your organization has its own definition of digital transformation, as well as its own set of goals and guiding principles.

For 2020, many organizations will augment or expand digital transformation efforts into digital innovation to drive business results. For example, Bain & Company found that revenues for digital leaders grew 14% over three years, “more than doubling the performance of the digital laggards in their industries. Profitability followed a similar pattern—83 percent of the leaders increased margins over that period while less than half of the industry laggards did so.”

Making digital innovation part of the fabric doesn’t necessarily call for separate teams or big investments. It starts with developing a culture of innovation by grounding teams in objectives and their accountability for them, as well as giving them broad discretion to execute. Try hackathons that aren’t limited to writing software: think about how you can improve business processes, experiment with different go-to-market (GTM) approaches and incorporate new sales and marketing efforts.

Organizations will better align application development and web development efforts

We tend to categorize digital efforts relating to customer experience and application development separately. From the customer experience side, we think about content management systems (CMS) or web content management systems, which are now being upleveled to digital experience platforms (DXPs). From the AppDev side, we may think about PaaS or SaaS, which is being upleveled to multiexperience development platforms (MXDPs). But there are many related principles, including multichannel user experience and needed integration with backend data, apps and authentication mechanisms.

For those on this alignment journey, industry analyst advice can be helpful, but it can also encourage the creation of silos that will hinder your business. It is better to coordinate AppDev and customer experience efforts by adopting flexible platforms and technology components to meet your specific requirements. While there’s no such thing as a single technology for all requirements, these technologies will become more interchangeable and agile with open standards interfaces.

The payoff is that content being managed by a CMS can be exposed to different digital touchpoints versus being completely web centric. For example, think of an integrated chatbot experience, or training content exposed via an augmented reality assistant.

Combining efforts will not only make both your customer experiences and AppDev efforts richer, it will also facilitate sharing across the organization for additional business value.

Increased machine learning and AI projects will begin production

Going from lab to production with machine learning and AI projects is challenging, even for organizations with a full contingent of data scientists and the data analysts required to prepare the data.

But there can be issues with data, such as difficulties in running algorithm experiments, assessing the results accurately, moving things into production, integrating results with operational systems, change management, keeping up with changes, establishing the compute infrastructure required to process at scale down to the individual asset level and more.

A lot of the difficulty comes from the vast number of manual processes. The good news is that there are several cloud-accessible packaged services and automated data science tools and platforms that fit more elaborate use cases where custom models are needed. They don’t eliminate the need for data scientists, but they do make them more productive and lower the barrier to entry so that other roles, such as app developers and business analysts, can participate effectively in the process.

Another encouraging sign is the acceptance and support these projects are receiving from AppDev. Many colleges are making data science a part of the computer science curriculum, and forward-thinking AppDev leaders are looking across their offerings to determine where machine learning and AI can play a role.

Automated conversational interfaces will start delighting users

We’ve all done our time dealing with automated phone messages where escape by pushing zero to get a human is impossible. And now automated chat is replicating this infuriating experience. Why?

In many cases, it’s because the chatbot has been designed and implemented with decision-tree logic. A developer working with a business expert tries to code every possible conversation path, which can lead to a stilted conversation and a possible dead end or an inaccurate conclusion.

The good news is that an AI-driven chatbot can solve many of these problems:

  • It reduces the amount of development required to build a chatbot and allows business experts to be more involved in design and implementation
  • It can be trained much like you train a person, making it ideal for transactional chat interfaces that can schedule appointments, submit claims and more
  • It delivers a more natural, intuitive conversation as it learns from experience and is designed for specific business purposes
  • It can be integrated into multiple interfaces, such as websites, mobile applications and kiosks

Finally, while we think about chat as a frontend construct, the accessibility of backend services allows consistency across different digital channels; enables data and application integration, as well as enterprise and social authentication that the chatbot can use; and can be extended to support other forms of conversational engagement, such as home devices.
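
To make the decision-tree contrast concrete, here is a toy sketch that scores intents by keyword overlap instead of walking a fixed conversation path; the intents and keywords are invented, and a real AI-driven chatbot would use a trained natural language model rather than keyword sets:

```python
# Toy intent matcher: route by meaning signals instead of a fixed script.
INTENTS = {
    "book_time_off": {"vacation", "pto", "time", "off", "leave"},
    "submit_claim": {"claim", "expense", "reimburse", "receipt"},
    "book_travel": {"flight", "hotel", "travel", "trip"},
}

def classify(utterance):
    words = set(utterance.lower().split())
    scores = {name: len(words & kw) for name, kw in INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "fallback_to_human"   # graceful dead end

print(classify("I need to take some time off next week"))   # book_time_off
print(classify("where is my parking spot"))                  # fallback_to_human
```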

In the next installment, we’ll look at 2020 trends for modernization, employee and partner user experience expectations, breaking down tech silos, rethinking the edge and business and tech alignment.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


November 22, 2019  12:27 PM

The secondary network: What is it and how do we manage it?

Richard Beeson Profile: Richard Beeson
IIoT, IIoT connectivity, IIoT design, IIoT network, IIoT strategy, Industrial IoT, IoT benefits, IoT connectivity, IoT sensors, secondary network

If you’re in IT, you’ve probably never heard of the secondary network. If you’re in OT, it’s most likely one of your biggest concerns.

Secondary network devices reside physically within the OT realm, but they don’t exist on the primary OT network, creating the potential for data silos. Many of these devices exist to feed information to IT applications, yet IT is often blocked from the OT realm by firewalls, diodes and even razor wire. One of the biggest stumbling blocks for IoT remains breaking down the cultural barriers between OT and IT.

Understand the secondary network

The secondary network is the name for the collection of sensors and devices deployed within the demilitarized zone of industrial operations, a physical or logical subnetwork that contains and exposes an organization’s external-facing services to an untrusted network. These assets aren’t connected to SCADA systems, which serve as the nervous system of industrial environments by monitoring and controlling the robots, chemical reactors and critical systems to ensure productivity and prevent accidents.

Unfortunately, linking assets to SCADA networks requires highly reliable, redundant and fail-safe implementations that are neither trivial nor cheap. As a result, companies often circumscribe the number of devices they link to them. For example, a food manufacturer might link its ovens, because cooking time and temperature are critical to safety and quality, but not its mixers, which can be run manually or with a basic timer. In manufacturing facilities, robots will be linked to SCADA, but the HVAC system might be linked to a separate siloed system or not connected at all.

A sensor that monitors water flow at a remote pump owned by a water utility is a secondary network device. So is a Raspberry Pi board that gets moved from part to part to study power consumption.

The secondary network continues to grow

Many pieces of industrial equipment that you’d think should be tied to SCADA are also part of the secondary network because they were put in place before the manufacturer adopted a more comprehensive automation plan. In 2018, Kevin Prouty of IDC surveyed manufacturers and found that 22.2% of OT equipment was still not connected to a network, down from the 40.1% that wasn’t connected in 2016.

Calling the secondary network a network can be misleading because the actual network doesn’t yet exist. It’s just a dotted line where people think it should exist. Innovation, plummeting prices and new services are driving demand and deployment of these networks.

For example, Toyota Motor Europe is looking to better harness the ambient environmental data inside its production facilities to help the company achieve its goal of zero carbon contribution by 2050. In addition, startups like Petasense have developed services for real-time vibration analysis using signals captured by moveable, relatively inexpensive sensors.

Wind developers are putting gateways and sensors on bearings and other components in their turbines to boost capacity factor and lower repair costs. Some companies have projected they can cut the cost of particular repairs by up to 90% by monitoring hydraulic fluid pressure or lubrication remotely with new sensors, or save 20 million euros a year at offshore wind farms by cutting boat trips in half.

Ten years ago, the primary technology for learning about power outages was the phone. Utilities would first learn about an outage when a customer called to complain. But sensors have allowed companies such as DTE Energy to reduce power outages by 3 minutes per year per customer. DTE Energy has attached 3,000 sensors to its transmission lines to help it more rapidly detect power outages.

With 2.2 million customers, that means DTE Energy avoids 6.6 million customer minutes — cumulatively about 12.5 years — of darkness every year. By linking these through a secondary network and avoiding SCADA integration, DTE Energy believes it avoided $25 million in implementation costs.

To accelerate the adoption of these sensors and devices at the edge, open source efforts such as the Linux Foundation’s LF Edge have emerged and gained significant traction since the beginning of 2019. In addition, Dianomic recently contributed Fledge to LF Edge to accelerate further growth in the secondary network and in the smart devices serving industry.

Communication around secondary networks is key

Secondary networks are sometimes referred to as a control and operations companion network. To some, the term secondary implies that the network is secondary in importance. Calling something secondary could encourage more data silos, yet the security requirements can be just as demanding. Some OT engineers call the secondary network the IoT network, but this too creates confusion: to IT, everything inside the OT zone belongs to an IoT network.

One of the biggest stumbling blocks for IoT remains breaking down the cultural barriers between OT and IT. That means adopting a mutual language and view of the universe.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


November 21, 2019  11:00 AM

The future of physical retail: Connected devices and connected shoppers

Mike Leibovitz Profile: Mike Leibovitz
commercial IoT, connected devices, Internet of Things, iot, IoT analytics, IoT benefits, IoT connectivity, IoT devices, IoT in retail, IoT platform, IoT software, Machine learning, Network security, retail, Wi-Fi network

U.S. consumers are expected to spend more than $1 trillion this holiday season, according to eMarketer. Ahead of the Q4 surge, brick-and-mortar retailers are experimenting with IoT technology, from smart mirrors and tablets in fitting rooms to VR headsets and beacons on the retail floor, to transform the shopping experience and drive more offline sales.

These cutting-edge IoT technologies are defining the stores of the future. But what’s often overlooked is that these novel, customer-facing innovations all depend on foundational backend systems.

Putting the smart in smart fitting rooms

Smart and connected fitting rooms are on the rise in major retailers across the country. For example, Ralph Lauren outfitted its flagship store in New York with smart-mirror fitting rooms so shoppers can request different sizes and colors, view product information and even adjust the room lighting all through the connected mirror.

AR and VR technologies also introduce new possibilities for consumers to interact with products and companies in personalized and exciting ways. Timberland deployed AR mirrors in its storefronts to attract and engage foot traffic. The smart mirror enables passersby to virtually “try on” various outfits and even share the experience on social media.

These examples illustrate the rise of experiential retail, and both smart fitting rooms and AR devices depend on a fast, reliable internet connection in order to function properly. The storefront magic mirror loses much of its appeal if it requires shoppers to wait several minutes for the images to load.

Highly available, flexible and scalable Wi-Fi connectivity is imperative for supporting these bandwidth intensive IoT, VR and AR applications. Retailers can use automated Wi-Fi radio frequency management to proactively identify and mitigate issues before they occur, which ensures a high quality of service for shoppers. As a result, consumers will experience minimal latency and reliable connectivity when using the experiential technology.

Making payments mobile and secure

Consumers expect the in-store checkout experience to be as seamless and efficient as shopping online. Recognizing this, many retailers are deploying IoT payment devices such as tablets, smartphones and smart carts to expedite the checkout process for consumers. In turn, this can also help retailers streamline their in-store operations and resources.

Automated checkout can reduce cashier staff requirements by up to 75%, resulting in savings of $150 billion to $380 billion per year by 2025, according to McKinsey. One critical concern that could jeopardize the successful execution of a mobile payment strategy is security. Ninety percent of consumers lack confidence in IoT device security, including point-of-sale (POS) devices, according to Gemalto. Verizon’s 2019 Data Breach Investigations Report found that while breaches involving POS have declined in recent years, attacks against e-commerce payment applications are on the rise, accounting for 81% of breaches.

As the use of mobile-based payments and connected devices grows, so too does the attack surface. IT managers must ensure their in-store network includes a mix of threat detection, protection and surveillance to prevent the exposure of confidential consumer and business data.

Furthermore, most retail IT teams are small with limited resources, and they are responsible for managing multiple retail locations across different regions. A network equipped with machine learning and proactive AI functionality to identify anomalies and potential threats can help augment human intelligence and reduce the time spent manually monitoring for security vulnerabilities.

Connecting to the connected customer

There’s a lot of hype around net new IoT devices that retailers are deploying in their stores. But it’s not just businesses that are more connected; it’s customers, too. Buyers today increasingly shop with mobile phones, tablets, smart watches and other wearables, putting real-time data on discounts and competitive pricing at their fingertips.

According to Salesforce research, 71% of shoppers say they use their mobile devices in stores, and eMarketer reports that 69% of consumers look for reviews on their phone in-store before approaching a retail associate. That leaves retailers a narrow window of opportunity to meaningfully engage shoppers who otherwise may opt to purchase from a competitor.

The good news is this proliferation of devices on the network also creates a proliferation of valuable consumer data, which retailers can leverage to deliver more personalized offerings and customer experiences. For example, Macy’s uses beacon technology so that when a customer opens the Macy’s application while shopping, the application sends targeted promotions and contextual information based on where the customer is in the store. In many ways, this allows retailers to meet customers where they are, and to curate more touchpoints throughout the physical and digital shopping experience to achieve unified retail commerce.

It’s important to remember that the network edge is now a cornerstone of brick and mortar retail stores. It’s the point where an organization and its customers meet; it’s where users engage, mobile transactions occur, and IoT devices connect and are managed. Retailers can apply analytics to the data coming over in-store wireless networks to understand customers’ preferences and make offers to them that are highly contextual and catered to their specific needs. Analytics can also be used to inform location-based services, RFID, and electronic shelf labeling to reduce friction in a shopping journey and create impactful experiences.

Modernizing brick and mortar by connecting the dots

As we head into the busiest shopping season of the year, it is an opportune time for retailers to evaluate how their in-store technology elevates or hinders the end-to-end customer journey. Whether it’s high-tech experiential IoT devices, mobile POS systems, or your customer’s smartphone, remember that the network is critical to a seamless in-store experience.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


November 20, 2019  4:35 PM

Four ways IoT will change the way we work in 2020

Steve Wilson Profile: Steve Wilson
augmented reality, Internet of Things, iot, IoT development, IoT integration, IoT use cases, Virtual assistant, workplace IoT

Did you ever think you’d be talking to your phone like it was a friend, asking it where to find good tapas? Or that your refrigerator would be able to talk to you and let you know you’re low on milk? We used to use watches to tell time. Now, we rely on them to track our heart condition, get directions, send emails and texts and talk to our friends. The way IoT has transformed how we interact with our personal devices and how we live would have seemed like science fiction just a few years ago. In 2020, IoT is going to revolutionize the way we work. Here are four things you can expect to see:

Work will be less of a chore

They wouldn’t call it work if it was fun. Or would they? Technology has made work more complicated and frustrating than it needs to be. Every year, companies spend billions on applications to streamline functions and processes and make work simpler. But they’ve put too many in place that are too hard to use, which has only made things worse. The devices we rely on at home to manage our lives know our preferences and modes of operating. They make it super easy for us to get stuff done. “Alexa, call me an Uber.” At work, company-issued technology seems to just slow us down. On average, it takes four or more applications just to execute a single business process!

In 2020, the same technologies that have made our personal lives so easy will become pervasive in the office and turn the employee experience on its head. Tactical busy work will take a back seat to the strategic, value-creating stuff we want and are paid to do because devices will automatically deliver the insights we need when and where we need them and in many cases, just do the work for us.

Virtual assistants will ease the pain

Statistics show the average employee spends about 65% of their time on busy work and in meetings, and 20% searching for information. That leaves just 15% — or roughly 1.2 hours a day — for meaningful work. Virtual assistants that know who we are, what we do and how we like to do it will give us this much-needed time back. We won’t need to go through the painful process of digging through enterprise apps to execute simple workflows, like requesting time off or booking travel. We’ll just ask our virtual assistant to do it. Beyond that, sometimes we won’t even need to ask. Sales opportunities will automatically be moved out of the pipeline and into the system of record as soon as you close them. Recordings of virtual meetings will automatically be sent to participants as soon as they are over.

Technology will shadow us

The days of lugging laptops, tablets and mobile devices everywhere may finally be over, as technology will follow us. Digital workspaces will deliver the apps and information we need anywhere, anytime, to any device. So, when we walk into a conference room for a meeting, we won’t have to dial in attendees on the Polycom or fire up a WebEx or Zoom session. The IoT-enabled workspace will already know who we are, the meeting we are there to attend and the presentation we need, and it will just get things started using the equipment that’s already there.

Augmented reality will redefine collaboration

Augmented reality used to be all about fun and games, but software developers have gotten serious, and augmented reality is set to transform the way we collaborate at work. We’ll see and interact with colleagues and customers around the world in new ways through smart glasses and AirPods; break the isolation of working remotely and experience corporate meetings as if we were there in person; test new products as if we were in the field; and learn new skills in digitally enhanced classrooms. We’ll receive information in context and process it more quickly than ever before to make better, more informed decisions.

There are still people out there who think IoT is just for toys. And some of them thought the internet was just a consumer fad, too. IoT means business. And in 2020, we’ll see proof.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

