IoT Agenda


April 9, 2019  11:51 AM

Break out of IoT proof-of-concept purgatory

Lou Lutostanski
Internet of Things, iot, IoT business model, IoT data, IoT partners, IoT partnerships, IoT pilot, IoT pilots, IoT proof of concept, IoT strategy, IoT use case, poc

Businesses are spending $745 billion worldwide on IoT hardware and software in 2019 alone. Yet, three out of every four IoT implementations are failing. Why?

One big reason: Leaders are failing to go all in.

To make IoT successful, you need to transform not just some hardware and software, but the way your business works. These dynamic deployments require an entirely new approach, far beyond the traditional push to get new business applications off the page and into production.

If the right steps aren’t taken in the beginning (say, you don’t think far enough beyond the IT infrastructure), you end up in limbo: caught between the dream of what IoT could do for your business and the reality of today’s ROI. That spot is called proof-of-concept (POC) purgatory.

Sound familiar? Here are five signs you might be in IoT POC purgatory — and tips on how to escape it.

1. You have a lot of data … and not much else

There’s no surer sign of POC purgatory than an IoT technology that produces only dashboards. Making data visible is an effort in futility if you aren’t applying AI to make it smart — to truly drive insights in your organization. To do this, though, you need a clear and well-communicated business objective from the earliest stages of your IoT project.

That objective — whether it’s operational efficiency, better customer service or bottom-line revenue generation — allows you to use the right technology to develop actionable insights from your data. That’s when a mixture of cloud, customer, employee, public and real-time data sitting in a repository meets the analytics that point to actions that can change your business.

Without a clear business objective, the lights of IoT might be on, but no one is truly home.

2. You keep getting pushback from unexpected places

All the new technologies within IoT mean one thing: people — and lots of them.

It’s surprising how many stakeholders come out of the woodwork along the path to implementation. These stakeholders can include project managers, system integrators, operations specialists, installers and business stakeholders from HR, marketing, sales or customer service — all of whom will have questions, comments and critiques.

Of course, looping in all areas of the business is crucial to get a proof of concept going. To keep that momentum up and avoid proof-of-concept purgatory, however, you need solid change management plans that push IoT into your business. Customizing communications to each stakeholder group will ensure they understand exactly what’s in it for them when your implementation succeeds.

3. Your teams aren’t speaking the same language

Once you have the right people in the game, you have to make sure everyone is working off the same playbook.

For example, IT and operational technology (OT) teams have long had their separate realms to play in. This siloed approach stands in the way of not only a smooth deployment, but also a scalable one as additional features get added and the system matures. IoT requires the skills of both teams to succeed. Not only do these teams need to work together, they also need to trust each other. IT professionals need to trust OT devices to connect to their carefully constructed networks, while OT leaders need to feel comfortable with a new security stack interacting with their hardware.

This collaboration is just the start of the people challenge: It can take up to 10 partners to get an IoT system to market, not including your internal stakeholders, so it’s crucial that everyone’s moving in lockstep.

4. You can’t get the CEO on board

Big initiatives take big support. Frequently, we see businesses get stuck when line of business managers love the idea, but can’t get the C-suite to sign on the dotted line and put it into production. Here’s where pilots drag on and leaders become disillusioned with the project.

To avoid spinning your wheels here, it’s crucial to ensure that all the appropriate C-suite stakeholders, up to and including the CEO, understand what your IoT deployment will give to the business.

Construct a simple product roadmap that takes busy business leaders from robotic arms and sensors to factory floor insights to what really matters: dollars-and-cents impact on the bottom line.

5. You want to be in IoT, but you don’t know why

A good use case is like a lighthouse. With it, it’s easy to keep the boat steered in the right direction. Without it, it’s easy to get lost. Often, business leaders read articles about the opportunity and know only that they need to jump into a boat. But everything above — the alignment of teams, the insights driven from data and operationalizing of the system — is driven by one major thing: the use case.

The lack of a clear use case is by far the most common reason organizations end up in POC purgatory. Considering IoT implementations can take a year or more to get off the ground, it is mission-critical to have clarity on what the deployment is trying to achieve from the very start.

If this sounds like you, there’s a way out. Go all in. Align around your use case first. Align your in-house capabilities and external collaboration around the technologies best suited for your objective. Align your data intelligence to ensure it is delivering insights that transform your business. Align your employees around the business transformation you are trying to complete — and retrain and reskill them continuously as the system evolves.

When you align internal stakeholders to the business objective and ensure that any external partner is functioning as an extension of your team, you can truly go from stuck in purgatory to full production and tangible payoff for your business.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

April 8, 2019  12:36 PM

Protecting the exploding attack surface: A blueprint for government agencies

Reggie Best
Attack surface, cyber-situational awareness, government security, intent-based network security, Internet of Things, iot, IoT compliance, IoT devices, IoT risks, iot security, IoT threats, Network security, securing IoT

No industry possesses more confidential, sensitive and proprietary information than government agencies. From citizen data and employee files to military plans and details about national laws and regulations, federal, state and local government agencies are a gold mine for nation-states and other criminal groups.

This is nothing new, of course. Over the past few years, we’ve seen no shortage of government-targeted attacks. What is fairly new, however, is the rapidly expanding attack surface, which is giving attackers more ways than ever to infiltrate government networks and get their hands on the nation’s most sensitive data.

IPv6 explodes the attack surface

In 2012, the U.S. government mandated that all government agencies transition to IPv6, which was designed to overcome the problem of IPv4 address exhaustion. With IPv6, there are more than enough IP addresses to accommodate every connected device, which, in the age of IoT, cloud computing and digital transformation, is a necessity.

From a government perspective, this transition to IPv6 — which some agencies are just beginning — along with the accelerating rate of cloud adoption means that almost everything — from military weaponry to building management control systems to voting machines and census-collection tablets — is IP-enabled and part of the network ecosystem. And while this is driving greater effectiveness and efficiencies from an operations standpoint, it’s also introducing tremendous security risk.

First, more network devices mean more endpoints susceptible to attack. Second, thanks to cloud computing and digital transformation, applications and systems are deployed, changed and removed at a faster rate than ever before, leaving security teams constantly trying to understand the state of their network infrastructure. And last but certainly not least, security teams are struggling to bring all network assets under the correct security policy to control access and ensure a strong security and compliance posture.

In today’s “hybrid agency,” where IT infrastructures are massively distributed and constantly changing, is it possible to really know what’s on the network, maintain proper policy hygiene and attain continuous security that moves at the speed of innovation? The answer is yes, thanks in large part to cyber-situational awareness and intent-based network security.

Achieving cyber-situational awareness

Agencies must find a way to monitor network assets and activity across physical, virtual and cloud environments. This means achieving real-time visibility into all endpoints and resources across all computing infrastructures; understanding how those endpoints are connected to the agency, the internet and each other; and identifying suspicious traffic, potential leak paths to the internet, anomalous activity, unknown rogue devices and shadow IT infrastructure.

In other words, IT security teams must be able to identify threats and vulnerabilities to the infrastructure as they emerge and change, so they can develop effective incident response and risk mitigation strategies. Agencies that use cyber-situational awareness in this way have the real-time and accurate network visibility needed to properly protect their networks, critical data and our nation’s infrastructure. Following are five tips to achieve this ideal state:

  1. Validate the network IP address space. Understand the scope of IP address space in use and visualize the network topology. Instead of working from a set of known addresses that you think encompass the entire organization, verify that there are no unknowns.
  2. Determine the network edge. Understand the boundary of the network under management.
  3. Conduct endpoint census. Understand the presence of all devices on the network infrastructure, including traditional IP-connected devices, such as routers, gateways, firewalls, printers, PCs, Macs, iPhones, etc., and non-traditional IP-connected devices, including medical equipment, security cameras, industrial control systems, etc. (a minimal discovery sketch follows this list).
  4. Conduct endpoint identification. Assess the nature of devices on the network, including type, operating system and model.
  5. Identify network vulnerabilities. Evaluate and comprehend network anomalies, such as unknown devices, unmanaged address space, leak paths, etc., for remediation.
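
The census and address-space tips in particular lend themselves to simple tooling. Below is a minimal discovery sketch in Python, assuming a small IPv4 segment, the scapy library and root privileges; it is illustrative only, and a real census would also have to cover IPv6 address space, passive discovery and cloud inventories.

    from scapy.all import ARP, Ether, srp  # pip install scapy; needs root/admin rights

    def arp_census(cidr, timeout=2):
        """Broadcast an ARP request across the segment and collect every responder."""
        answered, _ = srp(
            Ether(dst="ff:ff:ff:ff:ff:ff") / ARP(pdst=cidr),
            timeout=timeout,
            verbose=False,
        )
        # Each answer is a live endpoint: its IP address and its hardware (MAC) address.
        return [(reply.psrc, reply.hwsrc) for _, reply in answered]

    if __name__ == "__main__":
        for ip, mac in arp_census("192.168.1.0/24"):  # illustrative subnet
            print(ip, mac)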

Integrating intent-based network security

Once you have a real-time, holistic understanding of what’s on your network, you can then implement proper policies and rules. Until recently, IT security teams manually wrote rules for every enforcement point. In today’s complex, dynamic hybrid environments, manual policy management processes just aren’t sustainable — not to mention, they’re costly, burdensome and prone to error.

Intent-based network security provides a desperately needed shift in global security policy management — one that automates policy orchestration and allows agencies to take advantage of innovation without slowing down development processes or introducing enterprise risk.

At a high level, intent-based network security unites business, DevOps, security and compliance teams by enabling them to collaborate on a global security policy. Non-security personnel declare the business intent of applications, security personnel define the accompanying security and compliance intent, and the intents are then aligned so policy changes can be fully automated and meet the needs of all parties. Manual rules-writing becomes a thing of the past, and all assets across the hybrid agency are brought under the proper security policies.
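
As a rough illustration of that workflow, the sketch below shows how a declared business intent and a declared security intent might be compiled into concrete allow rules, with everything else implicitly denied. The class names, fields and zone labels are hypothetical, not any vendor’s API.

    from dataclasses import dataclass

    @dataclass
    class BusinessIntent:        # declared by the application owner
        app: str
        needs: list              # (service, port) pairs the app must reach

    @dataclass
    class SecurityIntent:        # declared by the security/compliance team
        app: str
        allowed_zones: set
        require_tls: bool = True

    def compile_rules(biz, sec, service_zones):
        """Turn the two intents into enforcement-point rules; conflicts surface immediately."""
        rules = []
        for service, port in biz.needs:
            if service_zones[service] not in sec.allowed_zones:
                raise ValueError(f"{biz.app}: business intent for {service} violates security intent")
            rules.append({"src": biz.app, "dst": service, "port": port,
                          "tls": sec.require_tls, "action": "allow"})
        return rules

    rules = compile_rules(
        BusinessIntent("benefits-portal", [("hr-db", 5432)]),
        SecurityIntent("benefits-portal", allowed_zones={"internal-data"}),
        service_zones={"hr-db": "internal-data"},
    )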

The powerful combination of cyber-situational awareness and intent-based network security enables agencies to use next-gen technologies and processes, such as IoT and cloud computing, without introducing security and compliance risk. IT security teams can successfully protect their network assets regardless of how many there are, where they reside or how they change. And the size of the attack surface no longer matters because security is finally able to adapt at the speed of change — and in today’s world, that’s a blueprint for success.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


April 8, 2019  10:55 AM

IoT for smarter cities: Lessons learned from around the globe

Tom Amburgey
Internet of Things, iot, IoT applications, IoT use cases, Smart cities, smart city, smart city applications

Often referred to by industry experts as the next Industrial Revolution, the internet of things is radically changing businesses, consumers and governments. According to a recent IDC report, global spending on smart city initiatives is expected to grow to $158 billion by 2022 as cities continue to invest in the hardware, software, services and connectivity that enable IoT capabilities.

Many private sector industries have already implemented and capitalized on the huge potential of IoT technologies, but the public sector has had its share of early adopters. Facing rising citizen expectations and demands for better public engagement, many cities and municipalities have recently introduced disruptive IoT projects to improve their vital services and their citizens’ quality of life.

Not just limited to back-office applications, IoT projects are enabling cities to better adapt and respond to changing conditions — improving critical services like infrastructure and public safety. These cities can now deploy resources more effectively, increase city sustainability and preserve energy. Let’s look around the globe at two successful cities that are using IoT to improve back-office processes and engage people with a citizen-centric application for water and traffic management.

Smarter water management: More accurate tracking

According to a recent study by IoT Analytics, Europe tops the list of IoT smart city projects, capturing nearly 50% of the global project base. For many years, smart city initiatives have been a priority for European policy leaders and companies. In 2011, the European Smart Cities Initiative was created to support smart city projects with a special focus on reducing energy consumption in Europe.

One result of this initiative is found in Castellón, Spain, where IoT technology is being used to accurately track and control water management. From an initial pilot of 600 smart water meters to the current implementation of 30,000, the smart water platform provides the city with real-time data on its water resources.

This innovative system includes long-range and low-power capabilities to collect and communicate household water consumption information, allowing the city to accurately track and control water management. Additionally, this initiative has enabled Castellón to quickly detect leaks, eliminate breakdowns and easily manipulate the water supply network in real time, preventing loss of service and costly repairs.
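
The article doesn’t describe Castellón’s detection logic, but even a crude heuristic over hourly meter readings shows the idea: a meter whose flow never drops to zero overnight probably has a leak somewhere downstream. A minimal sketch, with illustrative thresholds:

    def likely_leak(hourly_liters, overnight_hours=range(1, 5), min_flow_l=1.0):
        """hourly_liters: 24 readings for one smart meter, indexed by hour of day.
        Flag the meter if consumption never pauses during the overnight window."""
        return all(hourly_liters[h] > min_flow_l for h in overnight_hours)

    readings = [3, 2, 2, 2, 2, 4, 9, 15, 8, 5, 4, 6, 7, 5, 4, 3, 6, 9, 12, 10, 8, 6, 4, 3]
    print(likely_leak(readings))  # True: water keeps flowing even at 3 a.m.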

The results have ensured that the city continues to provide ample water to its citizens while reducing unnecessary waste.

Knowledgeable command centers: Better urban mobility

While Europe accounts for 50% of smart city initiatives, the focus on the Asia-Pacific region continues to grow. Although the region accounts for just 15% of current smart city projects, studies suggest this trend will quickly change, with more than 50% of smart cities expected to exist in China by 2025, creating an economic impact of over $300 billion.

In one example, Singapore is working to improve urban mobility by introducing smarter technology to make roads safer and keep traffic flowing smoothly. Because traffic is a growing concern for many metropolitan areas, including Singapore, the city-state plans to feed traffic data into a centralized operations control center, which will aggregate the data and provide real-time traffic information to the public. Equipped with live traffic information on mobile phones, web portals or navigation devices, motorists will have instant insights into traffic incidents and congested routes, allowing them to identify alternative routes.

This initiative is designed to reduce the number of motorists in congested areas to enhance safety on major roads and expressways. Additionally, city officials gain the necessary data required to adjust traffic light systems based on shifting traffic conditions.

The U.S. is poised to learn from global successes

While governments around the globe are focused on increasing productivity, reducing costs and improving their citizens’ quality of life, the U.S. has traditionally been lagging in developing innovative and disruptive smart city technologies. This trend is poised to change, however, as IDC reported that the U.S. is expected to account for one-third of global spending on smart city initiatives in 2019.

By carefully studying the current slate of global smart city projects, governments and municipalities in the U.S. are better prepared to bring these successes to their constituents. This enables us all to look toward a future in which many U.S. cities will be equipped with innovative technologies that alter the way constituents interact with their cities. From pedestrian detection at intersections to automated dispatching systems that vastly reduce response times to integrated asset management tools that drive preventive city maintenance, the city of tomorrow will soon be here.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


April 4, 2019  11:07 AM

Securing the ‘M’ in IoMT to ensure patient safety

Leon Lerman
Connected Health, connected healthcare, Healthcare, Internet of Things, IoHT, IoMT, iot, IoT cybersecurity, IoT devices, iot privacy, iot security, patient data security

The market for the internet of medical things, or IoMT, which includes medical devices, servers and applications that connect to computer networks, is experiencing explosive growth. Hundreds of thousands of connected medical devices, such as patient monitors, IV pumps, MRI machines, infusion pumps and ventilators, are linked to hospital networks to improve the quality and efficiency of delivering medical care. Researchers expect the market to reach $136.8 billion worldwide by 2021 and have suggested that healthcare will become the biggest market for IoT by 2020, with 40% of all IoT devices designed for healthcare purposes.

Although all of these innovations can improve the quality of in-patient care, their ability to communicate over internal computer networks has introduced new vulnerabilities to cyberattacks. Hackers are increasingly targeting hospitals because of the high price they can command for sensitive patient data and the recent success of ransomware attacks.

The growing security risk

Medical devices need an extra level of protection because if an attack causes them to malfunction, patient lives are at risk.

For example, if a cyberattack tampers with smart infusion pumps that enable hospital staff to dispense and change medications automatically through the wireless network, dosages could be changed with disastrous results. Or an interruption in the transmission of readings from sensors embedded in patients’ beds could prevent healthcare providers from being alerted to an urgent need for patient care.

A cybersecurity technology for IoMT needs to ensure that, in the event of an attack, medical devices can continue to provide patient care. Such a technology must prevent malware and malicious activity from interfering with normal operations.

Special expertise required to secure IoMT

While generic IoT security systems secure all the endpoints the same way, securing medical devices requires knowing each device’s role in the various clinical workflows to accurately assess the impact of a cyberattack on patient safety. The “M” in IoMT makes all the difference.

Security professionals are used to protecting email servers, different databases, laptops and other mobile devices, but when it comes to securing medical devices they lack the visibility and understanding of how devices operate. It’s very important to understand where the medical devices are and what their role is in medical workflows and clinical processes. This understanding helps security professionals apply the right controls and the right security policies to properly protect the assets and their communications without interrupting hospital operations.

Each device needs to be analyzed based on the likelihood that its information could be leaked and on the potential danger a compromise poses to a patient’s health. For instance, a PACS (picture archiving and communication system) would have a high privacy ranking, while an infusion pump would have a high patient safety ranking. Based on the ranking system, those devices that need immediate attention can be identified and managed accordingly.
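
As a purely hypothetical sketch of such a ranking (the device names, scores and weights are invented for the example, not a clinical standard), a simple weighted score can drive the triage order:

    DEVICES = [
        {"name": "PACS server",     "privacy": 9, "patient_safety": 3},
        {"name": "Infusion pump",   "privacy": 4, "patient_safety": 9},
        {"name": "Patient monitor", "privacy": 6, "patient_safety": 8},
    ]

    def triage(devices, privacy_weight=0.4, safety_weight=0.6):
        """Sort devices so those needing immediate attention come first."""
        def score(d):
            return privacy_weight * d["privacy"] + safety_weight * d["patient_safety"]
        return sorted(devices, key=score, reverse=True)

    for device in triage(DEVICES):
        print(device["name"])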

There also needs to be special consideration of how critical the device is for patient care at each particular hospital. If a local hospital has only one MRI machine and is servicing the entire region, that machine may need stricter controls than an ultrasound machine in a facility that has several others that can be used if one is compromised.

In addition, the IoMT security system also needs to have a very good understanding of all of the device’s connections, including gateways, nurses’ stations, interface engine servers, terminal servers, printers and other middleware. If a server is attacked, it’s important to know how many other entities will be affected and to remediate appropriately to ensure patient safety.

As IoMT technology improves the efficiency of patient care, smart medical devices that communicate over a hospital’s internal network will become more prevalent. Securing all of the medical devices using traditional IT or even generic IoT cybersecurity approaches isn’t good enough. Security professionals need to take into account the medical context of the device communications, including the role of the devices in the medical workflows, to apply the necessary controls to ensure patient safety and protect sensitive data.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


April 3, 2019  1:50 PM

If you think 5G will solve all IoT challenges, think again

Mehdi Daoudi
5G, Connectivity, Internet of Things, iot, IoT connectivity, IOT Network, IoT networking, IoT service providers, IoT wireless, Mobile connectivity, mobile network operators, mobile operators

5G represents the next generation of mobile technology, with GSMA predicting that by 2025, about half of all U.S. mobile connections will occur on 5G networks. With its dramatic bandwidth enhancements, 5G is expected to be 100 times faster than 4G, and is viewed as the seismic shift that will spur adoption of enhanced mobile broadband services as well as IoT, particularly mission-critical use cases. According to estimates, 20 billion devices will be online and connected by 2020.

But even as mobile operators worldwide transition to 5G, the path to IoT will not be obstacle-free. As internet infrastructure providers of all types strive to keep pace with mobile operator investments while protecting their own profitability, we expect some inevitable speed bumps, particularly when it comes to ensuring device performance — low latency, fast connections and superior reliability. Here’s why:

Trying to build Rome in a day: Any time a big project is undertaken in a relatively short timeframe, the likelihood for error and missteps tends to increase. According to McKinsey, mobile operators’ network-related Capex is expected to increase by 60% from 2020 to 2025. Throughput, scalability and reliability will need to increase many times over current levels to support the surge in users, connected devices and traffic volumes brought on by 5G.

Inevitably, this will have a trickle-down effect on external infrastructure service providers, like cloud, CDN and DNS providers. The entire delivery chain needs to beef up to keep pace with 5G, all while protecting individual providers’ financial interests.

Some providers with more bullish outlooks will dive in quickly, believing their investments will pay huge dividends in just a few short years. Among these companies, we’ll likely see build-outs of a size, scope and pace never before seen, to support the anticipated eight-fold increase in global mobile data traffic.

The alternative ‘wait and see’ approach: Other providers will take a more cautious approach, setting certain utilization thresholds that, once met or exceeded, will then warrant additional investment. In the meantime, these providers will reconfigure existing resources, deploying approaches like virtualization to maximize capacity already on hand.

While both approaches have their merits and drawbacks, one thing they share is an increased tendency for performance blips. In the case of the more bullish providers, we’ve seen multiple instances over the years of rapid expansion leading to missteps and errors, some of which seem completely random.

For example, the recent highly publicized Google Border Gateway Protocol misdirect — whereby Google-destined traffic from across the world was inadvertently dispatched to Russia and China — was ultimately the result of an error that occurred when Google expanded its Nigerian presence through a peering relationship with a local ISP.

While the more cautious providers are rightfully focused on preserving profitability, they risk being caught flat-footed as 5G traffic volumes escalate. Additionally, while techniques like virtualization can offer some near-term flexibility, constantly changing partitions and instances can make it much harder to quickly locate growing hotspots.

Comparing a firehose to a cocktail straw: 5G is a last-mile-only mobile connectivity technology. While it will connect massive volumes of users and IoT devices to the mobile internet, the speed and throughput advances apply only to this one leg of the packet journey. Even if two communicating devices are connected to mobile broadband via 5G, if the broader internet path between them is restricted or has vulnerability points, the speed or reliability of the messaging will inevitably suffer regardless of the devices’ last-mile connection speeds.
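
The physics is easy to sanity-check. Assuming light in fiber travels at roughly two-thirds of its vacuum speed, and ignoring routing, queuing and processing delays entirely, the propagation floor between distant endpoints dwarfs any last-mile gain (the distances below are rough great-circle figures):

    SPEED_IN_FIBER_KM_PER_S = 200_000  # roughly 2/3 of the speed of light

    def min_round_trip_ms(distance_km):
        """Lower bound on round-trip time imposed by propagation alone."""
        return 2 * distance_km / SPEED_IN_FIBER_KM_PER_S * 1000

    for route, km in [("New York to London", 5_600), ("San Francisco to Singapore", 13_600)]:
        print(f"{route}: at least {min_round_trip_ms(km):.0f} ms round trip")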

5G will dramatically increase mobile video and media usage, as well as IoT adoption worldwide. We can expect external infrastructure providers to make some inadvertently clumsy steps as they grapple with significant growth demands. This may lead to occasional bouts of added service latency or unreliability. While this may be somewhat tolerable for some consumer-facing applications and appliances, it will not suffice for mission-critical IoT applications, like connected cars, connected cities, medical devices and more.

In summary, we expect to see a few years of awkward adolescence and growing pains as the broader internet infrastructure plays catch-up with 5G. To insulate against performance risks, IoT users and device manufacturers — particularly those of the mission-critical variety — will need to be extra vigilant. The challenge ahead lies in harnessing and analyzing massive volumes of device performance data and proactively identifying growing hotspots across an infrastructure of unprecedented depth and complexity.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


April 3, 2019  11:28 AM

Combining IoT with the power of blockchain to get over the logistics hurdles

Christopher Justice
Blockchain, customer experience, distributed ledger technology, DLT, Internet of Things, iot, IoT applications, IoT data, IoT logistics, logistic, Logistics, smart supply chain, Supply chain

From hefty delays to below-par tracking, and from IT or operational failures to theft, the risks to cost, profit margins and reputation can be unrelenting for logistics businesses. Add to this the shifting demands from always-on consumers, who crave efficient, cost-effective, secure and instant systems, and the challenges become even steeper for companies operating in this ecosystem. IoT innovations are starting to transform the landscape to meet these demands, but there are limitations. Can converging IoT and blockchain be the answer? Atlas City president and COO Chris Justice shares his thinking.

According to a report by Cisco, some $2 trillion has been generated from IoT deployments in the logistics and supply chain space, with more growth forecast for the coming years. A significant driving factor behind this take-up of IoT by companies is the global shift in consumer behavior: Our ever-connected world has fueled our expectations for instant deliveries, efficient and seamless services, and safe transactions. As consumers, we demand reliability and transparency, our need for trust is innate, and our fickle, brand-promiscuous habits are becoming more difficult to please.

The “Amazon way” of satiating our 24/7, door-to-door shopping needs has quickly become the norm for many brands, large and small. But in the background, the logistics infrastructure is creaking under the pressure.

To survive, logistics businesses are having to adopt new tools and methods. And this is right across the board, from the non-perishable product market segments to the highly perishable. Consider the number of online grocery delivery or cooked food delivery options we have available in the palm of our hand: Competition in the space is at its highest.

Because of this competition — which is driven by these new behaviors — IoT innovation in the logistics sector is gaining momentum. From asset management to fleet tracking and inventory management, there’s a host of IoT innovations that are already transforming the logistics space and helping businesses reach a whole new level of capability.

But there are limitations to what IoT can do on its own. And what’s more, while IoT is solving problems, it’s also bringing a new set of challenges to the logistics table.

Managing the explosion of data

By 2030, Cisco predicts that 500 billion devices will be connected to the internet. Much of this growth comes from our desire for immediacy, our need to get everything on demand, to track the provenance of goods and to trace assets. It’s what is “required” by IoT, but it’s not practical. Not on a centralized cloud database.

The explosion of data we’ve experienced in the last 10 years is old news, and indeed, data generated from humans will continue to snowball. But the next explosion of data we expect from machines and the IoT space — from drones and autonomous vehicles to automated systems that are driving production decisions — will dwarf anything we have experienced to date.

The big challenge here will be how to manage that data. And for the logistics landscape, managing it in a centralized cloud-based database framework is just not going to work — not without a distributed computing framework in place.

We need to move more intelligence out to the edge of the network rather than back to the center — that is, a centralized cloud system. We need to take the IoT data and link it to a distributed ledger node that runs logic at the edge — much nearer to the devices — to turn the data into actionable information or insight, enabling more autonomous decision-making. The right kind of blockchain and distributed ledger technology will play an instrumental role here because it’s immediate.

If the anticipated boom in growth happens, it will become too difficult and too slow to get actionable insight from centralized cloud-based databases — they’ll also be costly to run. To make any sense or use of all the new data coming from these 500 billion devices, something in the middle will be needed to connect all those IoT devices, to feed the machine learning and algorithms.

It’s important to remember here that the very definition of IoT is expanding — for instance, some now include autonomous vehicles within the definition. Is a car a device itself? It has lots of IoT devices in it that have to be connected to the car, which has to be connected to other systems. And to make all of these systems work, you need to be able to run distributed applications. There has to be logic. A communication channel. And decentralized decision-making.

This is where blockchain distributed ledger technology comes in: At its core, it’s largely based around having decentralized decision-making or the validation of decisions and transactions without a central control authority. By converging IoT with blockchain, it’s possible to make all these “requirements” of IoT more practical in the logistics space because you’re enabling applications to make decisions closer to the IoT devices rather than from a centralized global cloud framework. It’s about putting intelligence into the whole phenomenon of IoT. It’s about enabling IoT devices of the future to actually talk to one another and make decisions themselves — based on rules and applications, of course.

The right kind of blockchain distributed computing technology will be able to put business logic and decision-making at the edge. It can authenticate participants, manage authorizations, efficiently and cost-effectively share data among related and unrelated participants, and store that data for purposes such as providing an immutable audit trail.
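
To make the audit-trail idea concrete, here is a deliberately simplified, single-writer hash chain in Python. It is not a consensus-based distributed ledger (there is no peer validation or fault tolerance), but it shows why tampering is detectable: altering any earlier entry changes every later hash.

    import hashlib, json, time

    def append_entry(chain, payload):
        """Append a reading to a hash-chained log; each entry commits to its predecessor."""
        prev_hash = chain[-1]["hash"] if chain else "0" * 64
        entry = {"ts": time.time(), "payload": payload, "prev": prev_hash}
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        chain.append(entry)
        return entry

    chain = []
    append_entry(chain, {"pallet": "42", "temp_c": 4.1, "scan": "port-of-rotterdam"})
    append_entry(chain, {"pallet": "42", "temp_c": 4.3, "scan": "warehouse-7"})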

Real-world benefits

It’s one thing to theorize about the power of IoT and blockchain together, but what are the tangible benefits for real-world enterprises?

Efficiency and improved customer experiences that drive loyalty and supercharge bottom lines are key benefits that a logistics business can feel when the two converge. But what’s most important here is real-time action. This is what will drive efficiency and enable all of these automated systems of the future to become reality.

Imagine two self-driving cars. An obstacle is in their way. They communicate with each other in real time and are aligned so that they can take an action to avoid the obstacle. This is more than just efficiency. What’s happening here is the intelligence is being put back at the edge, among the IoT devices, so that real-time decisions can be made, autonomously but within the set of rules provided.

Importantly, distributed ledger technology will give the logistics industry the ability to authenticate, authorize and validate. All of the information is picked up through IoT and written to a distributed ledger to create this audit trail, meaning it can be tracked from source to destination.

There are still outstanding challenges to solve, but the good news is that there’s lots of work already going on in relation to authentication and ensuring sensors cannot be tampered with throughout the whole chain.

The power of the two combined will also enable logistics businesses to deliver new kinds of services that weren’t possible before. With the right authentication at the source, a grocery store will know not only exactly where the fruit or produce originated and whether it is organic or Fair Trade, but also how long the remaining shelf life is, thereby reducing the enormous costs of spoilage. With shipping information, origins, contents, bills of lading, insurance and payments all integrated, goods will flow through ports and customs authorities more quickly, saving billions of dollars in the process, keeping costs and prices down for consumers, and even enabling consumers to order directly from suppliers, which reduces costs and improves choice even further. This is already happening with some small goods ordered from places like China. When logistics, IoT and blockchain are truly integrated, this will explode to include all kinds of bespoke goods and services.

Bright future ahead

While some might be feeling the pinch of pressure from increased competition and changing consumer demands, the overall outlook for logistics businesses is golden — especially if Maersk’s forecast of a 10% saving on the $1.8 trillion logistics industry from automation efficiencies is realized.

It’s an exciting time for this sector, with new consumer behaviors bringing a host of new technical challenges to both global and local supply chains. The overriding lesson, however, is that whether you’re talking about global or local supply chains, they will all have to be more integrated than ever before — and have the ability to take real-time action — if they are to successfully meet these changing needs. Those that can achieve this will build consumer confidence and retain market share.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


April 2, 2019  3:50 PM

Connecting smart buildings and smart homes in the IoT era

Marc Pegulu
Internet of Things, iot, IoT applications, IoT connectivity, IoT devices, IoT wireless, LoRA, LoraWAN, LPWAN, Smart Building, smart home

Market awareness and adoption of smart homes and buildings continues to rise yearly, with 2019 set to be another busy year for the trend. From smart kitchen appliances and smoke detectors to voice assistant-enabled lighting and irrigation systems, builders and homeowners are using IoT to connect everyday devices to the cloud to create new experiences for buyers at a rapid pace. In fact, Business Insider forecasted that in the United States alone, more than 1 billion smart home devices will be deployed by 2023.

The key driver to making this technology work is connectivity that is long range and interoperable with a variety of different applications and implementations.

While short-range connectivity options already enable many connected smart home and smart building devices, more manufacturers are turning to IoT technologies that offer both indoor and outdoor connectivity in one. LoRa-enabled devices deliver exactly that.

The secret is that LoRa technology can penetrate dense building materials and therefore reach devices deployed underground — all wirelessly. This offers an advantage compared to other technologies challenged by those conditions, not to mention the high costs and design flaws of running wired networks. Devices connected to a LoRaWAN-based network, the open, low-power wide area networking protocol based on LoRa, can be deployed either privately or by connecting to a public LoRaWAN-based network thanks to the freedom afforded by unlicensed spectrum.

Let’s explore some examples.

Connecting smoke and heat detectors to IoT

The United States Fire Administration estimated that nearly $14.3 billion in property damage was caused by fires in the U.S. in 2015. The majority were smaller commercial fires, which carry high risks of not only property damage, but also human injury. Fires can spread out of control within only a few minutes, making early detection pivotal to limiting potential tragedy.

LoRa-enabled sensors in homes and buildings collect data throughout the day, such as levels of heat, smoke or gas in a given room. Data from each sensor is periodically sent to a LoRa-based gateway that connects the device to the network, similar to a router. Each gateway can handle the messages of several devices at once. Once a message is sent to the gateway, it is forwarded on to a network server where it can be analyzed. From here, an automatic response can be triggered depending on the conditions in the building. The response is sent to emergency personnel and the property manager via mobile device or computer, allowing them time to respond effectively.
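
A hedged sketch of that last hop, assuming the network server has already decoded the LoRaWAN frame into a small JSON-style document; the payload fields and alarm thresholds here are illustrative, not taken from any particular product:

    HEAT_ALARM_C = 57.0    # illustrative fixed-temperature alarm point
    SMOKE_ALARM_PCT = 3.0  # illustrative smoke-obscuration threshold

    def handle_uplink(reading):
        """Decide which notifications a decoded sensor uplink should trigger."""
        alerts = []
        if reading.get("temp_c", 0) >= HEAT_ALARM_C:
            alerts.append(("fire-dispatch", reading["room"], "heat alarm"))
        if reading.get("smoke_pct", 0) >= SMOKE_ALARM_PCT:
            alerts.append(("fire-dispatch", reading["room"], "smoke alarm"))
        if alerts:
            alerts.append(("property-manager", reading["room"], "notify"))
        return alerts

    print(handle_uplink({"room": "boiler-3", "temp_c": 61.2, "smoke_pct": 0.4}))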

Managing water to wash away high bills

Water is one of the more expensive items on the utility bill, costing Americans on average $15 to $77 per month. From the dishwasher and washing machine to toilets and the shower, water use is unavoidable.

With LoRa technology, it’s easy and affordable to manage home water usage and guard against potential water or moisture damage in the process. Similarly, LoRa-enabled sensors can measure water use and send this data off for analysis. If a problem is detected, applications can automatically shut off the water supply and alert the homeowners. Additionally, the LoRa-based system can send periodic updates to inform the users of their consumption trends. This allows the homeowners, landlords and business owners to see their problem consumption areas and develop and execute a plan for smarter, more efficient water use.

Getting started today

Choosing the right wireless system is essential when building or retrofitting a dwelling for today’s IoT era. As an easy and proven alternative to current wireless communication networks, LoRaWAN-based networks deliver higher penetration in dense urban areas and longer range in non-urban areas. Added to the competitive operational cost and extended lifetime of the sensor batteries, this technology reduces energy consumption and contributes to a lower maintenance cost.

The drivers are the same for all LoRa-based initiatives. Easy installation is teamed with the affordable operation of advanced technology. But perhaps the most important driver is LoRa technology’s capability to deliver more functionality and efficiency for builders around the world.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


April 2, 2019  10:23 AM

How hybrid IT and 5G are enabling smart cities

Martin Olsen
5G, ai, Hybrid IT, Internet of Things, iot, IoT applications, IoT data, IoT edge, IOT Network, IoT sensors, IoT wireless, Smart cities, smart city, smart city applications, smart city network

About 90 million people in 185 countries around the world use Waze, the crowd-sourced navigation app that helps drivers quickly get from point A to point B while avoiding potholes, traffic jams and, yes, police. The app collects data from users and pushes relevant information to drivers, virtually in real time, wherever they might be. It is the kind of modern miracle we take for granted in today’s hyper-connected world, but we shouldn’t. When we talk about IoT, the edge of the network, machine learning and AI, we are talking about Waze and its ilk. The future is now.

And it’s not just Waze. Interesting AI applications are cropping up around the globe. From smart irrigation controllers that detect water leaks and help reduce water consumption in drought-stricken environments to innovative, IT-enabled lampposts that double as data intelligence networks to improve transportation, traffic and personal safety — these types of technology revolutions are happening all around us. We’ve just come to accept it all, starting with the supercomputers we hold in the palms of our hands.

Some areas are reacting to these advancements faster than others. For example, some Australian cities, including Adelaide and Canberra, already are developing the critical network edge infrastructure needed to support next-generation citizen services and boost the economy. Industry analysts anticipate that by 2020, 50% of global companies will personalize individual experiences via biometric data. The advent of 5G is imminent, and 5G along with hybrid IT will enable and accelerate all of these innovative technologies.

Hybrid IT, integrated systems, edge computing and similar approaches to pushing compute resources as close as possible to the end user are driving all sorts of new business models, services, innovation and applications, and creating a new level of personal experience. All of this is enabled by the IT resources at the edge and the plethora of new mobile devices — even smarter phones, tablets, wearables, connected devices, smarter cars/transportation, etc. — and it all is fundamentally reliant upon and enabled by the digital network.

Smart cities require tens of thousands to hundreds of millions (or more) of sensors — probably on the order of 5 to 10 per person — to be deployed in numerous configurations to provide meaningful and actionable data in real time. This includes highway, pedestrian, air, water, sound, vibration, temperature, pressure and countless additional data points from sensors and cameras all around us, including underground, on and in water, and overhead.

These sensors and the billions of connected devices require a resilient digital communications network that is available 24/7, self-healing to a degree, with commercial, industrial, government and first responder private networks, some of which will be in restricted-use frequencies to further ensure access and availability in times of emergency.

4G and especially 4G LTE helped launch this new digital landscape. 5G will launch us to the next level where we can truly deploy the smart city concept and all the benefits it entails. Sacramento, Calif., San Francisco, New York City and Columbus, Ohio, are among the forward-thinking cities in the U.S. that are developing and deploying smart services, as well as laying the foundation for future digital capabilities via networks and infrastructure. There are others, and more still to follow as the 5G rollout continues.

For more information on these topics, check out our white paper that uses edge archetypes to identify the most promising use cases for 5G.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


April 1, 2019  11:04 AM

Even big cloud providers are embracing edge computing

Laz Vekiarides
Cloud Computing, Cloud providers, Edge analytics, Edge computing, Internet of Things, iot, IoT cloud, IoT edge, IoT service providers, IoT services

Edge computing is attracting a lot of attention these days, even from cloud providers that have previously said “everything will go to the cloud.” For example, take AWS Outposts, a new service that AWS announced in November. This service, which should become available next quarter, enables customers to run AWS on-premises, and then connect to the AWS cloud. To accomplish this, AWS ships customers fully managed racks of AWS-designed compute and storage hardware running VMware.

But there’s more. AWS IoT SiteWise, part of a suite of IoT services, collects data from industrial sites through an on-premises gateway. The very next week, Microsoft announced new functionality for its own IoT service, Azure IoT Edge. Clearly, AWS and Microsoft now realize that the cloud alone will not be enough to provide the performance that enterprise IT and the growing number of applications for IoT will require.

The market interest from cloud providers, tech vendors and investors in edge computing is easy to understand: The global edge computing market was worth just shy of $8 billion in 2017 and is expected to surpass $20 billion by 2026, according to market research firm Stratistics MRC.

But that market interest is ultimately driven by a known technical weakness in public cloud-only models: performance. Applications, workloads and data sets that require a nearly instantaneous response function poorly when they are located far away from the users who are working with them. There’s nothing an engineer can do to transmit data faster than the speed of light, and the distances between large metro areas, multiple endpoints generating data and big cloud data centers are large enough to introduce significant latency, an issue that’s becoming even more acute with the growth of IoT.

As IoT continues to expand, users and applications will be generating more data than ever. But as effective as the cloud is for storage, the concept doesn’t work when it comes to delivering a fast response over great distances. And internet bandwidth in the places where IoT and other remote devices are apt to be located can be quite constrained.

When you consider how much data IoT will create — connected cars alone are expected to generate and use as much as 40 TB of data per vehicle per day — there are simply not enough pipes and bandwidth to move it all to and from the cloud with low latency. Autonomous vehicles, for instance, can’t afford to wait seconds to send data to the cloud, process that data and receive an answer. By then, it may be too late to avoid a hazard or take a better route to the destination.
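
The arithmetic behind the "not enough pipes" point is straightforward. Taking the cited 40 TB per vehicle per day at face value and assuming every byte were shipped to the cloud:

    TB = 10 ** 12
    bytes_per_day = 40 * TB
    seconds_per_day = 24 * 60 * 60

    sustained_gbps = bytes_per_day * 8 / seconds_per_day / 1e9
    print(f"Sustained uplink per vehicle: {sustained_gbps:.1f} Gbps")  # roughly 3.7 Gbps

Multiply that by a fleet and the case for processing most of that data at or near the vehicle makes itself.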

Enter the edge

Increasingly, large organizations are coming to the realization that a lot of processing and analytics power has to sit at the edge. It’s physically impossible to put it all in the cloud and expect to avoid latency issues.

Edge computing enables data to be aggregated at a local point of presence, ensuring a very high-performance, low-latency option for processing. It consists of smaller data centers in locations closer to where users are working, enabling them to work with data without the latency challenges.

Further, edge computing enables users to benefit from cloud elasticity and economics without the problems presented by distance. Cloud providers will likely extend their delivery models closer to the edge — or at least they should — because that’s where computing will increasingly be needed.

The edge, therefore, extends the cloud — especially for data access — closer to end users, where it can be consumed and managed as if it were stored locally. This way, applications can take advantage of all the cloud has to offer without any of the performance drawbacks.
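
One way to picture "consumed and managed as if it were stored locally" is a read-through cache at the edge point of presence: the first request pays the cloud round trip, and subsequent requests are served locally. A toy sketch, where the fetch function stands in for a real object-store client:

    import time

    CLOUD_LATENCY_S = 0.08                          # pretend cross-country round trip
    _cloud = {"sensor-feed/123": b"...payload..."}
    _edge_cache = {}

    def fetch_from_cloud(key):
        time.sleep(CLOUD_LATENCY_S)                 # simulate the WAN hop
        return _cloud[key]

    def edge_get(key):
        """Serve from the local point of presence when possible; fall back to the cloud."""
        if key not in _edge_cache:
            _edge_cache[key] = fetch_from_cloud(key)
        return _edge_cache[key]

    edge_get("sensor-feed/123")  # slow: goes to the cloud
    edge_get("sensor-feed/123")  # fast: served at the edge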

It’s clear that edge computing is a critical component of a successful hybrid cloud strategy. As both data and its importance continue to grow, organizations need to begin building their edge computing strategies or risk missing out on the real opportunities IoT and the cloud have to offer.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


March 29, 2019  1:33 PM

Investing wisely in digital security measures

Kevin von Keyserling
#eHealth #Healthcare IOT #Wearables #wireless medical devices, Connected Health, Digital security, Encryption, Internet of Things, IoHT, IoMT, iot, IoT devices, iot security, PKI, Public-key infrastructure

As the internet of medical things grows, the delivery of care moves beyond the four walls of a hospital, and patients become more engaged and accountable. Device manufacturers must be wary of this impending shift because experts predict that there will be between 20 and 30 billion devices within IoMT by 2020, making first-to-market pressures even greater. Above all, medical device manufacturers that rush to drive innovation can’t forget about one of the most important considerations with lasting consequences: device security.

This precaution isn’t “new” news by any standard — security is undoubtedly a hot topic that has found its way into war rooms and boardrooms. For device manufacturers, this means aligning priorities with customers’ needs and giving every device its own secure identity. But administrative stressors exist, including R&D budgets intended strictly for product development and stringent compliance requirements. Implementing a strategy that elevates digital security into production can often be a challenge as a result.

We were interested in quantifying the size and scope of these challenges, so we recently partnered with the Ponemon Institute and produced a study, “The Impact of Unsecured Digital Identities,” (registration required) in which we gathered input from 600 IT professionals across the U.S. The results reinforced exactly what we predicted: Lack of attention and investment in digital identity management is putting deployed technologies at risk.

Finding the dollars for digital security

One of the most interesting discoveries was that organizations spend an average of $18.2 million on IT security annually, yet allocate only 14% of that to public key infrastructure. Additionally, 65% of respondents are adding layers of encryption to comply with regulations and policies, and 63% say additional certificates are increasing operational costs. These results imply that independent budgets are often standard practice and additional dollars are hard to come by. Decision-makers need to consider more creative ways to use long-term, security-centric funding. It can be as simple as incorporating a separate line item for security investments or as intricate as entrusting capital to a specialized department that can be activated as needed.

Budget investments that prevent catastrophe

Respondents experienced an average of five failed audits or compliance incidents in the past two years, with a 42% likelihood that incidents like these will occur again over the next two years. Not enough to drive organizational change? How about a $14.4 million price tag tied to failed audits and compliance lapses caused by insufficient management practices? The study confirms that failed audits and lack of compliance are among the costliest and most serious threats to an organization’s ability to minimize risk.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

