IoT Agenda

March 5, 2019  1:21 PM

Combatting the continued expansion of IoT threats

Anthony Giandomenico Profile: Anthony Giandomenico
botnets, Internet of Things, iot, IoT botnet, IoT networking, iot security, malware, securing IoT, Segmentation

The expanding threat landscape resulting from the convergence of cyber and physical systems is providing cybercriminals with additional entry points into the corporate network, and they have taken notice. The Fortinet Threat Landscape Report for Q4 of 2018 showed not only that half of the top 12 exploit spots were held by IoT, but that four of those spots were held by IP cameras. As it turns out, bad actors are now focused on exploiting the inherent weaknesses in these cameras to monitor or control the very devices we use to safeguard our physical safety and security.

Access to IoT IP cameras not only enables cybercriminals to snoop on private interactions and enact malicious onsite activities (such as shutting off cameras to better physically access restricted areas), but also enables them to be used as a gateway into the cyber systems they are connected to in order to launch distributed denial-of-service attacks, steal proprietary information, initiate ransomware attacks and more. The adage “who’s monitoring the monitors” is quite apropos here.

To prevent this sort of compromise, organizations need to establish security protocols designed to protect connected physical systems from attack, including segmentation, baselining behaviors, and alerts and quarantines that are triggered when behaviors change. This is also a reminder that every IP-enabled device, especially those that are part of your physical security system, needs to be part of your vulnerability and patch management process. This is especially essential as more and more physical security devices traditionally assigned to operational technology networks are now being converged into the IT environment. In this new interconnected world, security cameras are merely the canary in the coalmine. Criminals that gain access to things like fire suppression systems and alarms could potentially cause catastrophic harm.
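To make the baselining idea concrete, here is a minimal sketch (the device and traffic numbers are hypothetical) of how a monitoring tool might flag an IP camera whose behavior suddenly deviates from its learned baseline:

```python
from statistics import mean, stdev

def build_baseline(samples):
    """Summarize normal per-interval traffic (in bytes) for one device."""
    return mean(samples), stdev(samples)

def is_anomalous(observed, baseline, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from normal."""
    mu, sigma = baseline
    return abs(observed - mu) > threshold * sigma

# Hypothetical history for an IP camera: steady ~5 KB per interval.
history = [5000, 5200, 4900, 5100, 5050, 4950, 5150, 5000]
baseline = build_baseline(history)

print(is_anomalous(5100, baseline))    # False: typical traffic
print(is_anomalous(250000, baseline))  # True: sudden spike, quarantine candidate
```

A real deployment would baseline many signals at once (ports, destinations, protocols) and trigger an automated quarantine rather than a print statement.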

These security protocols will also need ongoing updates as IoT threats continue to evolve. Fortinet’s Q3 report for 2018 contained an entire section detailing the evolution of IoT botnets over the last few years. One important 2018 adaptation was the ability to implant cryptojacking malware into infected IoT devices in the home, and there is no reason why business devices wouldn’t be next. While mining cryptocurrency requires substantial CPU resources and individual IoT devices offer little processing power, hordes of easily compromised, largely idle IoT devices can deliver that power through sheer scale.

However, that Q3 report also revealed the merging of destructive tendencies with IoT botnets. Malware like BrickerBot has rendered over 10 million devices completely useless since its launch in 2016. This might only be an inconvenience when the internet-connected coffee maker in the office break room gets bricked, but what about a medical device in a hospital, an HVAC system in your building or a connected thermostat regulating the temperature of an industrial-sized boiler filled with caustic chemicals?

We don’t have to wonder, because we now have malware like VPNFilter designed to target IoT devices and even industrial control systems. Once installed, it can not only steal website credentials and monitor SCADA protocols, but also trigger a kill switch that can physically destroy an IoT device. It can even inject malicious code back into the network session it is monitoring, allowing crossover infection of endpoint devices. The bigger issue is that traditional security systems do very little to secure vulnerable IoT systems.

Shifting security strategies

This weakness in many security strategies is about to get worse. The sudden, exponential growth of the attack surface due to the rapid expansion of IoT devices and edge-based computing, especially when deployed in emerging 5G networks, means that literally billions of IoT devices will be interconnected across massive meshed edge environments, where any device can become the weakest link in the security chain and expose the entire enterprise to risk.

To address this challenge, organizations will need to make some fundamental shifts in how they think about networking and security:

  • IT security teams will need to develop new segmentation strategies to isolate devices and limit exposure. Segmentation will also need to be extended across networked environments over which organizations may or may not have full control, such as 5G networks and public cloud services, in order to protect wide-ranging workflows, transactions and applications.
  • Security must become an edge-to-edge entity, expanding from the IoT edge across the core enterprise network and out to branch offices and multiple public clouds. To do this, everything connected to the enterprise ecosystem needs to be identified and rated, and its state continuously confirmed. Once effective visibility and control are in place, all requests for access to network resources must then be verified, validated, authenticated and tracked.
  • Organizations must devise security that supports and adapts to elastic, edge-to-edge hybrid systems, combining proven traditional strategies with new approaches and technologies that operate seamlessly across and between multiple ecosystems.
  • Disparate security tools will need interoperability to share information and stop threats. This will require vendors to establish new open 5G security standards, integrate APIs into their systems and develop agnostic management tools that can be centrally managed to see security events and orchestrate widely distributed security policies.
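The first of those shifts, combined with the identify-and-verify steps in the second, can be illustrated with a toy gatekeeper. The registry, segments and resources below are hypothetical, and real segmentation is enforced in the network fabric rather than in application code:

```python
# Hypothetical device registry: every connected asset is identified and rated.
DEVICE_REGISTRY = {
    "cam-lobby-01":  {"segment": "physical-security", "trust": "low"},
    "hvac-roof-02":  {"segment": "building-ot",       "trust": "low"},
    "laptop-fin-17": {"segment": "corporate-it",      "trust": "high"},
}

# Segmentation policy: which segments may reach which resources.
SEGMENT_POLICY = {
    "physical-security": {"video-archive"},
    "building-ot":       {"bms-historian"},
    "corporate-it":      {"video-archive", "bms-historian", "finance-db"},
}

def authorize(device_id, resource):
    """Deny by default: unknown devices and cross-segment requests are refused."""
    device = DEVICE_REGISTRY.get(device_id)
    if device is None:
        return False  # an unidentified device is never trusted
    return resource in SEGMENT_POLICY.get(device["segment"], set())

print(authorize("cam-lobby-01", "video-archive"))  # True: within its segment
print(authorize("cam-lobby-01", "finance-db"))     # False: outside its segment
```

The deny-by-default stance matters: a compromised IP camera that is confined to its own segment cannot reach the finance database no matter what credentials it replays.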

In the meantime, organizations need to adopt open standards and common operating systems to ensure as much consistent interoperability as possible across their evolving network. Correlating event data, sharing real-time threat intelligence and supporting automated incident response will require security technologies to be deeply integrated. This will mean the development and adoption of a holistic security architecture that uses machine learning, artificial intelligence and automation to accelerate decision-making and close the gap between detection and mitigation.

Situational awareness is also key. Organizations need to understand their critical processes and data, identify cyber assets and know what OS and applications are installed. They will also need to map their network architecture to understand data flows and possible blind spots, and identify threat actors to get an idea of how they will try to break in and what resources they are most interested in obtaining.

Knowing is half the battle. It will help you engineer as much risk and vulnerability out of your network as possible, and it will also help you select those security systems that are most appropriate to protecting your unique environment. Just remember, to be the most effective, any security technologies you choose need to be able to interact with your other enforcement points by sharing events, correlating intelligence and coordinating a holistic response to threats.

Because exploits targeting IoT devices topped Fortinet’s charts not only for Q4, but for the whole past year, organizations cannot afford to take a wait-and-see approach to network security. Real-world exploits are already causing business disruption, including the destruction of IoT devices, and are poised to inflict further damage as techniques evolve and IoT-enabled networks continue to expand. Being aware of these changes is the first step toward creating stronger defenses across the expanding network — a necessity as IoT increases in size and momentum, and becomes ever more deeply embedded in our business and networking strategies.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

March 5, 2019  12:03 PM

Does greater product connectivity mean a smoother aftermarket?

Guy Courtin Profile: Guy Courtin
Internet of Things, iot, IoT connectivity, IoT data, IoT devices, IoT in retail, IoT products, IoT wireless, product lifecycle, retail, retail IoT, Retail IT, smart supply chain

Although the holiday season is in the rear-view mirror, retailers are still feeling the impacts of yet another busy shopping rush. Santa Claus likely did not get every wish list item correct, and the result is a spike in returns over the first 60 days of the new year. It is not easy to predict which inventory will re-enter the supply chain, and retailers are constantly trying to better anticipate and handle these returns.

In a consumer-focused, commerce-dependent retail world, brands not only have to constantly address the needs of the consumer, but also must prepare for the increasing possibility of inventory returns. Couple this with the fact that many retailers, such as Costco, Target, REI and IKEA, among others, have incredibly customer-friendly return policies, and it is a recipe for a lot of returns. So, can greater connectivity assist in this new world of returns? In the near term maybe, but in the long term — absolutely. Here is how.

We are only just scratching the surface of connected products. For the most part, retailers and brands are looking to attach RFID or IoT to their products to prevent shrinkage or for greater visibility throughout the supply chain. Connectivity is important for basic visibility, but retailers and brands are not yet prepared to push this connectivity beyond these boundaries. Most importantly, consumers need to become comfortable with allowing this connectivity to become a part of the product lifecycle.

Furthermore, consumers are becoming increasingly comfortable with connectivity throughout all aspects of their lives. From connected vehicles to connected personal devices, they are growing more accepting of their information being consumed by an expanding number of devices. In fact, this consumption of data is accelerating in the home, as people adopt connected devices such as smart speakers and connected appliances. These trends signal a growing acceptance of sharing more data across an increasing number of touch points. But how does this impact the world of returns?

Those working in the supply chain need to start treating this greater number of touch points as an opportunity to better understand how their products are being used. For example, consumer product companies spend vast sums of money to acquire point-of-sale data about which of their products are being purchased. What if they could access data from connected appliances in homes that showed how their products are being used? That information could shed light not only on how products are truly being used, but also on the causes behind returns. For example, are products not being used as they were designed to be? Or are consumers’ expectations out of line with the intent of the product? Just as the automotive industry has used the greater connectivity of its products to enable predictive maintenance, consumer supply chains can use their own usage data to better anticipate when items might come back for return or exchange.

Greater connectivity is coming to a host of areas in the consumer ecosystem. This connectivity holds the potential to allow companies in supply chains to better manage the lifecycle of their products from purchase to usage and finally to return or end of cycle. The notion of predictive maintenance is no longer monopolized by large industrial machines, such as airplane engines or wind turbines. As a greater number of consumer-focused goods are becoming smarter, retailers need to take advantage of the data that is being made available to them — and ultimately better adapt their supply chain to an increasing amount of returns.


March 4, 2019  6:04 PM

Five IoT security and privacy trends to look out for in 2019

Peter Berg Profile: Peter Berg
cybersecurity, Data privacy, Internet of Things, iot, IoT data, iot privacy, IoT regulations, iot security, PET, privacy-enhancing technologies, security regulations

What will 2019 bring to the internet of things? The 2018 forecast predicted new functionality and new markets, including convergence with AI enhancements to make devices smarter and broader adoption by manufacturing. Now that the technology is proven, business models are verified and appetites from both consumers and industrial users are whetted, is it smooth sailing from here on out?

If only it were that easy. 2019 is likely to become the year that IoT gets really complicated since the stakes are high and the challenges multi-faceted.

Here are five IoT trends to look out for in 2019:

1. More hacks and increased spending on cybersecurity. Did you hear about the family in Orinda, Calif., whose Nest surveillance camera warned them about an impending ballistic missile attack? The risk of physical harm from North Korea may have been fake, but the hack was certainly worrying. With billions of connected devices proliferating on the market, the rewards from stealing data continue to grow, as will investment in hack prevention and damage limitation. An estimated $124 billion will be spent on data security globally in 2019. As cybersecurity costs rise, companies are looking for ways to protect their user data and decrease their risk.

2. Greater interest in selling data. As data collection increases, so does the temptation to monetize. TV manufacturer Vizio thought it had a great business model: sell its flat-screen TVs at break-even prices, then generate income by selling customer data. Oops. Settling a data-tracking lawsuit is costing Vizio an estimated $17 million — after already paying a $2 million fine to the FTC. Vizio doesn’t have to stop selling data, but it does need to be more transparent and offer clear options for customers. A better choice for companies like Vizio is to use a secure data exchange that preserves the functionality of the data, without revealing compromising or sensitive information.

3. New data harvested from IoT and workforce management tools. Gartner predicts 70% of organizations will integrate AI to assist employees’ productivity by 2021, and that a quarter of digital workers will use virtual assistants daily. Now that IoT systems can track workers and their productivity, new privacy concerns emerge. Companies will need a way to ensure that individual privacy (including Social Security numbers and HIPAA-protected data) isn’t compromised in the collection, sharing and analysis of data.

4. Regulations: Study the fine print. 2018 marked the introduction of several new data protection regulations, from the California Consumer Privacy Act to the European Union’s General Data Protection Regulation (GDPR). Although California’s law, among the most stringent in the country, doesn’t go into effect until 2020, it’s expected to have broad implications for the rest of the country, which will require all companies to read the fine print. The EU rules, which went into effect in 2018, laid out how personal data can be collected and stored, but the EU has also stressed that it isn’t forbidding the sale of data per se. For example, the EU states: “When the data used for AI are anonymized, then the requirements of the GDPR do not apply.” “Mythbusting,” a fact sheet released in January 2019 by the European Commission, supports this statement, dispelling rumors that GDPR will stifle innovation in artificial intelligence. Companies will need to closely study the laws, which may force some of them to adjust their business models.

5. Increased popularity of privacy-enhancing technologies (PETs). Will the expense of cybersecurity and privacy throttle IoT innovations? Rather than push for weak or general rules that treat all organizations equally, in 2019 there will be a greater focus on using technological solutions to solve privacy and security problems. “One possible solution is to encourage the use of privacy-enhancing technologies,” wrote the Harvard Business Review. “PETs, long championed by privacy advocates, help balance the tradeoff between the utility of data while also maintaining privacy and security.” PETs include differential privacy, such as that in place at Apple, and homomorphic encryption used by Google. Another option is to decouple sensitive information and store data away from a company’s system, protecting privacy and reducing risk while still allowing the use of the data.
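As a rough illustration of one PET, here is a sketch of the Laplace mechanism that underlies differential privacy. The viewing-count scenario and numbers are hypothetical, and production systems use carefully audited libraries rather than hand-rolled noise:

```python
import math
import random

def dp_count(true_count, epsilon, sensitivity=1.0):
    """Release a count with Laplace noise calibrated to the privacy budget epsilon."""
    u = random.random() - 0.5
    # Inverse-CDF sampling of Laplace(0, sensitivity / epsilon) noise
    noise = -(sensitivity / epsilon) * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_count + noise

# Hypothetical: how many smart-TV households watched a given channel.
true_count = 1280
print(round(dp_count(true_count, epsilon=0.5)))  # near 1280, yet any one household is deniable
```

The released number stays useful in aggregate, while the noise makes it mathematically hard to infer whether any single customer’s data was included.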

These are exciting times for IoT device manufacturers, consumers and the many global companies that support them.


March 4, 2019  1:34 PM

Achieving IoT, cloud and edge security starts with visibility

Reggie Best Profile: Reggie Best
Edge computing, edge security, IIoT, IIoT network, IIoT security, industrial internet of things, Industrial IoT, Internet of Things, iot, IoT edge, IOT Network, iot security

In my first IoT Agenda post, I discussed how the internet of things and the industrial internet of things are dramatically expanding organizations’ attack surfaces and introducing new security and compliance risks. In this article, I want to focus on how IoT and IIoT have escalated the importance of gaining visibility into and control over cloud computing and edge computing environments.

IoT 101

Before we can truly appreciate the role of cloud and edge computing in IoT/IIoT, we need to first have a basic understanding of how IoT works. At a very high level, distributed IoT/IIoT infrastructures consist of IP-enabled sensors, processors and other devices that collect data and then use some form of connectivity (e.g., Wi-Fi, Bluetooth) to push that data to the cloud for processing, analysis and action.
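That collect-and-push flow can be sketched in a few lines of Python; the device ID, metric name and stand-in transport below are purely illustrative:

```python
import json
import time

def make_reading(device_id, metric, value):
    """Package one sensor sample the way a device might before pushing it upstream."""
    return {"device_id": device_id, "metric": metric, "value": value, "ts": int(time.time())}

def push_to_cloud(reading, transport):
    """Serialize the reading and hand it to whatever link the device has (Wi-Fi, a BLE gateway, etc.)."""
    transport(json.dumps(reading))

# Stand-in transport: a real device would POST over HTTPS or publish via MQTT.
inbox = []
push_to_cloud(make_reading("temp-sensor-7", "celsius", 21.4), inbox.append)
print(inbox[0])
```

The cloud side then aggregates and analyzes these payloads, which is exactly why both ends of the pipe need protecting.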

From a security standpoint, the IoT and IIoT devices themselves must be protected because they are part of the network ecosystem and, if compromised, can serve as a gateway to IT and operational technology (OT) networks, as well as the treasure trove of information they contain. Implementing proper cloud security measures is equally important, since the cloud is home to data aggregation and analytics processes — and end users rely on this information for decision-making and to take appropriate action.

Using the cloud for IoT/IIoT data analysis works just fine in many instances, but not all — and this is where edge computing comes in. Public cloud data centers are usually located in remote places, far away from the end users they serve. Because of these distances, sending IoT/IIoT data to the cloud for analysis and then delivering it back locally takes time — time that some users just don’t have, for example, a doctor relying on an IoT medical device, or a facilities manager in charge of a nuclear power plant.
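Back-of-the-envelope propagation math shows why that distance matters; the distances and per-request processing overheads below are illustrative assumptions:

```python
def one_way_latency_ms(distance_km, speed_km_s=200_000):
    """Propagation delay over fiber (light travels at roughly 200,000 km/s in glass)."""
    return distance_km / speed_km_s * 1000

# Assumed scenario: a cloud region 2,500 km away vs. an edge node 100 m away,
# with assumed processing/queuing overheads of 20 ms (cloud) and 5 ms (edge).
cloud_rtt = 2 * one_way_latency_ms(2500) + 20
edge_rtt = 2 * one_way_latency_ms(0.1) + 5

print(f"cloud round trip: ~{cloud_rtt:.1f} ms")
print(f"edge round trip:  ~{edge_rtt:.1f} ms")
```

For a control loop that must react within milliseconds, tens of milliseconds of round-trip delay per decision simply do not fit.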

Edge computing processes data in close proximity to the site of data generation, which eliminates latency and performance issues, enabling real-time control decisions. Given these benefits, edge computing is being increasingly adopted within organizations that rely on IoT to provide instantaneous machine-to-machine interactivity and responsiveness. This has made edge infrastructure security more important than ever — but many IT security teams struggle with edge security, especially within IIoT environments.

Securing the edge

The issues associated with securing edge computing in an IIoT environment are, at a conceptual level, the same as those in any other networked environment, namely:

  • Privacy: assurance that communication between parties remains confidential (i.e., data encryption);
  • Authentication: assurance that parties are who they say they are before they are granted access to edge networks and devices; and
  • Authorization: a mechanism for granting parties access only to the services to which they are entitled.
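The authentication requirement, for example, is often met with a message authentication code. Here is a minimal HMAC sketch with a hypothetical shared key; real deployments need per-device keys, secure provisioning and replay protection:

```python
import hashlib
import hmac

SHARED_KEY = b"hypothetical-device-key"  # provisioned per device in practice

def sign(message: bytes) -> str:
    """Tag a message so the edge gateway can verify who sent it."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Constant-time check that the tag matches the message."""
    return hmac.compare_digest(sign(message), tag)

msg = b"valve=open"
tag = sign(msg)
print(verify(msg, tag))            # True: genuine sender
print(verify(b"valve=shut", tag))  # False: tampered message rejected
```

Even this small amount of per-message computation is exactly the overhead many resource-starved OT endpoints cannot absorb, as discussed below.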

However, while there are well-known solutions to these challenges in traditional, legacy computing environments, IIoT environments remain challenging. The reason is that OT networks were, until recently, isolated from IT networks and the internet, so industrial control systems, sensors, controllers and other IIoT endpoints weren’t exposed to common IT threats and, therefore, weren’t designed to run security software — there simply wasn’t a need for them to have this ability. Today, however, this is all changing as OT networks merge with IT networks, and once isolated IIoT systems and devices are now IP-enabled — but still lack the power, compute cycles and storage to run security software.

This has presented IT security teams with several security challenges. First, in many cases, security teams won’t be able to “bolt on” security at all; they’ll need to replace the OT endpoints altogether, which takes time, money and resources. Second, for those endpoints that do have the capacity to run security software, the overhead of adding encryption, authentication and authorization systems and processes may actually increase latency, which would negatively impact real-time embedded OT endpoints responsible for sub-second or even millisecond reaction times. This would be a major step back, since reduced latency is the reason edge computing emerged in the first place. And last, but certainly not least, edge and endpoint OT devices are often located in inaccessible, less hospitable environments, making it very expensive for organizations to implement and maintain security.

OT networks will eventually adopt IT security processes and protocols, but revamping products and infrastructures in this way will take decades. What can be done today?

Security starts with visibility

When it comes to IoT/IIoT, it’s important for organizations to have an accurate understanding of not only their IT/OT networks, but their cloud and edge computing infrastructures as well. In other words, they must be able to answer questions such as: Who has access to what endpoints? Are IoT, edge and cloud systems being properly managed? Are there leak paths to and from the internet that could be compromised by cybercriminals? Is network traffic normal? And the list goes on.

The only way to answer these questions is by gaining visibility into three equally important areas:

  1. Visibility into all endpoints and assets across all computing environments;
  2. Visibility into how those endpoints are connected to the enterprise, the internet and each other; and
  3. Visibility into whether the endpoints and subsequent traffic are expected, or if they indicate suspicious behavior, anomalous activity or rogue devices.
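A toy example of the second and third points: cross-checking observed flows against an asset inventory and an expected-flow list. The addresses and flows are made up for illustration:

```python
# Hypothetical inputs: an asset inventory and a sample of observed network flows.
KNOWN_ASSETS = {"10.0.1.5", "10.0.1.9", "10.0.2.14"}
EXPECTED_FLOWS = {("10.0.1.5", "10.0.2.14"), ("10.0.1.9", "10.0.2.14")}

observed_flows = [
    ("10.0.1.5", "10.0.2.14"),    # expected sensor-to-historian traffic
    ("10.0.1.77", "10.0.2.14"),   # source missing from inventory: rogue device?
    ("10.0.1.9", "203.0.113.8"),  # unexpected path to the internet: leak path?
]

alerts = []
for src, dst in observed_flows:
    if src not in KNOWN_ASSETS:
        alerts.append(f"rogue endpoint: {src}")
    elif (src, dst) not in EXPECTED_FLOWS:
        alerts.append(f"unexpected flow: {src} -> {dst}")

for alert in alerts:
    print(alert)
```

Real visibility tooling builds the inventory and flow baselines automatically, but the principle is the same: anything unaccounted for is a finding.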

You can’t protect what you can’t see. The first step to winning the IoT/IIoT security battle — whether in an IT or OT environment — is visibility. Once visibility is achieved, organizations have access to the information they need to fully understand their risk posture, prioritize security strategies based on this understanding, and use IoT/IIoT data and other next-gen technologies to advance business processes without introducing unnecessary risks.


March 1, 2019  3:20 PM

The fourth Industrial Revolution is here — are you 4IR ready?

Mohamed Kande Profile: Mohamed Kande
fourth Industrial Revolution, IIoT, IIoT strategy, industrial internet of things, Industrial IoT, Internet of Things, iot

The change is seismic. The impact, drastic. The potential is limitless. The fourth Industrial Revolution doesn’t have to be your enemy. We’re witnessing a shift where unforeseen technological innovation and advances are driving customer expectations. These demands often pose challenges to industries, companies and leaders to not only deliver on their customers’ needs, but understand how these technological advancements can benefit their businesses. Moreover, they must begin to anticipate market changes driven by technological innovation.

When tackling the fourth Industrial Revolution (4IR), there are four primary factors business leaders should consider:

  1. Strategy: Blending disparate emerging technologies to advance strategic goals is one of the key challenges in preparation for the 4IR. Quite simply, there is no off-the-shelf “4IR kit.”
  2. Growth: When a 4IR strategy is designed and carried out correctly, it is agile and resilient, capable of incorporating tomorrow’s new technology and business models with as little disruption or redesign as possible.
  3. Trust: Data powers the revolution. But customers’ willingness to entrust companies with sensitive information about their lives and businesses hinges on the quality of experience offered to them and knowing their data is protected.
  4. Workforce: Demand for digital talent is at an all-time high, and companies cannot wait for there to be enough graduates or trainees to fill it.

So, how do we address these considerations? Turn the revolution into your business’ evolution.

When faced with the challenge of digitizing our own business and understanding the technological disruption taking place across the market, we created several solutions. From reshaping our business model to bringing the creative industry inside our four walls, we have taken a series of steps to adjust to the market and inspire significant change within our PwC culture. We also focused on expanding our talent with different backgrounds and recently embarked on a digital upskilling journey with 55,000+ individuals across the U.S. The key here is that it is a journey.

As companies are looking to reshape their businesses to respond to technological innovation, they must keep the following principles in mind:

  • Understand both business model innovation and product innovation. Simply put, typical product innovation will no longer be enough. Leaders will need to balance product innovation with the more radical approach: business model innovation. Shifting the emphasis from product innovation to business model innovation moves the focus away from the organizations of today and toward the organizations of the future.
  • Expect an increase in cross-sector activity. Deep domain expertise has proven critical in traditional business models, but as businesses evolve, the flexibility to work and collaborate across sectors is imperative.
  • Hiring isn’t the sole answer on talent. We can’t expect to solve digital disruption by only hiring digital talent. The digital fitness of all staff, new and existing, is part of the culture shift necessary to motivate and retain your workforce, while preparing them for the needs of tomorrow.

It’s a journey. One that requires a highly adaptive mindset with a fluid approach, a deep knowledge of new technologies and an even broader understanding of how all parts of businesses are impacted by this revolution. Yes, it’s difficult laying out a strategy for a future with uncertainty, but the key is to start now.


February 28, 2019  4:25 PM

Overcome power budget limitations to achieve breakthrough IoT systems

Ori Mor Profile: Ori Mor
battery, battery charging, battery life, Internet of Things, iot, IoT hardware, IoT power, IoT software, IoT wireless, powering IoT, Wired, Wireless, wireless power

Companies are constantly striving to introduce new and unique connected devices to market, and there has been a great deal of innovation aimed at the home and business that transforms the way we live. Home security, for example, has been an emphasis of IoT ecosystem development and has already seen steady adoption in homes and businesses, from smart entry systems and security cameras to smoke alarms.

Similarly, Nielsen reported that nearly a quarter of U.S. households own a smart speaker. The IoT revolution is upon us, and there is no doubt that it has only scratched the surface of its potential. With that said, there is an anchor that I firmly believe is preventing these devices from saturating the market — and that is power.

Batteries are convenient, but constraining

Batteries that power IoT devices are both a solution and a problem. While they serve as portable power sources that are quick to install anywhere, batteries also impose strict power budgets and limit device functionality.

For example, while wireless security cameras may be easy for homeowners to self-install due to the lack of wiring involved, batteries are known to quickly deplete. Constantly changing batteries is a hassle, and homeowners are often left wishing they had invested in wired systems that offer features such as streaming video. Similarly, smart locks require constant battery replacements in order to power the device, and even more so if the user wants video recording, biometrics or cloud storage.

An executive from a smart home products company we recently spoke with gave some insight into challenges they face regarding powering IoT devices. “At first,” he said, “we wanted to make sure customers don’t have to replace batteries more than once a year, but we realized we could not fit the features we wanted under those power constraints.” He added, “Ultimately, we decided that the need to replace batteries every six months would be okay, so we could include the features most desired in the product.” Whether you agree or disagree with this executive, this exchange illustrates how battery power handcuffs solution designers.

Let’s assume the device is using four high-end AA alkaline batteries — likely too large for some IoT devices, but it’s a good reference point. If the batteries need to last one year, the average power consumption is only 0.002 watts. Compare that with wired devices that can take 1, 5 or even 50 watts. You can begin to see the problem. This huge gap explains the dramatic functionality difference between battery-operated and wired devices.
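The arithmetic behind that figure is easy to reproduce; the cell capacity used here is an approximation for a high-end AA alkaline:

```python
# Rough power budget for a battery-operated IoT device (capacity values are approximate).
CELL_CAPACITY_WH = 1.5 * 2.5   # one AA alkaline: ~1.5 V x ~2,500 mAh = ~3.75 Wh
NUM_CELLS = 4
TARGET_LIFETIME_H = 365 * 24   # one year of continuous operation

budget_w = NUM_CELLS * CELL_CAPACITY_WH / TARGET_LIFETIME_H
print(f"average power budget: {budget_w * 1000:.1f} mW")  # ~1.7 mW, i.e. ~0.002 W
```

Every feature a designer adds (radio wake-ups, video encoding, biometrics) must fit inside that couple of milliwatts, averaged over the year.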

Would wired devices serve as a better alternative?

While some may believe that wired devices are the solution, they carry their own set of problems. Consumers can’t easily install wired IoT devices themselves without having a background in electrical wiring or fishing wire through walls, and, at an average cost of $100 per hour for labor, professional installation is quite expensive. Also, if a problem arises with a wired IoT device, installation teams are required to make a service trip to the home or facility to troubleshoot and resolve the issue.

The concept behind long-range wireless power rests on three simple steps:

  1. An energy transmitter converts electrical energy into some physical phenomenon.
  2. That physical phenomenon travels through the air to reach an energy receiver.
  3. The receiver converts that phenomenon back into electrical energy.

There are different approaches to make wireless charging a reality, including radio waves, ultrasound and infrared light. When looking into options, product designers need to investigate a few key parameters:

  • How much energy can be available for the receiving device? At what distance?
  • What is the transfer efficiency?
  • How large are the energy transmitter and receivers?
  • Is it safe? Are the quoted values within the regulator’s (FDA, FCC, etc.) tolerable limits?
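These questions can be collapsed into a quick feasibility screen. The thresholds below are illustrative assumptions, not regulatory figures; real exposure limits come from the applicable FDA/FCC rules for the chosen technology:

```python
def link_is_feasible(delivered_mw: float, required_mw: float,
                     efficiency: float, exposure_mw: float,
                     exposure_limit_mw: float) -> bool:
    """Screen a candidate wireless-power link against the key questions:
    enough power delivered at distance, acceptable transfer efficiency,
    and emissions within the applicable safety limit."""
    MIN_EFFICIENCY = 0.10  # illustrative floor; acceptable values are application-specific
    return (delivered_mw >= required_mw
            and efficiency >= MIN_EFFICIENCY
            and exposure_mw <= exposure_limit_mw)

# Hypothetical candidate link: 5 mW delivered, device needs 2 mW,
# 30% end-to-end efficiency, emissions well under the assumed limit.
print(link_is_feasible(delivered_mw=5, required_mw=2, efficiency=0.30,
                       exposure_mw=1, exposure_limit_mw=10))  # True
```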

Using the light spectrum for energy delivery

One promising alternative is long-range wireless power that uses infrared (IR) light. It is promising for several reasons, the first being that IR light is natural light. About 50% of the sun’s energy is IR, so humans have already been living with IR for millions of years.

Light can travel in a thin, straight line. With a laser pointer, for instance, you can shine a small dot on a wall from far away. In the same manner, an IR transmitter can focus a tight energy beam on a small receiver. This means high efficiency: all or most of the transmitted energy reaches the receiver, and little or none leaks into the environment.

Energy transmission using IR light can safely deliver hundreds or even thousands of times more energy than batteries or other wireless charging methods, whose transmitted energy dissipates significantly the further it travels, limiting the usable energy available at the device. This combination of power, distance and safety holds significant promise for product designers and end users alike.
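The dissipation gap comes down to geometry. An isotropic (unfocused) radio transmitter spreads its power over a sphere, so a small receiver captures only the fraction of that sphere its aperture covers; a collimated beam can land most of its power on the same aperture. A rough sketch (the receiver size and the 80% beam efficiency are illustrative assumptions):

```python
import math

def isotropic_capture_fraction(aperture_cm2: float, distance_m: float) -> float:
    """Fraction of isotropically radiated power hitting a small receiver
    aperture at the given distance (inverse-square spreading)."""
    aperture_m2 = aperture_cm2 / 1e4
    sphere_m2 = 4 * math.pi * distance_m ** 2
    return aperture_m2 / sphere_m2

# A 4 cm^2 receiver, 5 m from the transmitter:
frac = isotropic_capture_fraction(4.0, 5.0)
print(f"Unfocused RF: receiver captures {frac:.2e} of transmitted power")

# A collimated IR beam, by contrast, can land most of its power on the
# same aperture; assume ~80% end-to-end here, an illustrative figure.
print(f"Collimated beam advantage: ~{0.80 / frac:,.0f}x more power delivered")
```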

What’s the future of wireless power and IoT?

Regardless of approach, wireless power is not always the right answer. Sometimes a battery provides plenty of energy for a particular sensor. If a device needs a new battery every five years, wireless power might not be a high priority. By the same token, if an IoT device needs tens of watts, wireless power might not be able to deliver the amount of power required. Sometimes, it is not practical to install an energy transmitter within reasonable distance of the client device.

Long-range wireless power can absolutely be useful as a third option for energy delivery. Where power cords are cumbersome and batteries are insufficient, wireless power offers an alternative. Designers equipped with more than batteries can create breakthrough IoT innovations, simplify installations and put the power back into consumers’ hands. Wireless power is not a perfect fit for every application, but forward-thinking companies will soon be weighing this technology and the benefits it provides.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

February 28, 2019  1:50 PM

Connectivity in smart(er) cities

Morné Erasmus Profile: Morné Erasmus
Internet of Things, IOT Network, IoT networking, IoT sensors, IoT wireless, Smart cities, smart city, smart city applications, smart city network, Smart sensors

More than 55% of the world’s population lives in cities, and by 2050, some 70% of the global population is expected to live in a city. In North America, city-dwellers already exceed 80% of the population. The urbanization of the world’s population will create huge challenges related to public health, pollution, energy, housing, safety and traffic. We need smarter ways to build and operate the cities of the future, and that’s why smart cities are a hot trend today.

In a smart city, there will be sensors everywhere, and IoT sensors, coupled with feedback loops through data centers or edge processing gateways, will fuel new applications and services. Think of streetlights that dim when no people are detected on a street, or sprinklers that turn on water only when needed, or parking sensors and cameras that direct traffic to available parking spots through a dynamic application.
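The streetlight example above is a simple sensor-driven feedback loop, which can be sketched in a few lines. The presence sensor and dimming levels here are hypothetical placeholders for whatever hardware a deployment actually uses:

```python
from dataclasses import dataclass

@dataclass
class Streetlight:
    """Minimal smart city feedback loop: dim when no one is detected."""
    brightness: float = 1.0  # 0.0 (off) .. 1.0 (full)

    def update(self, presence_detected: bool) -> float:
        # Keep a low safety level rather than switching off entirely.
        self.brightness = 1.0 if presence_detected else 0.2
        return self.brightness

light = Streetlight()
print(light.update(presence_detected=False))  # dims to 0.2
print(light.update(presence_detected=True))   # back to 1.0
```

In a real deployment the sensor reading would arrive over one of the wireless protocols discussed below, and the decision might run on an edge gateway rather than the pole itself.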

Connectivity is the first key challenge in building out these sensor networks. There will ultimately be billions of devices, and they will need connectivity back to a data center or edge processing unit so they can relay their data to a smart city application. Of course, many smart city sensors will connect via wireless networks using protocols such as Zigbee, LoRa, Wi-Fi, 4G and 5G, but there is no such thing as a wireless network without wireline connectivity behind it. Fiber remains one of the most robust ways to carry heavy fronthaul and backhaul traffic.

As a result, cities that have been familiar with providing rights of way for water, gas and electricity services will have to add a fourth utility — connectivity. Cities must plan and implement ways to enable broadband connectivity to all citizens and edge devices for their future connected cities.

Until now, it has largely been service providers’ responsibility to build out fiber networks, but this is changing in the smart city era. Cities need ubiquitous fiber coverage, but they can’t rely on service providers to build out these networks everywhere because carriers normally only invest in areas where they can receive a good ROI. Cities need to provide infrastructure for poorer neighborhoods to counter the “digital divide” by ensuring that connectivity reaches everyone.

There are many hurdles to be overcome. Regulatory issues, lack of technical expertise and the need for new financial models still burden broadband deployments, but we are starting to see innovative approaches from cities with the political willpower to advance their communities. For example, cities are crossing departmental silos — the department of transportation typically has fiber at intersections, and cities are looking at sharing these fiber assets. To ease fiber deployments, cities should coordinate among departments so that when one department — water and power, for example — plans to dig up a street, the city can lay conduit for future fiber, reducing costs and simplifying deployment. Streetlights are another example — cities are deploying smart light poles that incorporate concealment systems for sensors, cameras, wireless nodes, power systems and other components, and running fiber to them. When 5G small cells roll out, these cities will have desirable assets and a leg up on the competition.

It will be each city’s responsibility to address these challenges and make it easier to deploy technology within its borders, or else risk being left behind in the technology evolution and widening the digital divide. Cities already enable gas, water and electricity services, and now they need to enable connectivity — not necessarily pulling the actual cables or lighting up the fiber, but granting rights of way and providing the real estate for smart devices. In this way, cities can lay the groundwork for the rapid evolution and deployment of smart city applications and broadband connectivity throughout their communities.


February 28, 2019  12:00 PM

Convergence is the name of the game in IoT billing

Wojciech Martyniak Profile: Wojciech Martyniak
billing, converged services, Converged systems, Enterprise IoT, Internet of Things, iot, IoT billing, IoT monetization, IOT Network, IoT services, monetization, monetizing IoT, Network convergence, revenue streams

The internet of things is made up of many different moving parts. Most fundamental is the network of sensors embedded in the things themselves. And, of course, the provision of connectivity is imperative since there would be no network without it. Whether via Wi-Fi, Bluetooth or 5G, a range of protocols are employed to connect the myriad disparate components and allow them to communicate — sharing data with each other, with businesses or with service providers. This data also is essential to the functioning of IoT. Not only is its analysis critical in delivering the insight and intelligence required to inform business operations and processes, but it can represent a valuable revenue stream — if correctly monetized. Then, there are the security measures that must be put in place to protect the sensors and the data they generate.

With so many parts in play, it can be easy for businesses and operators to take a divide-and-conquer strategy, attempting to integrate a combination of products from different vendors for each element of their IoT network. Others might even adopt a one-size-fits-all approach to piecing together the various architecture and network elements in the hope that they will work seamlessly together on any given use case.

Neither of these approaches is recommended, however. For one thing, IoT is still relatively immature and we don’t know exactly what the future will hold in terms of its development, especially with the imminent arrival of 5G and all that entails.

On a more pragmatic note, service-level agreements can differ from deployment to deployment, and the quality of experience within a particular use case can vary depending on usage and environmental conditions. Events such as these can have a negative impact on billing, IoT monetization and, ultimately, the bottom line. In order to avoid this and to make their IoT networks more effective and lucrative, businesses and operators should therefore rethink their approach to IoT, and start looking at it instead as a converged service.

Benefits of a converged network

Sometimes referred to as a next-generation network, a converged network merges multiple diverse networks over one common, standardized architecture. This enables a more effective, efficient transport of various kinds of traffic, such as data, video and voice.

Traditionally, converged networks were designed to allow voice and data telecommunications to exist in harmony; the advent of IoT, however, means that converged networks are now required to accommodate M2M communication as well. Indeed, the convergence of network, data and services can allow for the seamless, automatic connection of devices integral to IoT, regardless of the devices’ location, vendor or operating system. Most converged networks run a variety of services and protocols over common, standardized infrastructure. When done right, this offers efficiency benefits, including interoperability and the ability to carry out system upgrades without downtime across the entire network. It also reduces the risk of having to rip and replace. As IoT technology evolves, a single converged network may significantly reduce an organization’s ongoing maintenance costs and can drive efficiencies in other areas of the business.

Preparing for an uncertain future

A converged services approach can, for example, be taken to billing. Whereas operators once billed solely for connectivity, the sophisticated requirements of IoT now mean they are able to bill their customers for devices, applications and bundled services too. By taking a converged services approach, and cooperating with different players in the IoT ecosystem, including CSPs, application providers and device vendors, operators are able to construct and implement bundled services and billing mechanisms that meet the needs of any use case, whether B2B, B2B2C or B2B2B. What’s more, such an approach allows operators to revise and update these offerings as conditions change over time, ensuring that they always provide the most appropriate — and profitable — offering at any given moment.
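At its simplest, converged billing means collapsing what were separate invoices for connectivity, devices, applications and bundles into a single rated bill per customer. A minimal sketch, with illustrative charge categories and amounts:

```python
from collections import defaultdict

def converge_bill(usage_records):
    """Aggregate per-service charges into a single converged bill.
    Each record is a (customer_id, service, amount) tuple."""
    bills = defaultdict(lambda: defaultdict(float))
    for customer, service, amount in usage_records:
        bills[customer][service] += amount
    return {customer: {"lines": dict(services), "total": sum(services.values())}
            for customer, services in bills.items()}

records = [
    ("acme", "connectivity", 120.0),
    ("acme", "devices", 45.0),
    ("acme", "applications", 30.0),
    ("acme", "connectivity", 12.5),  # a second connectivity charge, same bill
]
bill = converge_bill(records)
print(bill["acme"]["total"])  # 207.5
```

A production billing system would of course add rating rules, currencies and settlement between ecosystem partners, but the convergence step itself is this aggregation.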

Networking has evolved over the years as a result of the latest technological developments in the space, and is likely to continue to do so as IoT grows in size, maturity and complexity. And it’s this complexity that can hinder a business or an operator in its attempts at a successful IoT deployment. Rather than taking a patchwork approach to building an IoT network — or a pre-emptive strategy based on relatively little knowledge of how IoT will play out — a converged services approach is the most sensible means of implementing the ever-changing and evolving elements that make up IoT. Flexible and cost-effective while delivering high quality of experience across any use case, convergence is future-proof. And when the future is uncertain, this must be a consideration for any operator looking to monetize IoT and maximize return on investment.


February 27, 2019  3:52 PM

Breaking down the silos to enable full potential of IoT

Andreas Neubacher Profile: Andreas Neubacher
Enterprise IoT, Internet of Things, iot, IoT applications, IoT platform, IoT standards, IoT strategy, Standardization, standards

Mobile communications, based on the GSM standard, succeeded globally by enabling interoperability. With the availability of globally standardized technical interfaces, new business relationships and new forms of cooperation emerged.

Today, users can choose their mobile phones from any vendor, independent of the network operator or the vendor of the mobile network equipment. Before standardized GSM technology, none of this was possible.

IoT is following in GSM’s standardization footsteps

The market for voice communication and wide-area connectivity sits on top of a rich ecosystem of industry partners. The IoT market is less advanced, with only parts of what will be an efficient, standards-driven ecosystem developed so far. This is currently one of the main obstacles in the IoT story.

Today, there are plenty of standardized wireless and wired technologies for IoT connectivity within different verticals, including LTE-M, NB-IoT, 2G, Bluetooth, Wi-Fi, 3G, LTE, Ethernet, Profinet and so on. However, interoperability on higher protocol layers largely depends on proprietary technologies.

In practical terms, IoT customers and integrators can use connectivity from any network operator or select any radio module. However, it’s almost impossible for them to choose or develop devices independently of the platform they use for device and application management functions. As a result, end users are left with siloed applications, which in turn results in the loss of benefits such as data and vendor interoperability.

IoT platform essentials

Looking at the reference architectures of IoT platforms, most of them provide the basic functions that are common for many IoT use cases. These include functions for device management, security, registration, group services, subscription and notification services, etc.

Although these common functions share the same names, individual implementations differ in the absence of standardization. In addition, differences in protocol usage prevent device interoperability across different platforms.

Obstacles to IoT interoperability

To realize the full potential of IoT, “things” need to talk to and understand each other. An even greater obstacle to attaining this outcome is the different data models employed within the countless different proprietary implementations.

For instance, if someone were to call an end user on their phone and speak in Swahili, most users would probably not be able to communicate or achieve a meaningful result.

The same issue occurs in communication between things when they use implementation-specific data models.

Today, each vertical industry comes with its own fora and specification bodies to develop data models for that vertical. This is quite natural.

For example, in the industrial automation industry, organizations like OPC are working on data models and objects which can be used on the shop floor. In the automotive industry, ETSI’s Intelligent Transport Systems technical committee is working collaboratively to define messages and data models for communication between cars.

Many IoT applications also involve several partners in a distributed value chain. For instance, an intelligent application for an industrial plant might automatically order feedstock from one or more partners for its production line. Supplies are typically ordered and delivered by several partners spanning the industrial and transportation domains. It is easy to see how this scenario can end up in an “island of things” configuration, since different partners in the value chain belong to different verticals, each with their own specific data models.

How oneM2M standardizes interoperability

This is where the oneM2M standard comes into play. OneM2M functions as a software middle layer, interconnecting devices with their respective cloud-based application-infrastructure entities, independently of the underlying transport networks. In effect, it creates an abstraction layer that allows application developers to create value from their business and operational applications without having to deal with the technical protocols for connecting to and managing devices.

OneM2M solves the problem of implementation variances for common service functions. Its technical specifications provide a global standard for the basic functions, such as device management, security, registration and so forth, that all IoT use cases employ. The use of oneM2M specifications in field-deployed devices ensures data and vendor interoperability.

Furthermore, oneM2M provides global standardized APIs on the application-infrastructure side, where customers can interact with their device and/or even their own platform. On the device side, oneM2M’s APIs help developers tailor applications for their specific purpose without the need to master technical details about the underlying connectivity networks.
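To make the API idea concrete, here is a sketch of assembling a oneM2M contentInstance create primitive for the HTTP protocol binding. The originator ID and payload value are hypothetical, and the sketch only builds the request rather than sending it to a CSE:

```python
import json

def build_cin_create(origin: str, request_id: str, content: str):
    """Assemble a oneM2M contentInstance (resource type ty=4) create
    primitive for the HTTP protocol binding."""
    headers = {
        "X-M2M-Origin": origin,                   # originator (AE-ID)
        "X-M2M-RI": request_id,                   # request identifier
        "Content-Type": "application/json;ty=4",  # ty=4 = contentInstance
    }
    body = json.dumps({"m2m:cin": {"con": content}})
    return headers, body

# Hypothetical temperature AE posting a reading:
headers, body = build_cin_create("CAE-temperature01", "req-0001", "21.5")
print(headers["Content-Type"])  # application/json;ty=4
print(body)
```

Because the same primitive shape works regardless of which vendor's CSE receives it, the application developer never touches the underlying connectivity network.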

To enable end-to-end communication across different verticals, oneM2M provides the tools that enable various interworking possibilities. One approach is to map data models for shop-floor machines and sensors as oneM2M resource structures and vice versa. Since such interworking definitions are available for other verticals, such as automotive and rail, different verticals can communicate among each other relatively easily.

Working across the IoT ecosystem

The primary aim of oneM2M is to standardize the common services necessary to deploy and operationally support IoT applications across multiple verticals. This implies a horizontal focus, aiming for a high degree of reuse and cross-silo interoperability. Vertical sector requirements are also important to oneM2M standardization participants. In the manufacturing and industrial sectors, oneM2M established a liaison with the Industrial Internet Consortium. In Europe, oneM2M is working with Industrie 4.0 members, beginning with a workshop on February 12, 2019.

OneM2M is also actively involved with the Open Connectivity Foundation, targeting interworking opportunities for consumer IoT applications.

OneM2M’s own standardization continues to address new frontiers for interoperability and interworking with the development of its latest specifications, Release 4. Release 4 will encompass industrial, vehicular and fog/edge architectures. It also lays the groundwork for semantic interoperability and tools to help user adoption.


February 27, 2019  1:27 PM

Lean 4.0: Efficiency, flexibility, quality and continuous improvement

Angel Lazaro Profile: Angel Lazaro
Artificial intelligence, cloud, IIoT, industrial internet of things, Industrial IoT, Industry 4.0, iot, lean manufacturing, Lean strategy, PLC

Consumers nowadays want to buy personalized commodities to suit their particular needs. This means production runs have to be shortened or new settings phased in while the production lines are still underway.

At the end of the 19th century, Sakichi Toyoda, founder of the Toyota Group, began to champion the lean way of thinking, which would eventually develop into the full-fledged lean manufacturing methodology. This was a new production management model that set out to streamline production systems, cutting out or cutting down all activities that add no value for the end client in the production process. In 2011, the fourth Industrial Revolution kicked off, a new concept of the hyper-connected industry in which new, groundbreaking technologies boost the efficiency of production processes.

Let’s take a look now at how Industry 4.0 and its technologies might round out the main lean principles.

Perfect quality at the first time of asking
This is where computer vision techniques applied to the inspection of finished goods really come into their own. These techniques, combined with AI and deep learning, help pinpoint faulty patterns, flagging defective goods before they move on to the next production phase. Moreover, combined with cloud computing, we can carry out complex calculations that identify quality leaks and tolerance-margin shrinkage early enough to head off the corresponding defects.

Minimizing waste
Applying automation techniques and, above all, collaborative robotics will enable us to reduce the waste generated by production processes. Collaborative robots, for example, can share plant floor space with people without needing to be caged in.

Continuous improvement
IoT- or IIoT-based monitoring of process data, and even of the state of machines and people, will enable us to improve maintenance tasks, switching from scheduled to predictive maintenance, and to improve workers’ quality of life by monitoring working conditions and safety through wearables.
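The shift from scheduled to predictive maintenance often starts with something as simple as watching a sensor trend. A minimal sketch, where the rolling window and vibration threshold are arbitrary illustrative values:

```python
def needs_maintenance(readings, window=5, threshold=7.0):
    """Flag a machine for maintenance when the rolling average of its
    vibration readings exceeds a threshold, instead of waiting for a
    fixed service interval."""
    if len(readings) < window:
        return False
    recent = readings[-window:]
    return sum(recent) / window > threshold

# Vibration readings trending upward as a bearing degrades:
vibration = [2.1, 2.3, 2.2, 4.8, 6.9, 8.4, 9.1, 9.6]
print(needs_maintenance(vibration))  # True: recent average exceeds the threshold
```

In practice the threshold would be learned from historical failure data rather than fixed by hand, but the principle of acting on the trend rather than the calendar is the same.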

Pull processes
New consumption models, such as the Amazon Dash Button or mobile apps, together with cross-selling techniques based on consumer trends and big data techniques such as customer segmentation, make it possible to bring clients closer to final production.

The application of collaborative robotics as the basis of automation, instead of highly productive but non-reconfigurable machine tools, makes production processes much more flexible. Control system virtualization based on virtual programmable logic controllers adds still more flexibility to production systems.

These are only some examples of how lean manufacturing and Industry 4.0 are not, in fact, different trends, but rather two sides of the same coin. Applied shrewdly and within a proper cybersecurity framework, they can coexist and complement each other in the interests of improving the products and services we offer our customers.

