IoT Agenda


January 22, 2020  4:06 PM

IoT-based smart cities: Shaping the future we’ll live in

Sanjeev Verma Profile: Sanjeev Verma
industrial internet of things, Internet of Things, iot, IoT analytics, IoT and AI, IoT benefits, IoT devices, Machine learning, Smart cities, smart city applications, Smart sensors

The ever-growing population, along with rapid urbanization, will impact global cities the most. By one estimate, more than 180,000 people move to cities every day. This migration to metropolitan and urban areas strains resources and increases congestion.

However, there is a simple solution that enables cities to manage this large influx of people.

The transformation and adoption of IoT technology has empowered the development of futuristic infrastructure and a connected ecosystem that fuels sustainable economic development and a high quality of life. These IoT-based solutions have elevated traditional operations and created new services that make cities more efficient, secure and cost-effective. They offer exemplary benefits that make urban life more enjoyable and comfortable. Some of these benefits are:

Image: IoT-based smart cities (Source: Biz4Intellia)

Smart traffic

Conventionally, traffic management systems have many loopholes. For instance, at a four-way crossing, the traffic lights operate on the assumption that only a fixed amount of traffic will pass through the crossing in a given time interval. In practice, that is rarely the case.

In fact, it is quite difficult even to estimate and moderate traffic conditions. Even on a normal day, many factors, such as congestion at crossings, the speed of incoming traffic and driver behavior, play a role in determining traffic conditions.

IoT technology can be used effectively to manage the flow of traffic to some extent. It has enabled the development of high-end infrastructures that facilitate a smooth flow of traffic. Smart lights, signs, signals and even vehicles form a part of a connected vehicle-to-everything (V2X) network that enables the transportation of goods, services and people at an efficient rate.

An IoT-based traffic management system can monitor several variables, such as the location and speed of vehicles. Smart traffic signals can collect data from connected cars and manage lights to facilitate the flow of traffic. Drivers can, in turn, be notified to adjust their speed so that they encounter green lights at every crossing.
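As a minimal sketch of the idea, assuming hypothetical telemetry from connected cars, the Python below lengthens a signal’s green phase in proportion to the queue building up on one approach. The fields, thresholds and timings are illustrative, not a real V2X interface.

```python
from dataclasses import dataclass

@dataclass
class VehicleReport:
    """Telemetry a connected car might report to a smart intersection."""
    vehicle_id: str
    approach: str      # e.g., "northbound"
    speed_kmh: float
    distance_m: float  # distance to the stop line

def next_green_seconds(reports, approach, base_s=30, per_vehicle_s=2, max_s=90):
    """Extend the green phase in proportion to the queue on one approach.
    A real controller would balance all approaches, pedestrian phases and
    corridor-level coordination; this shows only the core feedback idea."""
    queued = [r for r in reports if r.approach == approach and r.speed_kmh < 10]
    return min(base_s + per_vehicle_s * len(queued), max_s)

reports = [
    VehicleReport("car-1", "northbound", 4.0, 12.0),
    VehicleReport("car-2", "northbound", 0.0, 19.0),
    VehicleReport("car-3", "eastbound", 52.0, 300.0),
]
print(next_green_seconds(reports, "northbound"))  # 34
```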

This connected network between vehicles and the traffic management system enables a smooth flow of traffic and also reduces greenhouse gas emissions, making city life healthier.

Efficient utility management

The smart cities of the future must boast an efficient supply of energy and water; without it, there is little meaning in calling a city smart. In the current scenario, where potable water and energy resources are constantly being depleted, IoT can help cities manage the supply of utilities to citizens.

IoT can be used by utility providers to manage resources efficiently. It institutes a robust distribution system for both water and electricity that helps provide uninterrupted supply to the citizens. Moreover, IoT can aid in the detection of water and electricity losses and empower end-users to reduce the consumption of these supplies.

IoT has also led to the development of smart meters and grids that help utilities distribute these resources based on the distinct requirements of individual localities. This equipment can monitor the amount of water and electricity delivered to end consumers, which further enables companies to bill their customers dynamically.
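As a rough illustration of what dynamic billing can look like, the sketch below prices a day of interval readings from a smart meter against a simple time-of-use tariff. The rates and peak window are made-up numbers, not any utility’s actual tariff.

```python
PEAK_HOURS = range(17, 21)                 # 5 p.m. to 9 p.m., illustrative
RATES = {"peak": 0.30, "off_peak": 0.12}   # $/kWh, made-up values

def dynamic_bill(interval_readings):
    """interval_readings: (hour_of_day, kwh) pairs reported by a smart meter."""
    total = 0.0
    for hour, kwh in interval_readings:
        rate = RATES["peak"] if hour in PEAK_HOURS else RATES["off_peak"]
        total += kwh * rate
    return round(total, 2)

readings = [(2, 0.4), (8, 1.1), (18, 2.3), (19, 1.9)]
print(dynamic_bill(readings))  # 1.44
```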

Effective waste management

Waste management is an important aspect of any city. Statistically, the average person generates more than 0.7 kg of waste every day. For clean and sustainable living in a city, it is important to collect and manage this huge amount of waste.

IoT-based smart waste management systems help municipalities reduce waste collection costs, keep bins from overflowing and even analyze the trash generated on a regular basis. This reduces the spread of disease and helps create healthy neighborhoods for citizens.

Advanced security

For many people, the biggest concern is finding a neighborhood that offers a safe environment. Along with the benefits mentioned above, IoT-based smart cities also provide surveillance via CCTV cameras.

Even though CCTV cameras are not a breakthrough piece of equipment, embedding them with facial recognition capabilities via AI and machine learning is remarkable. These devices can be used to detect suspicious people before they commit a crime. Furthermore, police and law enforcement can use them to conduct search operations and uphold law and justice in the city. For example, criminals can be recognized via AI-powered CCTV cameras, and the IoT system can then instantly alert the police.

In addition, IoT-based panic buttons and hotlines can be installed across a city to help emergency vehicles reach an accident site quickly. Smoke and fire alarms fitted in buildings and monumental sites can help life-saving authorities in a similar fashion.

The establishment of such advanced security systems helps law enforcement and emergency services respond quickly in emergencies. Ultimately, they create a secure and protected environment that helps citizens live peaceful and safe lives.

Final thoughts

The connected ecosystem developed from IoT provides countless benefits in terms of boosting efficiency, creating a sustainable environment and improving citizens’ lifestyles. Its amalgamation with other disruptive technologies, such as AI, blockchain and machine learning, can even automate day-to-day processes.

Smart cities are the future in which we’ll live. Investment in smart cities is expected to cross $710 billion by the end of 2023. With IoT, this vigorous change is already clearly visible.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

January 21, 2020  12:55 PM

If you can see it, the computer can see it too

Robert Schmid Profile: Robert Schmid
Agriculture, AI and IoT, Internet of Things, iot, IoT analytics, IoT and AI, IoT data, IoT data management, retail IoT, Smart cities, Smart sensors

If you’re exploring IoT-enabled technologies for your enterprise, don’t overlook an asset you might already have on site, such as video cameras. IP cameras, which are digital video cameras that use the internet to receive control data and send back image data, are one of the most powerful data-collection tools available for IoT networks. And your business might be sitting on this source of untapped data and value.

By design, video cameras can provide a nearly complete visual account of operations and security by collecting data from multiple sources, which is a scenario well suited to IoT. Unlike other parts of the IoT ecosystem, IP cameras have been around for decades, meaning the technology is more mature, with the robustness and dependability that come with time-tested upgrades, as well as the ability to support a range of integrated technologies. This maturity also means manufacturers, installers and system integrators are more likely to be aware of cyber risks and to have established best practices to prevent and monitor potential breaches.

IP cameras are also ubiquitous. What part of modern life doesn’t involve an IP camera? We use them to monitor security in our homes, businesses and cities. We communicate through video on our phones and computers. They help cars and trucks navigate. We use them for both entertainment and education. And nearly any of these cameras can be used as a business intelligence tool, provided the extra volume of data their digital video files generate is processed on the edge. Fortunately, hardware and software for edge processing abound, with providers such as Amazon offering solutions that can provide real-time, secure processing capabilities.
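As a rough sketch of that edge-first pattern (process locally, send only events upstream), the Python below uses OpenCV to diff consecutive frames on the edge device and emit a compact motion event instead of streaming raw video. The camera source, thresholds and event format are assumptions for illustration.

```python
import cv2  # pip install opencv-python

def motion_events(source=0, pixel_threshold=25, min_changed_pixels=5000):
    """Diff consecutive frames on the edge device and yield small event
    records when enough pixels change; raw video never leaves the site."""
    cap = cv2.VideoCapture(source)  # for an IP camera, source could be an RTSP URL
    ok, prev = cap.read()
    if not ok:
        raise RuntimeError("camera unavailable")
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev)
        _, mask = cv2.threshold(diff, pixel_threshold, 255, cv2.THRESH_BINARY)
        changed = cv2.countNonZero(mask)
        if changed > min_changed_pixels:
            yield {"event": "motion", "changed_pixels": int(changed)}
        prev = gray
    cap.release()
```

Only the small event records, a few bytes each, need to cross the network to the cloud. Consider these scenarios: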

Video analytics and retail

One reason brick-and-mortar retailers struggle to compete with e-commerce shops is a relative lack of data, such as how many people visit their shops and when, as well as data about shopping patterns, such as where people browse, where they linger longest and which items they handle most. Fortunately, there’s a physical solution for these physical stores. Discreet IP cameras can track customer movement and behavior, providing data that can reveal the underlying reasons for both successes and misses on the floor and help store managers course correct accordingly.

When this video data is integrated with other networked sensors and systems, such as beacons, physical retailers can benefit from the deep, insight-rich analytics ecommerce providers have relied on for years. These analytics can help spark improvements, such as optimized store layouts and staffing levels, more personalized customer service, merchandising and marketing, and table stakes such as security and loss prevention.

Smart cameras, smart city

Say you’re planning to drive into town for an important meeting, and you know parking can be a challenge. Using networked cameras with video analytics, city managers can send parking information to your phone via an application showing what’s available and where. Sweeten the mix with analytics backed by AI, and your drive can be cross-referenced with parking pattern data to predict where a free spot is most likely to occur when you arrive. You might even be able to reserve that spot. All starting with cameras.
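A toy sketch of the prediction step might look like the following: estimate, from past observations, how likely a free spot is at your arrival hour. The garage, hours and counts are fabricated for the example; a production system would fold in live camera counts and far richer features.

```python
# Hypothetical history: (hour_of_day, spots_free) observations from cameras.
HISTORY = {
    "5th-and-main": [(8, 12), (8, 9), (9, 3), (9, 1), (10, 0), (10, 2)],
}

def p_spot_free(garage, hour):
    """Share of past observations at this hour with at least one spot free."""
    obs = [free for h, free in HISTORY[garage] if h == hour]
    if not obs:
        return None  # no data for that hour
    return sum(1 for free in obs if free > 0) / len(obs)

print(p_spot_free("5th-and-main", 9))   # 1.0 -> usually something free at 9
print(p_spot_free("5th-and-main", 10))  # 0.5 -> a coin flip at 10
```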

Eye in the sky, food on the plate

Shifting from town to country, farmers are also using cameras on drones for visual assessments, including mapping canopy cover and drainage, assessing crop growth and counting plants to predict yield. As in the retail space, data from these cameras can be combined with data from other sensors measuring light, humidity, temperature or soil moisture for even more detailed crop analysis.



January 20, 2020  1:06 PM

A look back on eight IoT predictions

Richard Beeson Profile: Richard Beeson
Internet of Things, iot, IoT data, IoT device management, IoT predictions, iot security, IoT strategy

Back in early 2015, I made a series of predictions about the trajectory of IoT at my company’s user conference. The talk took place five years ago, but things move fast in IoT, so I looked back to see how I fared with my predictions.

1. An explosion of devices

Prediction: 50 billion connected devices by 2020

Reality: Technically wrong, but right in the larger scheme of things. The 50 billion devices prediction dates to the early part of the decade, promoted by Cisco and Ericsson. Cisco later upped its forecast to 500 billion by 2030. Gartner predicted 20.4 billion, 451 Research 8.8 billion and IDC 41 billion, but these figures didn’t include PCs, IT equipment and phones. 451 Research also used a stricter definition.

The predictions were a bit optimistic, but here’s the interesting part: The growth rate is faster than expected. In 2016, the world had 17.6 billion IoT devices and was on track for 30.7 billion by 2020, according to IHS Markit. In 2018, IoT devices were grazing the 31 billion mark, according to IHS. The forecast for 2030? 125 billion.

If you look at some successful categories, such as WiFi, smartphones and LCD monitors, you see a pattern of forecasts being regularly upgraded. Thus, the growth rate points to positive outcomes, though I am now questioning the real value drivers behind this uptake.

2. Data variety will blossom

Prediction: We’ll see greater variety of data types, more complex data types and unusual combinations of data. For example, sensor or device specific context, such as GPS and location data, will be combined with different sources to produce unexpected insights.

Reality: Yes, there is a proliferation of new cost-effective sensing technologies and sensors that are leveraging a broader range of human senses, such as acoustic, vision and smell. These technologies are also sensing phenomena way outside the bounds of human capability, driving incredible improvements in physical and situational awareness.

But what’s equally interesting is the unusual combinations and use cases. For example, insurance companies have begun to experiment with IoT devices to monitor the health of pipes as a way to reduce water-related claims. A mining company in Canada, Syncrude, uncovered a driver safety problem while mining oil temperature data.

What’s the fastest-growing market in IoT? Fast food, according to Charlie Wu, product manager at Advantech.

3. Big data will get bigger

Prediction: Cheap sensors and proliferating techniques for extracting value from data will mean that the conventional idea of a refinery having 100,000 or even 500,000 sensors gets “blown away.”

Reality: Correct, but I missed a bigger issue. The total amount of data continues to double every two years. Real-time data has also become one of the fastest growing categories of information and will double in its demographic heft to 30% of the data population by 2025, according to IDC.

Here’s the twist: In industry, we’re not seeing new sensors as much as companies taking advantage of data they already collect. For example, a graduate student working for Lonza — which makes specialty ingredients for food and pharmaceutical companies — figured out a way to potentially increase capacity by 15 to 20% using data Lonza already had.

4. Devices get dynamic

Prediction: Companies will shift from fixed-location and fixed-function sensors, shifting instead to dynamic sensors and devices that can be configured for different applications and locations.

Reality: I was expecting a move from specialty appliances and sensors to agile and dynamic solutions, equivalent to the “Software Defined X” phenomenon, just as I am expecting this to take root in the processes and operations of traditional manufacturing. Efforts we see in Open Process Automation hint at this agile approach.

The dominant paradigm is still mirrored in companies such as Petasense, which have successfully brought devices like this to market. The declining cost of hardware might also result in fewer dynamic devices and more rip-and-replace devices. For example, EDJX is coming out with nano servers made from inexpensive or used parts that can be run to failure.

I am now more conservative about when I expect this phenomenon to reach critical impact. Which paradigm wins remains to be seen, but I don’t expect it to be either-or.

5. Protocols gone wild

Prediction: Vendors will develop their own protocols to meet their specific needs, whether driven by the nature of the information, the overall architecture or cutting-edge evolutions in technology. Unfortunately, that will create confusion and incompatibility.

Reality: Unfortunately, this is true. While industry standards such as OPC UA have achieved broad adoption, each vendor tends to implement the standard differently, resulting in incompatibilities. If history is any guide, many of these problems will work themselves out as time goes by. Meanwhile, networking is seeing different standards evolve for different tasks: 5G serves devices that require constant, fast and high-volume bandwidth, while LoRa targets applications that generate less data, less urgently. Over time, expect a few to emerge as favorites for particular tasks and use case scenarios.

6. Community and data sharing

Prediction: Data sharing between organizations presents tremendous opportunities for efficiencies, but questions about security and privacy will invariably make acceptance gradual.

Reality: True, but there’s more to it; some sharing is occurring. For example, Eli Lilly monitors contract manufacturers in real time to help ensure its product quality. Similarly, YPF monitors the performance of its third-party wind power providers.

We see significant advances in supply chains. And there are massive needs for community and data sharing if we are going to solve the future energy generation and distribution challenges as renewables such as distributed energy and microgrids continue to disrupt the centralized approach to energy. But still, widespread communities are not here yet.

One big thing I missed here was blockchain, which could accelerate communities by enabling decentralized solutions.

7. Computing will become geo diverse

Prediction: While a building or a factory is rooted in one spot, the computing resources to better optimize them could be anywhere, and increasingly will be exploited by a wider variety of people. Public clouds will be necessary to efficiently manage this intelligence.

Reality: Partly right. Public clouds play a critical role in helping companies analyze large data sets or handling certain types of applications. But it’s becoming increasingly clear that a significant percentage of compute tasks and data storage will continue to take place locally, either inside facilities or devices. Concerns about latency, bandwidth availability and bandwidth cost are driving a shift to hybrid computing architectures.

To put it another way, we regularly swing from eras of centralized to distributed computing architectures. IoT is pushing the pendulum toward distributed.

8. Data silos will rear their ugly heads

Prediction: Controlling every aspect of an IoT application, such as device analytics and data, might be good for vendors, but it’s terrible for customers.

Reality: Luckily, it’s not happening as much as it could have, as consolidation and exits in this market have helped. In addition, the approach is evolving: IoT platforms will increasingly not be offered as monolithic services. Rather, companies will focus on a few core areas of expertise, such as analytics, device management and security, and customers will weave together platforms from these offerings.

Final thoughts

I think I generally predicted the direction of the market. What’s going to be most interesting is to watch the development of data communities. Technology can be easy to predict because hardware gets better and cheaper while algorithms become more accurate. Human behavior and cooperation are more of a wild card.



January 17, 2020  4:21 PM

2020: The year IoT really gets to work

Steve Wilson Profile: Steve Wilson
Internet of Things, iot, IoT adoption, IoT and AI, Virtual assistant, workplace IoT

IoT devices have changed the way we live. Now, they are reshaping work.

There’s a lot of hype around IoT devices. But the hype is about to get real. The technology that we use to help us at home has made its way into the office. And it’s about to change the way we work in monumental ways.

Most of us start our work days by logging in and pulling up our calendars to see where we need to be. Then we open our inbox and wade through email to see what we need to do. Then we start navigating multiple apps to complete mundane tasks like filing expenses and approving time off. Then we begin searching for the information and insights we need to do our real work. It sucks up the bulk of our time and leaves us feeling frustrated and unproductive at the end of the day.

But work is about to get simpler and smarter. And we will be more engaged and productive as a result. IoT devices are no longer confined to our kitchen counters and nightstands. They’re on our desktops and conference tables and embedded in our digital workspaces. Now when we log on, we don’t have to search for anything. The most pertinent tasks and insights we need to focus on are automatically delivered to us in an intelligent feed on the devices we prefer to use — our phones, laptops, tablets and even smart watches. With the click of a button, we can execute mundane tasks in seconds. And it’s all delivered in context with intelligence, so we can focus on meaningful work and create value.

It’s exciting stuff, but according to James Bulpin, senior director of engineering at Citrix, we haven’t seen anything yet. I recently sat down with James to get his take on what the future holds when it comes to IoT devices and the future of work.

Steve Wilson: IoT devices have certainly changed the way we live. And while things have been painfully slow on the work front, they seem to be picking up speed. What’s driving this?

James Bulpin: Artificial intelligence-powered, voice-controlled IoT assistants like Alexa, Google Assistant, Siri and Cortana are already commonplace in many homes. This popularity is building pressure to integrate virtual assistant functionality into enterprise technology, as well, which will significantly reconfigure and enhance the employee experience. By 2021, Gartner, Inc. predicts that “25% of digital workers will use a virtual employee assistant on a daily basis. This will be up from less than 2% in 2019.”

Wilson: How will we interact with our IoT devices at work?

Bulpin: Currently, the relationship between humans and IoT devices, such as virtual assistants, is predominantly transactional, relying on simple voice commands like “Alexa, tell me the weather forecast for this afternoon.” However, the AI and machine learning that power IoT devices are progressing rapidly, and in the near future, things like virtual assistants will be far more than just a voice or chatbot interface. In fact, the virtual assistant is likely to become a pervasive form of intelligence across the workplace that can surface through all digital platforms and resources, including data and apps, helping individuals accomplish their daily tasks more efficiently.

Wilson: So are they going to put us all out of jobs?

Bulpin: Such innovation will always trigger some concerns over the technology’s impact on people, their job security and the demand for skills. But it’s more likely that IoT devices will make work a better experience for everyone. Workers will always face certain limitations, such as personal capacity for work or mental processing power. IoT devices can help people and organizations do and achieve things they couldn’t otherwise. Eventually, we envisage the creation of a level playing field between workers and their devices, built upon a relationship of mutual trust and collaboration, where the device undertakes more routine tasks for the individual, allowing them to focus on delivering their best work.

Wilson: How do you see the IoT market evolving?

Bulpin: The natural-language processing of voice recognition technology is growing steadily in sophistication, and eventually, conversations between an individual and their devices will be peer-to-peer, indistinguishable from human-to-human conversations.

Beyond this, the next logical step is for IoT technology to have the ability to understand human gestures. We’re already exploring the potential of gesture-recognition technology in all its forms, which will enable devices to interpret priorities and passion points, for example by identifying when human gestures have become more animated. Gestures could include pointing, eye gaze, and arm movement.

And before long, they will begin to independently solve problems and make proactive suggestions for workers. They will have the ability to calculate an individual’s workload, perhaps suggesting when to take a break, as well as to highlight the tasks that should be prioritized or delegated. By this point, workers may come to appreciate that IoT devices might even know “best” based on analysis of previous behavior and patterns.

Wilson: So it really is all about augmenting employees and empowering them to use their special skills to do meaningful work and create value.

Bulpin: Exactly. Ultimately, IoT devices will help an individual organize their work and tasks to keep them productive, while also understanding their personal capabilities. They may also begin to take on some monotonous, repetitive tasks to assist workers further, allowing them to spend more time engaged in high-level thinking, creativity and decision-making, making it possible to focus on the best, most interesting work most of the time.



January 16, 2020  3:36 PM

How to ensure RPA delivers

Pat Geary Profile: Pat Geary
connected RPA, digital workers, Internet of Things, iot, RPA, RPA benefits

With all the industry hype and confusing claims coming from various robotic process automation vendors, gaining clarity on the “different flavors” of this software is now a major issue. Without a definitive understanding of what robotic process automation is, organizations risk choosing the wrong options. Gartner agrees and predicts that through 2021, 40% of enterprises will have robotic process automation buyer’s remorse due to misaligned, siloed usage and inability to scale. Unfortunately, these issues typically reveal themselves after a proof of concept has been completed and attempts are made to scale automation programs more broadly.

To avoid these pitfalls, robotic process automation (RPA) must enable operational agility while not replicating slow and expensive IT projects, and still meet all the governance, security and compliance requirements. To achieve these goals, RPA must be underpinned by no-code, collaborative, business-led design principles that offer a platform for humans and digital workers to deliver automation capabilities. RPA must also deliver on its promise of being a potentially transformative tool — especially at scale — by enabling business users to inject greater speed, accuracy, productivity, efficiency and innovation.

To fully realize these outcomes, here are a few key must-haves to consider when selecting RPA technology:

A no-code, business driven initiative

Due to a scarcity of software development skills, those RPA technologies that require coding effort will soon be stuck in the work queue and suffer the same delays as traditional IT projects. The IT department’s proper role in RPA should simply be to uphold the necessary governance, security and compliance requirements for business-sustainable transformation.

Therefore, consider a connected-RPA platform that solves the long-standing integration challenge of system interoperability by repurposing the user interface and providing code-free connectivity with any system. This innovation also enables digital workers to use and access the same IT systems and mechanisms as humans, so they can automate processes over any past, present and future system independently of machine APIs.

Being business-led also means that non-technical users shouldn’t have to build or program digital workers. They should be able to use an intuitive interface to train and control them, creating automated processes by drawing and designing process flowcharts and then publishing them to a secure, central system. While designing the processes, users produce a unique process definition language that both robots and humans understand, which also eliminates the need for coding.
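As a loose illustration of that idea, and not any vendor’s actual process language, the sketch below captures a “drawn” flowchart as plain data that a digital worker walks step by step. All step names, actions and the format itself are hypothetical.

```python
# A "drawn" flowchart captured as plain data (a hypothetical format).
process = ["open_billing_system", "fetch_new_invoices", "enter_into_erp", "log_result"]

# What each step means, registered once by the platform (also hypothetical).
actions = {
    "open_billing_system": lambda ctx: ctx.update(session="billing-ui"),
    "fetch_new_invoices":  lambda ctx: ctx.update(invoices=["INV-101", "INV-102"]),
    "enter_into_erp":      lambda ctx: ctx.update(entered=len(ctx["invoices"])),
    "log_result":          lambda ctx: print(f"entered {ctx['entered']} invoices"),
}

def run(process, actions):
    """The 'digital worker' walks the published process one step at a time."""
    ctx = {}
    for step in process:
        actions[step](ctx)  # each step reads and updates a shared context
    return ctx

run(process, actions)  # prints: entered 2 invoices
```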

Deliver real value

To offer real business benefits, a digital worker should also be a pre-built, highly productive, human-like, multi-tasking, full-service, modular and autonomous device that understands the context of processing tasks. These intelligent capabilities enable them to perform activities in the same way as humans do, though faster and more accurately, while also working with and learning from humans, other robots and systems.

The universal system connectivity of these digital workers can also enable business users to deploy AI, natural language processing, intelligent optical character recognition, communication analytics, process optimization and machine learning capabilities.

Drive collaboration and scale

Agile transformation at scale is only ever achieved through centralized effort, so insist on collaboration. RPA must enable users to share, reuse and expand automation capabilities by contributing their published process automations into a centralized repository. This enables the whole enterprise to manage, share, improve and re-use these assets. This centralized design also provides clear audit trails of all process automations and greater security, both of which are key factors in driving successful RPA outcomes.

To ensure that business-led and IT endorsed RPA projects are successfully sustained as they evolve, consider a vendor that employs a Robotic Operating Model delivery methodology. This defines how to most effectively identify, build, develop and automate processes at scale across an organization.    

Alternative RPA options

Within this rapidly growing market, buyers are being misdirected toward a whole swath of RPA-branded tools that diverge from RPA’s original no-code, collaborative and business-led philosophy. These RPA vendors promise easy-to-use, instant record-and-deploy automation tools, but the reality is that they have code-heavy deployments, endless debugging activities, code-based versioning and project-artefact management, as well as dependency and change management overheads.

Let’s be clear: These automation technologies offer limited scaling capabilities, which is no good for businesses looking to use these tools to transform their operations.

The problem with desktop recording and the notion of a personal software robot is that a single human user is given autonomy over a part of the technology estate — their desktop — which introduces a lack of control and creates multiple security and compliance issues. Desktop recording spells trouble for the enterprise as it captures choices based on an individual’s interpretation of a process versus a central consensus on the best path. This obscures a robot’s transparency and hides process steps that, when duplicated over time, become a potential security threat and a limit to scale.

There are two more major drawbacks of the desktop approach to automation. Firstly, if a robot and a human share a login, no one knows who’s responsible for the process, and this creates a massive security and audit hole. Secondly, if a robot and a human share a PC, there’s zero productivity gain as humans can use corporate systems as fast as robots. This approach doesn’t save any time or make the process any easier for the user.

Many of these desktop automation deployments never get beyond simple sub-tasks, which have been executed using an agent’s login and run on their own desktop. Although helping with that task, they deliver very limited capabilities and are not transformative at all.

Ultimately, choosing the wrong brand of RPA can limit the scale and potential of automation to the confines of the desktop and introduces a variety of risks. However, connected-RPA provides the platform for business-led collaboration, securely and at scale. In fact, by using this version of RPA, more than 1,800 of the world’s large organizations are achieving major productivity increases, greater innovation and improved processes so that they can stay agile and ahead.



January 15, 2020  2:15 PM

The rise of thing commerce and what it means for software development

Antony Edwards Profile: Antony Edwards
connected devices, Ecommerce, Internet of Things, iot, IoT and AI, IoT devices, IoT software, IoT strategy, Thing Commerce

As connected devices continue their explosive growth, thing commerce is moving from niche to mainstream. So, what exactly is thing commerce? It’s where connected machines, such as smart home appliances and industrial equipment, make buying decisions for people, either by taking direction from customers or by following a set of rules, context and individual preferences. Thing commerce will ultimately include buying things, reporting a problem, requesting services and negotiating a deal.

The rise of thing commerce will see more companies and consumers interacting with virtual assistants in smart appliances that can make purchases on their behalf. The days of remembering to buy milk and of making sure fresh produce is not past its sell-by date will be over. These tasks will all be handled by interconnected machines that deliver a frictionless commerce experience for customers.

However, before this utopian reality can happen, businesses need to rethink how they develop and test the software and systems to support this new age of commerce. There are three core elements that thing commerce providers must embrace as they build and deliver software and applications:

Test the user experience

With thing commerce, there are multiple products and services composed of a variety of technologies from an array of vendors. As a result, development teams across the ecosystem need to reorient from testing code compliance to understanding the actual user experience.

Embracing a user-centric approach to testing ensures you identify errors, bugs and performance issues before they have the chance to impact the user experience. This requires adopting an intelligent test automation platform.

Intelligent automation and bug hunting are mission-critical

The only way to truly test the thing commerce ecosystem from the user perspective is to use an intelligent automation engine. Intelligent, AI-driven automation creates a model of user journeys and then automatically generates test cases that provide thorough coverage of the user experience, as well as system performance and functionality.

In addition, the AI algorithms hunt for errors in applications based on user journeys automatically generated from this bug-hunting model. This approach enables teams to quickly find, identify and address problems before release.
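A simplified picture of journey-based generation, with no claim to match any particular vendor’s engine: model the user journeys as a directed graph of screens, then turn every start-to-finish path into a test case.

```python
# Illustrative user-journey model: each screen lists the screens reachable next.
JOURNEYS = {
    "home":     ["browse", "search"],
    "browse":   ["item"],
    "search":   ["item"],
    "item":     ["cart"],
    "cart":     ["checkout"],
    "checkout": [],                 # terminal state
}

def test_cases(graph, start="home"):
    """Enumerate every start-to-terminal path; each path is one test case."""
    cases = []
    def walk(node, path):
        path = path + [node]
        if not graph[node]:
            cases.append(path)
        for nxt in graph[node]:
            walk(nxt, path)
    walk(start, [])
    return cases

for case in test_cases(JOURNEYS):
    print(" -> ".join(case))
# home -> browse -> item -> cart -> checkout
# home -> search -> item -> cart -> checkout
```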

Continuous testing, continuous learning and predictive trends

Testing any digital experience is not a one-and-done exercise. It must be a continuous process so that you’re monitoring the digital experience over time. An AI algorithm will watch test results, learn and look for trends. The learning algorithms will enable predictive analytics. For example, it can identify if the increasing delay in a particular workflow is likely to result in the connected system failing to replace out-of-date produce before the family meal.
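The watch-and-learn step can start as simply as fitting a slope to recent results. Below is a minimal sketch that flags a steadily growing workflow delay before it becomes a failure; the measurements and threshold are invented for the example.

```python
def trend_slope(samples):
    """Least-squares slope of a measurement series (units per run)."""
    n = len(samples)
    mean_x = (n - 1) / 2
    mean_y = sum(samples) / n
    num = sum((i - mean_x) * (y - mean_y) for i, y in enumerate(samples))
    den = sum((i - mean_x) ** 2 for i in range(n))
    return num / den

# Checkout-workflow latency in ms over the last ten test runs (made-up data).
latency = [410, 415, 430, 428, 445, 452, 470, 468, 490, 503]
slope = trend_slope(latency)
if slope > 5:  # alert threshold, illustrative
    print(f"latency rising ~{slope:.1f} ms per run; investigate before it fails")
```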

Final thoughts

Thing commerce promises a world of possibilities that will free people from many mundane chores. However, for thing commerce to realize its potential, it’s essential that organizations change the way they develop software to ensure it delivers a consistent digital experience that delights customers. If not, thing commerce might erroneously deliver 20 bottles of milk to a smart fridge, delighting no one and incurring a lot of friction along the way.



January 13, 2020  4:20 PM

Operational IoT must be seen to be secured

Reggie Best Profile: Reggie Best
Endpoint devices, Endpoint management, Firewall security, Internet of Things, iot, iot security, Operational technology

By correlating a comprehensive understanding of your enterprise’s active IP address space against known threats as new data becomes available, including IoT and OT endpoints as they are connected to the network, you have intelligence you can act on.

A big barrier to effectively securing IoT and operational technology devices is simply not knowing they are there. Lack of visibility has been a recurring theme in FireMon’s annual “State of the Firewall Report,” as has managing complexity. The report doesn’t even begin to dig into the impact of IoT growth.

This year’s survey found that 34% of respondents reported having less than 50% real-time visibility into network security risks and compliance. From a firewall perspective, respondents are dealing with a lot of complexity – nearly 33% reported having between 10 and 99 firewalls in the environment, while 30.4% reported having 100 or more. Additionally, nearly 78% are using two or more vendors for enforcement points on their network, while almost 60% have firewalls deployed in the cloud.

Given the challenges firewalls create for security professionals, you can imagine how the exponential growth of IoT endpoints is compounding complexity. This is partially because they behave differently and, in turn, must be onboarded and managed differently.

IoT and operational technology endpoints are driving enterprise network growth

IoT visibility has become a crucial area in the security market, and more traditional vendors — including Palo Alto, Checkpoint, Forescout and Cisco — are responding accordingly by acquiring IoT expertise and operational technology (OT) know-how.

As data center workloads migrate to cloud computing and infrastructure-as-a-service delivery models, a significantly larger percentage of the enterprise network will be comprised of IoT and OT endpoints. Previously siloed systems — such as security cameras and sensors, turnstiles, badge readers and even building control systems — are converging with more traditional enterprise endpoints, such as desktops, laptops and servers, into a single, fluid IP-based infrastructure.

With everything on one network that no longer has a clear and defined perimeter, threats can easily migrate between the smarter, evolving OT areas into the IT domain, which makes visibility more essential than ever.

You must have visibility to manage IoT and OT complexity

Obtaining the level of visibility demanded by an environment populated with IoT and OT endpoints requires automation, something FireMon’s report also identified as a frequent pain point for respondents.

IoT and OT endpoints demand automation, both from an initial discovery perspective and from an ongoing status perspective. Given the nature of the devices, endpoints such as security cameras and turnstiles can be added in large volumes at once or on a piecemeal basis. Devices must also be checked regularly to ensure they are operating normally.

Visibility means having a consistent view of all these endpoints, including basic characteristics such as connectivity and device function. It also means understanding the infrastructure it’s connected to and how even a simple OT device is in a position to affect more complex IT operations if it’s not properly provisioned from a security perspective. The way these devices connect to the network can open unexpected and unwanted paths into the heart of the organization. All it takes is one leak to drastically affect security and compliance posture.

To achieve adequate visibility, IT admins must see IoT and OT endpoints being onboarded in real time so they can automatically apply global security policies and segment devices. This is necessary to limit the negative impact of any anomalous activity, which must be easily detectable for security teams to respond proactively.

Establishing visibility of IoT and OT endpoints as part of the broader IT landscape enables you to begin tackling the unique complexity they bring to the network.

Diversity and variety compound complexity

The diversity of IoT and OT endpoints should not be underestimated. In the same way multi-cloud environments add to complexity and confusion over shared security responsibility, there are new device behaviors security professionals must be ready to handle.

Just as servers, desktops, laptops and smartphones can all begin to misbehave and pose a threat to the corporate network, so can the many IoT and OT devices added to a fluid IP infrastructure. The failures and glitches of more traditional hardware tend to be par for the course for security teams, but the variety and diversity of IoT and OT endpoints make things more complicated, especially when a device functions in an unexpected way.

The consequences of these endpoints being compromised have significant ramifications. In the healthcare realm, the devices can be lifesaving, and in many other scenarios such as energy generation and delivery, water and waste management, and traffic control, their security is paramount to keeping people safe and maintaining quality of life for entire communities.

Because most of these endpoints are embedded, enclosed devices, often the ability to secure them using agents — as with more traditional IT endpoints — is somewhat limited. This means visibility, discovery and management must be much more network centric. In the same way IT security teams have evolved to manage and monitor hybrid cloud environments, IoT and OT endpoints have further diversified the environment to create a more dynamic infrastructure.

A unified view requires a network-centric approach

Reducing complexity and increasing real-time visibility means having a single platform that will discover, monitor and remediate when necessary — not just cloud, virtual, physical and software-defined network infrastructure, but also the proliferating IoT and OT endpoints that merge with traditional IP infrastructure.

A network-centric approach solves the IoT and OT device conundrum because it discovers and monitors the cloud accounts, network paths and endpoints inherent to traditional IT infrastructures. It also watches for changes in real-time to identify new leak paths that might be created by IoT/OT environments. By correlating a comprehensive understanding of your enterprise’s active IP address space against known threats as new data becomes available, including IoT and OT endpoints as they are connected to the network, you have intelligence you can act on.
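In code, the core of that correlation step can be stated in a few lines. The sketch below is purely illustrative (the inventory, addresses and feed are invented), not FireMon’s implementation; real platforms match far richer context than bare IP addresses.

```python
# Discovered endpoint inventory, as a real-time discovery process might build it.
INVENTORY = {
    "10.1.4.20":  {"type": "badge-reader", "segment": "facilities"},
    "10.1.4.31":  {"type": "camera",       "segment": "facilities"},
    "10.2.9.107": {"type": "server",       "segment": "datacenter"},
}

def actionable_alerts(inventory, indicator_update):
    """Yield endpoints whose addresses appear in newly published indicators."""
    for ip, meta in inventory.items():
        if ip in indicator_update:
            yield {"ip": ip, **meta, "action": "isolate segment and review paths"}

new_indicators = {"10.1.4.31", "203.0.113.9"}   # fresh threat-feed entries
for alert in actionable_alerts(INVENTORY, new_indicators):
    print(alert)  # {'ip': '10.1.4.31', 'type': 'camera', ...}
```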



January 9, 2020  4:30 PM

Positioning IoT to profit through product packaging

Cris Wendt Profile: Cris Wendt
Internet of Things, iot, IoT business model, IoT data management, IoT monetization, IoT products, IoT strategy

This article is the fifth in a six-part series about monetizing IoT. Find the previous article here.

A few common approaches can be used to bring products to market to create diverse revenue streams within IoT. So far, this series about monetizing the IoT stack has defined various elements of the stack that can be monetized, introduced the three dimensions of monetization and described two of the three dimensions of the monetization framework for IoT: monetization models and monetization metrics. This article will address the third dimension of the monetization framework for IoT, known as product packaging.

Product packaging is the methodology for partitioning product functionality to bring different products to market with a variety of offerings, such as a single product leveraged by different SKUs to generate multiple revenue streams. This is another area where art meets science. Product management can be creative in meeting market needs and revenue goals with different approaches to partitioning products’ functionality.

Product packaging examples

Product packaging can be a very broad topic, so I will identify a few common approaches. For example, imagine a fictitious IoT utility equipment provider, Sensorytics. Sensorytics has a SaaS analytics platform that’s sold to various municipalities throughout the world.

The analytics platform ingests utility smart meter information, such as gas, water and electricity, and stores and uses that information to perform various analytics, reporting and alerting. In terms of relating this to the IoT stack, we’ll concentrate on the cloud aggregation and analysis part of the stack, but the principles apply to each level of the stack as well as to multiple elements of the stack combined into larger bundled offerings.

The analytics platform can provide value to the different types of utilities that a municipality offers. The product manager at Sensorytics initially decides to create three offerings, one for each utility:

Market-based structures

The initial step is to create a base analytics package for each of the three different types of utilities. Each market might see different value in the basic offering, so each package provides a slightly different set of basic reports specific to that market. Initially, the product manager has three offerings:

  • Analytics for Gas Management
  • Analytics for Water Management
  • Analytics for Electricity Management

Persona-based structures

Digging deeper into the gas market, the product manager identifies three additional specialized offerings, each addressing a persona that sees different value in a different set of analyses and alerts. Upon release, the product manager has three additional offerings:

  • The finance department manager of the utility company is interested in a Gas Analytics for Billing to focus on gas consumption for billing services.
  • The service department is interested in Gas Analytics for Services to identify possible problem areas in the gas lines in order to proactively react to problem areas that have been identified.
  • The marketing department is interested in Gas Analytics for Usage in order to optimize pricing based upon usage patterns.

The product manager decides that the offerings will be structured in such a way that the gas utility company must first buy Analytics for Gas Management before it can buy the more specialized persona-centric offerings, as in the sketch below.
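Packaging rules like this can be encoded as plain data with a small validation step at order time. The sketch is hypothetical; the SKU names stand in for the offerings above.

```python
# Each specialized SKU lists the SKUs it requires (hypothetical catalog).
CATALOG = {
    "GAS-BASE":     set(),            # Analytics for Gas Management
    "GAS-BILLING":  {"GAS-BASE"},     # Gas Analytics for Billing
    "GAS-SERVICES": {"GAS-BASE"},     # Gas Analytics for Services
    "GAS-USAGE":    {"GAS-BASE"},     # Gas Analytics for Usage
}

def validate_order(ordered_skus, already_owned=frozenset()):
    """Reject orders whose prerequisites are neither owned nor being bought."""
    available = set(ordered_skus) | set(already_owned)
    missing = {need for sku in ordered_skus
               for need in CATALOG[sku] if need not in available}
    return (not missing, missing)

print(validate_order(["GAS-BILLING"]))               # (False, {'GAS-BASE'})
print(validate_order(["GAS-BASE", "GAS-BILLING"]))   # (True, set())
```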

Journey-based structures

As customers begin to use the Analytics for Gas Management offering, they start to ask Sensorytics for increased functionality so that the product matches their workflows. Building on the persona-based model described above for gas utilities, the service department decides that it wants richer functionality and a wider footprint. This leads to three additional products for the service department; these products, which match the workflow of that department, bring the total number of offerings to nine.

Gas Analytics for Services Dispatching. This is a reporting engine used by the services department to dispatch service personnel to various locations based upon where the analytics platform identifies problem areas.

Gas Analytics for Services Timecards. This is an extension to the basic analytics platform that now inputs service personnel timecards to aid in services billing.

Gas Analytics for Services Optimization. Used to aggregate information gathered from the utilities end-points, this is combined with services personnel location and expertise to design an optimum workflow personnel utilization plan to maximize coverage and minimize services costs.

Best practices

As you can see, there are a variety of offering structures that can be used to create multiple revenue streams from a single product or platform to match different market, persona and workflow requirements. This can be further extended to create different bundles of individual offerings or to create offerings as a combination of product function and product metrics.

One word of caution is to keep models relatively simple. Avoid SKU explosion, which is characterized by a situation where 20% of the SKUs generate 80% of the revenue. Balancing the simplicity of the offering with revenue maximization is part of the art of product packaging.



January 8, 2020  4:31 PM

What 2019’s ‘summer of outages’ means for IoT

Mehdi Daoudi Profile: Mehdi Daoudi
Cloud outages, Internet of Things, iot, IoT analytics, IoT cloud, IoT cloud strategy, IoT connectivity, IOT Network, IoT strategy

In the summer of 2019, several cloud service providers experienced nagging bouts of unplanned downtime, impacting thousands of businesses. Google had an outage in June, which brought down several of its most popular services including Search, Nest, YouTube and Gmail, and was hit by another major outage in early July. Apple also experienced a widespread cloud outage in July, which affected the App Store, Apple Music and Apple TV. Cloudflare, Facebook and Twitter also had problems.

The macro conditions underlying these outages can often be boiled down to increased internet complexity or rushed-to-market software releases. While IoT was not involved in these recent problems, the implications of such unpredictability are significant for any cloud-reliant IoT project, especially those with lives and safety depending on them.

This is because IoT and cloud computing are increasingly intertwined and symbiotic technologies. IoT devices generate huge amounts of data, with the cloud often serving as the central data collection and analysis repository. For example, consider a large multinational enterprise with IoT-connected thermometers across hundreds of factories, each one constantly generating data for analysis. These thermometers might be connected to other IoT devices and services, such as a factory manager’s remote, smartphone-based thermostat app. All of this requires superior speed and availability to work, making the recent spate of outages a major cause for concern.

Industrial IoT applications like this are just the tip of the iceberg. It’s one thing for factory thermometers to go down due to the cloud, but what happens when IoT is managing something even more critical, such as hospital systems and equipment?

The recent outages shouldn’t dissuade IoT projects from leveraging the cloud because, in many cases, the cloud offers higher levels of security, reliability and delivery speed than organizations can deliver themselves. But it does mean these organizations must be discerning and proactive about protecting themselves, especially if their IoT applications are mission critical.

Monitor the cloud yourself

Assurances from a cloud provider regarding availability and speed, such as the round-trip time of packets traveling to and from your connected IoT devices, can give some peace of mind and a sense of the provider’s overall infrastructure health. However, this should be considered supplemental information only and cannot be relied upon exclusively to ensure that IoT device connections are reliable and fast.

This direct-to-the-cloud-and-back monitoring is not necessarily indicative of reality. Cloud service providers have partnerships with internet service providers (ISPs) and better network intelligence on how to route traffic. This means that, whenever possible, cloud monitoring will bypass the broader internet infrastructure that IoT device data must traverse, keeping packets in transit on the providers’ own networks and optimizing speed from point A to point B, and vice versa. This can result in a skewed, overly positive sense of IoT device connectivity and communication speed, because in the real world, networks and other external elements can get in the way.

Don’t track the cloud from only the cloud

While it’s critical for organizations with cloud-dependent IoT projects to do their own monitoring, they should never monitor the cloud from cloud-based infrastructure only. For the reasons outlined above, you might get a warped view of actual performance. Never monitor the cloud from the same cloud provider that’s handling your IoT project. If this cloud goes down, you’ll be blind to how your co-located IoT system is doing. You must make certain your monitoring vantage points are a mix of backbone, ISP, wireless and other node types.

Monitor IoT device availability

In an IoT world, devices essentially are the end users, so it is important to consistently monitor them and ensure they are reliable and interoperating with other IoT devices with exceptional speed. Since cloud service providers’ infrastructure consists of datacenters and other servers spread across the globe, a problem can occur anywhere and impact isolated segments of IoT devices.

That’s why it is critical to have as many eyes as possible, in all the key geographies where you have IoT devices running, as well as from the various network vantage points through which your IoT devices connect to the internet. This will put you in the best possible position to proactively detect IoT outages or slowdowns. Combining this with deep analytics will give you a head start in addressing the problem, whether it’s related to the cloud or not.
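A minimal sketch of that kind of geographically aware monitoring follows: track device heartbeats by region and flag regions where several devices have gone silent at once, which points to a cloud or network problem there rather than isolated device failures. The regions, device IDs and timeout are illustrative.

```python
import time
from collections import defaultdict

LAST_SEEN = {}  # (region, device_id) -> unix timestamp of last heartbeat

def record_heartbeat(region, device_id, now=None):
    LAST_SEEN[(region, device_id)] = time.time() if now is None else now

def silent_devices_by_region(timeout_s=120, now=None):
    """Group overdue devices by region: a whole region going quiet at once
    suggests a cloud or network problem there, not individual device faults."""
    now = time.time() if now is None else now
    silent = defaultdict(list)
    for (region, device_id), ts in LAST_SEEN.items():
        if now - ts > timeout_s:
            silent[region].append(device_id)
    return dict(silent)

record_heartbeat("eu-west", "sensor-1", now=1000)
record_heartbeat("eu-west", "sensor-2", now=1010)
record_heartbeat("us-east", "sensor-3", now=1190)
print(silent_devices_by_region(timeout_s=120, now=1200))
# {'eu-west': ['sensor-1', 'sensor-2']}
```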

Have redundancy plans in place

If your IoT project supports a mission-critical process, you should consider having a multi-cloud strategy as a form of backup and protection. This might require a good amount of work, but it’s often worth it. You’ll need to make sure all the key phases of an IoT project, namely real-time data processing and storage, can be quickly ported over to another cloud in the event of primary cloud failure. This means testing failover strategies in advance to ensure cloud-to-cloud interactions are fast and reliable enough to support real-time data replication.

Final thoughts

The cloud has many attributes that make it ideal for supporting IoT projects. Not surprisingly, growth in IoT data has led to cloud service provider growth and expansion, which supports more IoT data and projects. Together, the cloud and IoT represent a set of inextricably linked technologies of the future.

But organizations running IoT in the cloud must proceed with caution. If we’ve learned anything, it’s that even the strongest businesses in the cloud industry can — and inevitably will — go down. It’s up to you to take the steps needed to better prevent your IoT project from going down with them. In fact, this is something you can’t afford not to do.



January 6, 2020  12:19 PM

It’s tough living on the edge

Gordon Haff Profile: Gordon Haff
Cloud management, Edge computing, Internet of Things, iot, IoT cloud, IoT data management, IoT edge computing, IoT strategy

The past couple of years have seen computing pushed out to the edge of the network at an ever-faster pace. The details vary as there are many different “edges” depending upon what problem is being addressed. But the overall trend is clear. By 2023, over 50% of new enterprise IT infrastructure deployed will be at the edge rather than corporate datacenters, up from less than 10% today, according to IDC.

High-volume data streaming in from IoT devices, which often must be quickly processed, filtered and acted upon, is one driver of edge computing. But there are an increasing number of other application areas, such as telco network functions, that are best optimized by placing service provisioning closer to users and devices.

None of this is a repudiation of cloud computing, but it does illustrate how assumptions that computing was on a path to wholesale centralization were simplistic at best. In practice, enterprise computing is highly heterogeneous, and organizations are mostly pursuing hybrid cloud approaches.

Although distributing compute out to where data, users and devices live has advantages, it also introduces challenges relative to centralized computing. These fall into three general categories: architecture and technology, ongoing operations and security.

Architecture and technology

The scale of some edge deployments is such that they can use software similar to what runs in datacenters. For example, OpenStack is popular among telcos for creating private clouds at the edge, just as it is in more traditional on-premises environments.

However, even if some of the software stack is common, edge installations must take several unique considerations into account. For example, you can’t just flip a switch and add more servers from a central pool if more capacity is needed at the edge.

It’s important to plan for the needed compute, as well as storage, networking and any other hardware, up front. Upgrading hundreds or even thousands of edge sites is an expensive undertaking. At the same time, the cost of over-provisioning those hundreds or thousands of sites adds up quickly, too. The lesson here is that you must design deliberately.

As noted earlier, the edge can look very different depending upon the application. An appropriate architecture for hundreds of clusters with tens of servers each differs significantly from one for thousands of smaller sites, much less one made up of millions of individual edge computing devices.

Operations within large distributed systems

There are also practical issues related to operating a large distributed system. All those edge clusters might be installed in locations that don’t have an IT staff and might even be in places with no permanent human presence at all.

We need to account for the fact that this is a distributed system connected by potentially unreliable and throughput-constrained networks. How do we want an edge cluster to behave if it loses its connection to the datacenter? If disconnected operation makes sense, the system needs to be designed with that in mind.
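A minimal sketch of designed-in disconnected operation follows: prefer the datacenter’s decision, cache it locally, and keep operating on the cached policy when the uplink fails. The decision values and the simulated unreliable link are stand-ins for real control logic and a real WAN.

```python
import random
import time

def central_decision():
    """Ask the datacenter for a setpoint; raises when the uplink is down."""
    if random.random() < 0.3:          # stand-in for an unreliable WAN link
        raise ConnectionError("uplink down")
    return "setpoint-from-central"

def control_loop(cycles=5):
    """Prefer the central decision but keep a locally cached policy so the
    site continues operating autonomously during a disconnection."""
    cached_policy = "last-known-good-setpoint"
    for _ in range(cycles):
        try:
            cached_policy = central_decision()   # refresh the local cache
        except ConnectionError:
            pass                                 # fall back to cached_policy
        print("applying", cached_policy)
        time.sleep(0.1)

control_loop()
```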

We also need to deal with failures within the edge cluster itself. Failures are a normal expected event at scale. We must provide redundancy while also considering cost tradeoffs. Is it cheaper to install some extra hardware so that repairs can be made mostly on a slower-paced scheduled basis? Or are we better running leaner and treating failures as an urgent event?

Site management operations, such as deployments and upgrades, must be handled remotely and be fast, reliable and automated. Good monitoring and logging are required for centralized management to work at all. Effective analytics can also help to predict failures and thereby head off some problems before they occur.

Edge computing security

In some ways, security is a subset of operations in the context of edge computing, but it’s important enough that it’s worth calling out separately. I’ve written previously about IoT device security specifically, but edge computing as a whole also has some specific security challenges.

The scale of many edge computing installations means that the automation mentioned above must apply to security as well. Automating patching and security scanning is a good practice. But using automated tooling that enforces security policies and minimizes potential vulnerabilities at distributed sites is essential.

The edge has other unique considerations. In general, datacenters have established robust physical security practices around controlling access to the hardware, properly disposing of assets, such as disk drives that may contain sensitive information, and generally providing a highly engineered and controlled environment.

This is often not the case with edge clusters. Branch office and other remote systems have had to take these factors into account for a long time. However, edge locations might not even have the level of controls that a bank branch or satellite company office does. And the scale can be much greater.

Plan and plan again

One could argue that there are relatively few challenges that we see in edge computing that we don’t also see — to greater or lesser degrees — elsewhere. But that’s the rub: We see requirements for failure resiliency and automation everywhere. But dealing with them at the edge, where both distribution and scale are so great, can be especially challenging. For example, once a fix is identified, rolling it out to every edge cluster is probably a much more significant task with more failure modes than in the case of centralized infrastructure.

The above example further highlights that edge architectures need to be carefully planned. This includes up-front design work that considers the practical realities of a highly distributed system that exists largely outside of controlled datacenter environments. But it also includes deliberate planning for the ongoing operations of the entire system, including provisioning, failure recovery, upgrades and security.


