IoT Agenda


July 26, 2019  11:27 AM

Your smart devices should live on the edge

Jason Andersen Profile: Jason Andersen
Edge computing, Internet of Things, iot, IoT architecture, IoT edge, IoT edge computing, IOT Network, Smart Device

This is the second article in a two-part series. Read part one here.

In my earlier article, I discussed the many benefits of smart devices, especially in industrial environments. I also discussed the challenges of capturing and analyzing data at the edge of the network. Now, it’s time to dive into how to establish an edge computing architecture for processing raw IIoT data.

The case for edge computing

While pushing all of your data up to the cloud is possible, it doesn’t deliver the desired value from a performance standpoint. The key to harnessing raw data produced by intelligent devices is to first turn it into useful information at the source.

Edge computing is the best way to achieve this, as the technology provides cleaner data, better data protection and lower costs. Cleansed data ensures that supervisory systems receive the highest quality information, while isolated smart devices are better equipped to defend an entire enterprise against cyberattacks. Incorporating edge computing into an existing IIoT-to-cloud system also drives data reduction, which yields significant cost savings.

Steps to success

As with any technology, there are challenges associated with edge computing. Considerations include developing a plan, fielding personnel support and factoring in cybersecurity. At the end of the day, the goal of an edge computing solution is to provide clean data in a secure, real-time manner to support mission-critical demands. So, how can your company achieve this?

A proper edge computing platform must be powerful, protected and autonomous. It must also be simple to deploy quickly and manageable by remote site personnel. If disaster does strike, a single-button restore is the difference between getting back online in a few minutes versus a few days.

The first step is to establish an overall architectural strategy for your edge computing platform. This will provide the roadmap for connecting your smart devices at the edge and sharing that data with the cloud. A comprehensive architecture addresses four levels: interoperating with field devices, capturing data, controlling information and connecting with cloud intelligence. The final goal of your strategy here is to transmit useful information up to the cloud, enabling long-term analytics and even artificial intelligence.
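
To make those four levels concrete, here is a minimal Python sketch of the capture-summarize-forward flow at an edge node. This is an illustration only: the endpoint URL, window logic and payload fields are hypothetical placeholders, not any specific product's API.

```python
# A minimal sketch of the capture -> summarize -> forward flow at an
# edge node, mapped onto the four levels above. The endpoint URL and
# window logic are hypothetical placeholders, not a specific product's API.
import json
import statistics
import urllib.request

CLOUD_URL = "https://example.com/ingest"   # placeholder cloud endpoint
window = []                                # raw field-device samples

def capture(sample: float) -> None:
    """Levels 1-2: interoperate with a field device and capture a reading."""
    window.append(sample)

def summarize() -> dict:
    """Level 3: reduce the raw stream to useful information at the source."""
    return {
        "mean": statistics.mean(window),
        "stdev": statistics.pstdev(window),
        "count": len(window),
    }

def forward(summary: dict) -> None:
    """Level 4: send one small, cleansed payload to the cloud, not the raw stream."""
    req = urllib.request.Request(
        CLOUD_URL,
        data=json.dumps(summary).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```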

Next, you need to establish a secure connection. Securely connecting assets often involves logical concentrations of smart devices at the site. These concentrations must be identified so that the edge computing hardware can be located near them, which helps with troubleshooting and support. Often, the best locations for an edge device are in harsh environments, requiring a ruggedized solution.

Here, you may want to consider a purpose-built edge computing hardware platform. These products are built for industrial service, optimized for the edge-located role and tailored to the personnel who will support them. Integrated redundancy with no single point of failure means that these systems can run indefinitely, while autonomous self-monitoring and remote management minimize maintenance effort.

As the data available from IIoT and smart devices grows, the need for edge computing is becoming clearer and clearer. This type of data requires pre-processing at the edge of the network to foster efficient and cost-effective services. The ideal way to meet the needs of smart devices is through edge computing solutions that use purpose-built platforms to deliver accurate and secure data analysis.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

July 25, 2019  4:32 PM

Think before you deploy: the before, during and after of IoT data

David Simmons Profile: David Simmons
Data Analytics, Data Management, Internet of Things, iot, IoT analytics, IoT data, IoT data management, IoT deployment

The amount of data flooding in from IoT deployments can be overwhelming, but if nothing is done with the data, then why collect it in the first place?

For nearly 15 years, I’ve been saying that IoT data must be at least these three things:

  1. Timely: arriving quickly, and as close to real-time as possible
  2. Accurate: properly reflecting what is going on in the deployment
  3. Actionable: contributing to your ability to make timely business or operational decisions

So, how can you maximize the business value of your IoT deployment while minimizing costs? Here are some tips on making your IoT data timely, accurate and actionable.

Utilizing the digital twin

A digital twin is a data model of a physical system that uses IoT sensor data to represent all the parameters of that system. Think of a jet engine and all of the telemetry sensor data streaming from it in real time: you could build a software model of that engine based directly on that sensor data. With the digital twin, you can manipulate various characteristics of the engine and see the effects in real time, all without having to use a real jet engine on an actual commercial airliner.

Mechanical engineers can use this timely, accurate and actionable sensor data to make informed and accurately tested adjustments to existing jet engines or influence the design of future jet engines.
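
As a rough illustration of the concept, here is a minimal sketch of a digital twin as a data model in Python. The field names are invented for illustration and do not come from any real jet engine schema.

```python
# A minimal sketch of a digital twin as a data model: an object whose
# state mirrors the last-known telemetry of a physical engine. The field
# names are invented for illustration, not a real jet engine schema.
from dataclasses import dataclass, field

@dataclass
class EngineTwin:
    engine_id: str
    telemetry: dict = field(default_factory=dict)

    def update(self, reading: dict) -> None:
        """Apply a streamed sensor reading to the twin's state."""
        self.telemetry.update(reading)

    def what_if(self, overrides: dict) -> dict:
        """Manipulate a characteristic without touching the physical engine."""
        return {**self.telemetry, **overrides}

twin = EngineTwin("engine-42")
twin.update({"rpm": 3400, "exhaust_temp_c": 612.5})  # real-time telemetry
print(twin.what_if({"rpm": 3600}))                   # explore a change safely
```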

Think before you design and deploy

Before designing and deploying a system, determine what data you want and need to collect and how it will be used to inform your business decisions. In a manufacturing scenario, for example, you might want to monitor the vibrations coming off of a large piece of industrial equipment so that you can identify upcoming maintenance problems, such as a bearing going bad.

Without this monitoring, the bearing would simply fail, and you would be forced to shut the line down while a part is ordered, shipped and installed. Having a production line down for any significant amount of time can be extremely costly. If you were monitoring the vibrations on the machine, you would be able to monitor changes, identify the problem, and order the replacement before it failed, thus minimizing and potentially eliminating any effect on production output.
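
The monitoring described here can be as simple as tracking a rolling RMS of the vibration signal against a healthy baseline. Below is a hedged Python sketch; the window size, baseline and alert factor are invented for illustration.

```python
# A hedged sketch of that vibration check: flag the bearing when the
# rolling RMS of its vibration signal drifts past a healthy baseline.
# The window size, baseline and alert factor are invented for illustration.
import math
from collections import deque

window = deque(maxlen=200)   # most recent vibration samples
BASELINE_RMS = 0.5           # learned while the machine was healthy
ALERT_FACTOR = 1.5           # 50% above baseline -> order the part

def ingest(sample: float) -> bool:
    """Return True when the trend suggests the bearing is going bad."""
    window.append(sample)
    rms = math.sqrt(sum(s * s for s in window) / len(window))
    return rms > BASELINE_RMS * ALERT_FACTOR
```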

Downsampling your data

Historical archiving is often overlooked when it comes to the analysis of IoT data. We’ve seen how collecting and acting on data in real time can have huge effects on your business, but storing all that data forever can be extremely costly. Downsampling is the process of reducing data down to its most important parts so that it occupies the least amount of space possible.

Collecting data at the millisecond level will help you identify and avoid potential problems, but will you really need that level of detail two, three or even four years down the road? Probably not. What you really need is to identify the data trends and highlight any anomalies or events that may have occurred. You can significantly reduce the amount of data stored over time by downsampling and expunging the highly granular data that is unnecessary and expensive to store.

You can store your millisecond-level data from the past 30 days, which is enough time to get a highly accurate analysis of potential changes and looming problems. You could then downsample that data to one-minute averages while preserving any events or anomalies and the data immediately surrounding those events. You could even further downsample that data to five-minute averages — again, preserving the events and anomalies, and the highly granular data surrounding them — which could be stored indefinitely.
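
As an illustration of this tiered scheme, here is a hedged Python sketch using pandas (an assumption; the article names no tool). It keeps the last 30 days at full resolution, downsamples older data to one-minute means, and preserves anomalous readings, defined here by a simple 3-sigma rule.

```python
# A sketch of the tiered retention scheme above using pandas. Millisecond
# data is kept for 30 days, older data is reduced to one-minute means,
# and anomalies (a simple 3-sigma rule here) are kept at full resolution.
import pandas as pd

def downsample(df: pd.DataFrame) -> pd.DataFrame:
    """df: millisecond-level readings with a DatetimeIndex and a 'value' column."""
    cutoff = df.index.max() - pd.Timedelta(days=30)
    recent = df[df.index > cutoff]           # keep full resolution
    old = df[df.index <= cutoff]

    deviation = (old["value"] - old["value"].mean()).abs()
    anomalies = old[deviation > 3 * old["value"].std()]

    minute_means = old.resample("1min").mean()
    return pd.concat([minute_means, anomalies, recent]).sort_index()
```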

If your database can automatically expunge data via retention policies, this entire scheme can be automated. Automating it can significantly decrease data storage costs, all without losing the important insights gained from IoT sensors.

Embrace the data wave

The data deluge is real. Rather than avoid it, businesses need to take the necessary steps to embrace it. Insights gathered from timely, accurate and actionable data have the potential to unlock business opportunities, streamline operations and minimize costs.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


July 23, 2019  12:16 PM

Moving edge computing applications from pilot to production

Chetan Kumar Profile: Chetan Kumar
Edge computing, Internet of Things, iot, IoT edge computing, IoT performance, IoT scalability, iot security, IoT updates

In 2019, businesses are realizing the advantages of edge computing and moving their applications closer to the source of data. Organizations are moving edge applications for video surveillance, industrial IoT, smart mobility and smart cities from pilot to production. Each application performs mission-critical tasks, and any downtime of an edge computing site or application will be detrimental. Careful planning is essential before moving an application from pilot to production.


Here are the top five elements to keep in mind while moving edge applications from pilot to production:

  1. Scalability of the distributed system

Edge computing is all about distributing compute close to the source of data. For example, if a smart video surveillance computing device is installed on every site, and each site houses 4 to 40 cameras, a large number of devices will be distributed geographically across multiple sites. If your management systems and back-end infrastructure are not built to handle this scaled setup, your deployments will fail. Proper planning to handle distributed and scaled edge computing is a must before starting production.

  2. Performance management

With high chances of downtime at edge computing sites, the ability to get complete control of the infrastructure and application becomes critical. Consider unmanned automated retail stores: a failure in applications can cause significant revenue loss, because no transactions can take place until the systems are back up and running. Effective application and infrastructure performance management tools can identify bottlenecks and get to the root cause of application problems in a complex and chaotic edge computing environment. Applications can fail due to compute and network infrastructure failure, software bugs and peripheral device failures, including sensors, cameras and controllers. Applications can also underperform due to resource constraints in edge devices. Application failure or underperformance can render business technology useless. You should not move your applications from pilot to production without proper performance management mechanisms.

  3. Remote management

Consider smart parking management mechanisms deployed at multiple sites across the country. Any failure in application, system or infrastructure might need a technician dispatched to the site to fix the problem. This can be a big burden on business operation expenditure and cause revenue loss. To avoid this, edge computing applications and management mechanisms must be built with secure remote access.

  4. Over-the-air firmware and application upgrades

Over-the-air firmware and application upgrade problems have already been solved, but they are an essential ingredient in edge computing technology and need a fresh look. Business agility requirements push for more frequent application upgrades. New learning algorithms for machine learning and AI applications call for upgrades. Firmware upgrades — file system and kernel-level — are also essential to plug security vulnerabilities. Kernel upgrades might be needed to address critical bugs in scheduler and drivers.
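
As a minimal sketch of what an OTA check might look like, here is a hedged Python example. The manifest URL and fields are hypothetical; a production updater would also verify a cryptographic signature and keep an A/B partition for rollback.

```python
# A minimal sketch of an OTA check: poll a manifest and verify the image
# digest before handing off to an installer. The manifest URL and fields
# are hypothetical placeholders.
import hashlib
import json
import urllib.request

MANIFEST_URL = "https://updates.example.com/device/manifest.json"  # placeholder
CURRENT_VERSION = "1.4.2"

def check_for_update():
    manifest = json.loads(urllib.request.urlopen(MANIFEST_URL).read())
    if manifest["version"] == CURRENT_VERSION:
        return None                                   # already up to date
    image = urllib.request.urlopen(manifest["image_url"]).read()
    if hashlib.sha256(image).hexdigest() != manifest["sha256"]:
        raise ValueError("digest mismatch: refusing to apply update")
    return manifest["version"], image                 # hand off to installer
```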

  5. Security

Security is undoubtedly the element of edge computing that demands the most attention. With the distributed nature of edge computing applications, threat actors get a bigger attack surface, increasing vulnerability. It is essential for organizations to conduct the right security audits of the operating system and applications to identify vulnerabilities before deployment. Security cannot be an afterthought; it has to blend into every aspect of the design, from application to infrastructure. IT pros can better understand and identify security vulnerabilities with better visibility into edge computing infrastructure and applications.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


July 23, 2019  10:41 AM

Optimize the customer experience with intelligent automation

Chris Huff Profile: Chris Huff
Automation, customer experience, Internet of Things, iot, IoT analytics, IoT data, robotic process automation

There’s no way to overstate the importance of delivering a stellar customer experience. Disappoint your customers at any point in the buying journey — whether during onboarding, support or upsell — and you’ll put their loyalty at risk. That’s the best-case scenario. At worst, a poor experience is amplified through social media, putting real revenue from potential future customers at risk.

This incontrovertible truth is why so many organizations focus on the digital transformation of their businesses. Digital transformation technologies, including IoT and intelligent automation, can function as levers to achieve end-to-end process visibility and efficiency, which makes an organization more data-driven, responsive to customers and profitable.

IoT is a far-reaching system where an increasing number of physical devices — such as cars, appliances and production machinery — integrate with the internet and produce data on their usage. The advantage of IoT is in the real-time data it makes available across a series of online devices used throughout a company.

The data produced by IoT-enabled devices can paint a clear picture of customer experiences and business operations. An IoT-enabled organization can better understand how its people and assets, including machines and infrastructure, come together to produce value. However, data alone isn’t enough to create these outcomes, which is where intelligent automation comes into play.

An intelligent automation platform has the technology required to ingest data produced by IoT devices and draw insights from it. The platform can automate downstream tasks with robotic process automation (RPA) based on insights drawn and initiate workflows or call an organization’s IT pros to action around any pattern recognized in the data. IoT-enabled companies use intelligent automation to draw insights and begin to take action more quickly based on those insights.

Consider, for instance, a manufacturer that experiences a mechanical failure on the production line. Data delivered instantly from the broken machine is ingested by an intelligent automation platform, and the platform delivers a summarized alert to executives. The platform’s case management capability is triggered to pull in people from elsewhere to reorder new parts. These people can trigger RPA on demand to automate any repetitive and tedious portion of the ordering process. The result is a more responsive organization with a healthy level of collaboration between people and technologies to better execute processes.
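
A hedged Python sketch of that fan-out appears below: one failure event triggers an alert, a case and an RPA task in order. The handler names are illustrative, not any RPA vendor's API.

```python
# A hedged sketch of the fan-out just described: one failure event
# triggers an alert, a case and an RPA task in order. Handler names are
# illustrative and not a specific RPA vendor's API.
def notify_executives(event): print(f"ALERT: {event['machine']} is down")
def open_reorder_case(event): print(f"case opened to reorder {event['part']}")
def queue_rpa_order(event):   print(f"RPA bot filling order form for {event['part']}")

PIPELINE = [notify_executives, open_reorder_case, queue_rpa_order]

def on_machine_failure(event: dict) -> None:
    """Insight to action without waiting on a human to spot the failure."""
    for step in PIPELINE:
        step(event)

on_machine_failure({"machine": "press-7", "part": "drive belt"})
```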

The ability to respond more quickly directly affects the customer experience. When factory floor or back office issues are resolved early, the customer won’t experience a disruption. Intelligent automation and IoT truly shine in giving employees insights and the time to enhance the customer experience. With routine tasks automated, customer-facing employees have more time to assess IoT-acquired insights and determine how to translate them into higher-value experiences for clients.

By incorporating intelligent automation technology, IoT-enabled companies can transform information-intensive processes and achieve higher levels of collaboration between people and technology. That means they’re able to work more efficiently and deliver great moments of customer engagement.

Companies must optimize operations and empower their people so they can better serve customers. Intelligent automation makes it possible for people to build stronger customer relationships and processes that consistently exceed expectations in an IoT-enabled company.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


July 22, 2019  3:48 PM

Overcome water and energy challenges with IoT

Asem Elshimi Profile: Asem Elshimi
Data Analytics, energy, Internet of Things, iot, IoT use cases, Smart meter, Smart Metering, Smart sensors, Sustainability IoT, utilities, Water

Smart lights are cool. They facilitate an unprecedented level of luxury. There will be no more getting out of bed to switch the lights off when a single swipe on your smartphone will do.

Smart lights are only an expression of a much bigger disruption: home automation. Every problem you didn’t know you had in managing your home is about to be solved. Fridges order milk, blinds automatically adjust to outside light levels, and cameras and sensors everywhere keep you feeling secure. Until recently, these scenarios were simply unimaginable to me.

But there has to be something more valuable that IoT can do, something deeper and more meaningful than ordering milk. The world is going through so much turmoil right now. The planet is overpopulated and faces energy, food and health challenges. We don’t have enough sustainable energy sources to meet the hyper-growing demand on our near-depleted natural resources. Climate change, meanwhile, is accelerating shortages of fresh water and food. Humanity’s existence is at stake. We are morally obliged to ask a fundamental question about anything in mass development: How is this technology helping to solve the world’s most pressing problems?

We need to ask this question because mass development consumes our productive means. Wouldn’t it be a terrible thing if most of our productivity were wasted on luxury items? Think of how many skilled workers — an entire generation of engineers — are spending their working hours developing something as massive as IoT. An investment like that should benefit humanity. In one way or another, overcoming water and energy challenges depends upon wisely utilizing our workforce.

Thankfully, the IoT can serve a greater cause. The IoT is creating a mesh of connectivity and controllability. This massive scale of controllable things — mostly sensors and actuators — means that we can make our energy systems considerably more efficient than they are today. And thanks to low-cost silicon chips, the deployment of an unimaginably complex network can be fairly cost-effective.

One system suffering from inefficiency is our water and energy distribution network. The numbers are staggering: we lose significant quantities of pumped fresh water to leaks, and we waste twice as much energy as we use worldwide every year. This is quite the bottleneck. Residential and industrial loads cannot do much to maintain higher efficiency if energy is already wasted in distribution. Any effort to enhance the efficiency of this water-energy nexus must start at the distribution level.

Fortunately, public utilities, cities, corporations and consumers are joining forces to address the challenge of water-energy waste. The one attractive technology that everyone is invested in turns out to be making the distribution of water and energy smarter, with mesh networks tracking water transportation. Utility companies would like to attach 50-cent wireless microcontrollers to every critical point in the water and energy distribution mesh. Once we do this, we can sit back and watch the miracle of electronic brains dynamically adjusting water flow and energy loading to guarantee unprecedented levels of efficiency. For example, the U.K. government had installed 13.8 million smart meters in people’s homes as of December 2018. This is only the start of a global trend to use IoT.
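
That mesh of meters implies a simple invariant: at any junction, upstream flow should roughly equal the sum of downstream flows. Here is a toy Python sketch of leak detection built on that idea; the topology, readings and tolerance are invented for illustration.

```python
# A toy illustration: meters at critical points in the distribution
# mesh, with a leak inferred wherever upstream flow exceeds the sum of
# downstream flows. Topology, readings and tolerance are invented.
SEGMENTS = {
    # segment: (upstream flow in L/s, [downstream flows in L/s])
    "main-A": (120.0, [58.0, 61.5]),
    "main-B": (80.0, [49.0, 18.0]),   # ~13 L/s unaccounted for
}

def find_leaks(tolerance: float = 2.0):
    for name, (upstream, downstream) in SEGMENTS.items():
        loss = upstream - sum(downstream)
        if loss > tolerance:
            yield name, loss

for segment, loss in find_leaks():
    print(f"possible leak on {segment}: {loss:.1f} L/s lost")
```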

IoT can go beyond making distribution more efficient. More importantly, it can more effectively integrate clean energy sources. Classically, energy systems were built with power plants on one end and loads that dissipate energy on the other. However, power grids are witnessing a dramatic change, and the need for sustainable energy is driving a shift toward a more decentralized model. Rooftop solar panels could be used to power up neighborhoods. Smart grids enter the picture to create this complex, dynamic energy transfer.

We are also seeing operators deploy IoT sensors across thousands of oil wells and supporting production equipment in the oil and gas sector. IoT technology monitors and reduces leakage of methane, a greenhouse gas far more potent than carbon dioxide over its first 20 years in the atmosphere and responsible for 25% of global warming.

This decentralization of distribution systems extends the reach of energy and water. About 2.1 billion people around the globe lack access to clean drinking water. Unfortunately, expanding the current legacy water and energy distribution infrastructure means building new central distribution units, which are extremely expensive. An IoT-enabled decentralized distribution system, on the other hand, would make it possible to extend water and energy distribution to rural villages at a much more manageable cost. This will serve underprivileged populations. I like to envision electricity and fresh water finally arriving at a rural village that was dark and isolated before IoT. While this may not change the world globally, it subtly improves the overall human condition.

In 2015, the United Nations (U.N.) adopted a sustainable development initiative that targets achieving important sustainable development goals by the year 2030. The goals center on great causes that serve humanity, such as ending poverty and hunger. A second tier of goals focuses on improving the efficiency of the water-energy nexus. These are challenges that IoT can help solve.

United Nations’ 17 Sustainable Development Goals

It’s really up to all of us in the industry to make IoT do more good for humanity. And there is so much room to do so. The industries underpinning IoT are profitable and growing. Economic opportunities to invest in IoT are massive, and the interest in making it happen is persistent. The build-out of IoT on a mass scale is going to happen, and yet the way it will be shaped and the problems it will solve will be left up to the market and the makers to decide. Developers and providers are in a position of moral responsibility. And we need to ensure that the work we put into making this technology serves humanity in the best ways possible.

What could potentially go wrong, given the massive economic opportunity in IoT, is getting distracted from doing good by the lure of revenue from other markets and applications. Revenue is an important driver for healthy industries and organizations. We use revenue to make sure that we aren’t fooling ourselves and that we are on track to sustain our businesses. Yet this could be a slippery slope. We need to be careful when we look at revenue: it should be one of many instruments, and in no sense should it be viewed as an end goal. The risk is that decisions focused purely on revenue growth make the good causes look unattractive, and to resolve this, we need to make those causes more and more profitable. Fortunately, this is not terribly difficult in the world of IoT. According to IoT Analytics, 75% of IoT projects under development serve the U.N. 2030 initiative, which is a remarkable contribution.

Despite the massive opportunities of IoT, it also comes with its share of security and privacy threats. Massive connectivity of things provides a surface vulnerable to attacks. The collection of terabytes of data about our homes and industries also poses a privacy threat. We do not want to live in a world where this data falls into the wrong hands. People have gone Orwellian imagining IoT being abused by a Big Brother-like body or government. They aren’t completely mistaken. Yes, IoT poses a threat, but so do many other disruptive technologies. Yet, as Ronald Reagan once put it, “The future doesn’t belong to the fainthearted; it belongs to the brave.” We ought to take calculated, conscious risks as we evolve as a species. This goes hand in hand with doing everything possible to prevent those risks from materializing. Ensuring the best security possible for cloud-connected IoT end nodes is not a specification that customers require; it is an obligation and a responsibility that we all bear together.

While home automation might seem superficial — the world seems much bigger than a light bulb — it subtly serves good causes. Consider how home automation can aid the elderly or those with disabilities. Consider home security systems that can help keep us safe. And more importantly, consider the revenue IoT brings to sustain other greater causes. The same chips we develop for smart lights can be applied to smart utility meters, smart health monitoring devices and many other mechanisms that can enhance our health, safety, access to energy and water and ultimately benefit humanity.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


July 22, 2019  1:02 PM

Stimulating growth in China’s domestic IoT market

Yongjing Zhang Profile: Yongjing Zhang
Connected vehicles, Internet of Things, iot, IoT market, Smart cities

Yongjing Zhang represents Huawei at oneM2M’s technical plenary. In this Q&A, he describes China’s IoT market and the importance of interoperability in IoT solutions.

Please begin by describing your roles and responsibilities within oneM2M.

Zhang: I began working in oneM2M as the management, abstraction and semantics working group chair, and recently took on a new role as a oneM2M technical plenary vice chair. Now, my job is to serve the oneM2M community from a higher perspective, and this includes external cooperation with other IoT organizations.

Within Huawei, I work for the IoT platform product line in the cloud, AI product and service group as a senior standard manager. My team is responsible for standardization and research around the IoT platform. We also deal with supported service domains, such as connected vehicles and smart cities.

China is a large IoT market in the global context. What are your impressions of what is happening locally?

Zhang: I would say that China is among the fastest developing regions in the world in terms of the IoT market. The government plays a very important and active role forming national policies, innovation projects and industry development guidelines and standards. All these are beneficial to the IoT industry. On top of this, there is a lot of internal innovation momentum from different enterprises. Together, these factors stimulate the domestic IoT market and will help it grow rapidly.

For example, more than 500 cities in China have smart city projects planned or under construction. The size of this market is projected to reach 1.87 trillion renminbi ($271 billion) by 2021. IoT technologies such as Narrowband IoT (NB-IoT) and city-level IoT platforms are key enablers of this growth. We are seeing these trends in many of the projects where Huawei is participating, including Yingtan, the first NB-IoT-enabled smart city in China.

Another recent growth story in China is the connected vehicle and smart transportation sector, thanks to maturing Cellular Vehicle-to-Everything (C-V2X) technology, such as LTE-V, and the release of 20 MHz of vehicle-to-everything spectrum at 5.9 GHz. Several provinces and cities, such as Wuxi, have built test fields in closed sites, on highways or on open roads for car-to-road coordinated autonomous driving and smart transportation. End-to-end technology, including the chipset, on-board unit, roadside unit, platform and applications, can now be provided together by industry partners including Huawei, Qualcomm and Datang — who are also members of oneM2M.

There are also many other IoT-driven or related initiatives happening in China across different domains, such as Industrial IoT and Internet+. I’d like to highlight two key success factors from all these activities: the availability of a common service IoT platform and the readiness of interoperable standards.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


July 19, 2019  2:37 PM

Beyond smart cities: Driving social change with IoT and analytics

Justin Bean Profile: Justin Bean
Data Analytics, Digital transformation, Internet of Things, iot, IoT analytics, IoT data, IoT data management, Smart cities

The world is deep in the throes of transformation. The adoption of IoT technology is on the rise, and the promise of the smart cities of the future is beginning to come to fruition. Our governments and municipalities are evolving past manually collecting and crunching data by looking to new, automated sources of data that can make cities safer, smarter, healthier and more efficient.

Thanks to computer vision and machine learning, existing cameras can be used as sensors to deliver insights on traffic, parking, pedestrian patterns and bike lane usage, improving signal patterns and road safety. Water and electricity are monitored for conservation and efficiency, and parks are improving nighttime safety with lights and specialized cameras.

Here in the United States, data-driven approaches have already become commonplace in the private sector. Companies use massive data sets to better understand their retail or e-commerce customers, manufacturing plants and logistics, to improve routing and eliminate waste and unnecessary costs. But there are significant challenges to deploying IoT technology for the public good on a large scale. Many factors come into play to halt the progress of innovative programs, including costly infrastructure that is already firmly in place across most developed countries, taxpayer dollars being allocated to other initiatives, and high public skepticism or “not in my backyard” attitudes.

To this end, much can be learned by looking overseas to the possibilities of a more connected future to encourage more agile, efficient and evidence-based decision making to improve people’s well-being.

Setting an example

Andhra Pradesh, the eighth-largest state in India, is home to approximately 53 million people and a leading example of how this approach can achieve social change and improve sustainability. Like many states, it is tasked with solving complex challenges such as sustainable resource management, poverty, crime and responding to natural disasters. Unlike most states, Andhra Pradesh has one of the most innovative smart city and smart state programs in the world — an achievement typically attributed to more developed nations, and one that’s making a big difference to its people.

Andhra Pradesh has undertaken a journey of digital transformation to drive economic growth and innovation while creating a safer, more sustainable lifestyle for its people. The state’s ambitious, overarching objective is to achieve the 17 United Nations’ sustainable development goals set for 2030; Andhra Pradesh, however, aims to achieve them by 2022.

United Nations’ 17 Sustainable Development Goals

If that weren’t already enough, this progressive, data-driven government also wants its people to be happy. It has set about using data analytics to quantify what happiness means to its citizenry, what leads to happiness and what the state can do to make happiness possible for all of its people in a measurable way. To the government of Andhra Pradesh, the “common person” is its customer.

To achieve these lofty goals, Andhra Pradesh inaugurated its Real Time Governance Center, the hub for the state’s coordinated effort to provide real-time monitoring and reporting across its entire ecosystem of 13 districts with 33 departments, including more than 300 agencies offering 745 services, each with its own data systems and analytics.

Identifying areas for improvement, the center then uses data to track those improvements with the managers of those departments, agencies and services. The center integrates massive amounts of data gathered from IoT devices, security and traffic cameras that use video analytics, department databases and citizen surveys, along with real-time data from its various functions, to enable constructive real-time governance.

Emphasizing accountability and transparency for all of its citizens, the state’s real-time governance initiative takes this approach to enhance several key areas of government:

  • Government services, such as education, ration and pension distribution, agricultural subsidies, and management of infrastructure resources such as streetlights, water reservoirs, buildings and energy.
  • Disaster relief and management, with the goal of predicting dangerous weather events and providing timely warning to citizens prior to the onset of a natural disaster to save lives and reduce the negative impacts.
  • Public safety, with the goal of making Andhra Pradesh one of the safest places to live in the world.

Solving human problems with data analytics

Andhra Pradesh has set admirable goals to improve life for its large population and solve social and economic challenges — many of which are shared by other, similar communities throughout the world. Having an accurate understanding of the current situation and historical trends is key to measuring and solving these issues. However, many cities in both developing and developed nations lack good data and good data management provided through smart spaces technologies. Manual observation yields only a very small snapshot of what is taking place in any given area. Without the holistic view of a society that can be gained from using an electronic or automated approach to collecting, analyzing and drawing insights from data, it’s difficult for agencies to get apples-to-apples comparisons and collaborate with each other to achieve their goals.

Data analytics is foundational to Andhra Pradesh’s success in overcoming its challenges and achieving its objectives. For example, the elimination of waste and corruption are important steps to moving toward significant positive social and environmental change. By using data analytics, Andhra Pradesh can track goods or food rations to ensure that they’re not lost or stolen and that the people they are intended for don’t go hungry.

A role model for data-driven social transformation

By ensuring the program has put the right solutions in place, Andhra Pradesh’s leaders have the insights and tools they need to improve their citizens’ lives. These aren’t just hopeful utopian visions without a plan, however — they’re based on measurable information, holistic management and results. By taking a data-driven approach to creating a better life for its citizens, Andhra Pradesh is not only setting the bar for what positive social change means, it’s also providing a model for how other communities around the world can use pragmatic approaches to solve real human and sustainability challenges — to transition our global civilization to one that will not only survive the decades to come, but will help all people thrive along the way.

Andhra Pradesh can be a role model for states, cities and nations around the world to take a data driven approach to improving the lives of their citizens, while creating thriving and sustainable economies. In order to achieve these goals, other governments can gather data to:

  • Assess the needs of their people — what are the challenges they report and measure, and what is the scale of these challenges? What factors create these challenges, and how can they be adjusted?
  • Set measurable goals and track key performance indicators that are regularly reported on and used to evolve strategies to improve performance.
  • Create accountability by assigning clear roles, achievable expectations and the goals each person or team will seek to improve, with defined rewards for good performance and learning mechanisms to reverse poor performance.
  • Engage the public for feedback and ideas for improvement, while providing transparency into how the data is used and how the organization is performing.
  • Practice agility to change programs or approaches that aren’t working and learn from results regularly, allowing lessons learned to affect strategies, tactics and actions.
  • Share lessons and challenges with sister cities, states and nations to gain new perspectives, ideas and resources for improving life and sustainability for people today and in the future.

With these data-driven approaches to improving lives, economies and our environment, we can achieve the UN’s Sustainable Development Goals together and create a thriving, fair and sustainable society for future generations.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


July 19, 2019  2:15 PM

How 5G, edge computing and IoT create the perfect storm

Bruce Kornfeld Profile: Bruce Kornfeld
5G, 5G and IoT, 5G technology, Edge computing, Internet of Things, iot, IoT edge computing, IoT strategy

In just three decades, cellular network advancements have introduced analog cell phones, text and voice messaging, GPS location capabilities and breakthrough speed improvements with each generation. First introduced in late 2018, 5G wireless technology is on the horizon, with speeds approximately 20 times faster than today’s 4G cellular networks.

With enhanced speed comes a natural influx of data creation at any location on earth at any time. Edge computing has been wildly helpful for quickly processing, storing and analyzing that influx of real-time data produced by a variety of devices. With sophisticated broadband now quite literally in our pockets — along with that calculator your algebra teacher said would never be there — every human with a smartphone could be considered an edge site.

Verizon estimates that about 8.4 billion connected devices are in use, up 31% from 2016, and that this number will increase to more than 20.4 billion by 2020. As IoT continues to drive the data explosion at the edge, and 5G enters the scene with promises of lower latency, improved speeds and higher capacities, virtualization and powerful edge computing technology must keep up. Today’s edge computing was designed for today’s technology, but many innovations will go mainstream in the next one to two years, including 5G, open source hypervisors and containers. How will the modern IT admin keep up, simplify and automate operations?

Edge-based 5G: Can you hear me now?

5G will make data processing dramatically faster and more efficient. More information than ever before will need to be analyzed quickly, thanks to millions of devices producing data while connected to the 5G network. Prior to 5G, edge computing sites had, of course, used wireless technology whenever feasible, but sites today typically require a traditional, costly wired Ethernet internet connection to access the WAN. This connection is in place to transmit a small portion of data back, such as daily sales transactions, to a home office, cloud or data center for monitoring, backup, archive and data analysis. In some cases, 5G connectivity could eliminate the need for a wired connection, so more edge sites will pop up and be not just affordable, but easier to build, deploy and manage. This will lead to new IoT use cases, data collection and monitoring applications never before imagined.

In a report from Gartner, researchers predicted that by 2024, 60% of countries will have 5G network service provisions available from at least one cloud service provider. As 5G adoption takes place at the edge — where IoT requirements are growing fast — the need for high availability, low cost and simple management becomes even more important because these sites are typically small and have no IT staff to help. Just as 5G needs the edge to be successful, the edge needs 5G to run more efficiently and cost effectively.

IoT reimagined and automated with edge-based 5G

IoT today focuses on how machines and connected devices communicate with each other. In the real world, IoT cannot stand on its own; the network always serves a specific purpose and delivers an operational improvement for an organization. While edge is strengthening the convergence of operational technology and IT, perhaps IoT is also morphing to assume a broader definition: infrastructure operational technology.

5G has greater implications on IoT than simply improving phone speeds; the latest network will increase processing power to connected IoT devices and systems across various enterprises. With 5G embedded in IoT devices, new forms of operational technology and process control will be introduced, alongside new levels of automation that some industries have never seen before. Edge computing, 5G speeds and endless IoT data combine to create automation. Real-world examples include:

  • Traffic control: In smart cities, real-time monitoring becomes more widely available due to faster speeds. Automated decision making improves traffic alerts and rerouting processes, to reduce collision rates and road rage, saving lives.
  • Factory operations: It will become more widespread for food processing plants to automatically monitor temperatures to avoid over- and under-heating, eliminating waste and spoilage, and to deliver higher quality products, saving labor costs and money.
  • Autonomous vehicles: Centralized monitoring could potentially track semi-driverless cars in real time, eliminating the need for a human co-pilot. Centralized co-pilot centers assume human responsibility to monitor directions, upcoming traffic patterns and emergency alerts, ensuring driver safety.

The perfect storm

While 5G is not quite ready for prime time, it is rolling out now. In a Gartner report, researchers note that 5G could become an important enabler of edge computing, but won’t reach its full potential until at least 2023. To prepare, IT administrators should start thinking about how they will incorporate 5G into their IoT strategy to speed transmission and create new applications. Organizations should research and shortlist partners now to ease the transition for their future ecosystems.

Edge computing has kept up with the technology race, and there are many offerings that are simple to operate, not dependent on specific hardware and able to encrypt at the software layer to alleviate security concerns in the software-defined 5G network of tomorrow. The future is almost here, and it promises to be cheaper, faster and better connected than ever before.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


July 19, 2019  10:13 AM

High-productivity architecture creates a faster path to ROI for IIoT

Abhishek Tandon Profile: Abhishek Tandon
IIoT, industrial internet of things, Internet of Things, iot, IoT platforms

One of the biggest dampeners for industrial internet of things (IIoT) projects today is the lack of ROI. The journey from device to analytics to results can take years, and stakeholders have very little to show against millions of dollars of investment, causing them a lot of distress.

A finished IIoT system is a sum of many parts, such as sensor deployments, device-to-cloud integrations, communication protocols, security, analytics and application interfaces, that need to be designed and integrated. Unless all of the pieces come together, the business objective will not be fulfilled. What’s the point of having a cloud strategy and storing petabytes of data if the recipient of the analysis never gets a glimpse of the outcome? What good is a predictive model if the only people who can understand its output are data scientists?

High-productivity architecture is the answer to many problems

Another aspect that affects ROI is the time to first outcome. Due to the large number of mini-projects within a large IoT project, the first outcome — on which decisions can be based and tested — can take several months, if not years, to observe. The large investments that go into IoT projects risk having nothing to show if the technology pieces do not fit well together. This is a major cause of concern for executive sponsors, as they may be subscribing to a potential black hole.

Organizations can solve these problems by incorporating high-productivity platforms — which support straightforward management and automation of connected devices — in the architecture, which helps development teams prototype and create a product faster.

High-productivity architecture

Rather than talking about one application platform, the idea of high-productivity architecture stems from the fact that a series of platforms need to be interconnected with independent modules integrated together in a way that solves the IoT problem in totality. For example, the IoT reference architecture by Azure and the C3 Integrated Development Studio both use highly productive platforms to build high-productivity architectures. The modules for device integration, data storage, analytics and consumption integrate with a focus on result-oriented iterations rather than lengthy software development cycles. Modules automate repetitive tasks at scale to remove the complexity of DIY and still ensure that technology teams can tweak them to suit their case and reach a shorter time to first outcome.

The key to the success of a high-productivity architecture is having a modular, repeatable, automated, integrated and business-oriented approach to problem solving. This ensures faster turnaround, easier buy-in and the ability to integrate with outside technology seamlessly, saving the effort of building the entire architecture from the bottom up. Moreover, lean teams can deliver greater progress in a short period of time, creating savings. The focus needs to be on fast development centered on a defined problem statement and on using already available platforms to solve a business case, rather than hoping for an outcome from a DIY strategy. This is where high-productivity architecture can really help.

Stacking devices to data to analytics

The typical high-productivity architecture that organizations use to solve an IoT problem would look something like this:

  • No-code connectivity for sensor data and process systems to integrate into a data lake on the cloud.
  • Easy-to-configure business rules engines to create event warnings based on prior knowledge (see the sketch below).
  • Solution-oriented machine learning approaches that identify and surface patterns with minimal rebuild and high reusability.
  • An integrated low-code application framework that connects to data sources and devices, provides secure authentication, and engages services through push notifications and integrated business logic.
  • Highly engaging web and mobile front ends that talk the language of the field engineer and assist them in making faster and better decisions.

Each of these modules is essential for a working IoT system, and each has a level of automation that expedites the integration and prototyping of the entire technology stack.
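
As one example, here is a minimal Python sketch of what the business rules engine module above might look like: declarative threshold rules evaluated against incoming readings. The rule fields and limits are invented for illustration.

```python
# A minimal sketch of a configurable business rules engine: declarative
# threshold rules evaluated against incoming readings. Rule fields and
# limits are invented for illustration.
RULES = [
    {"metric": "bearing_temp_c", "op": ">", "limit": 85.0,
     "warning": "bearing overheating"},
    {"metric": "vibration_mm_s", "op": ">", "limit": 7.1,
     "warning": "vibration above guideline"},
]

OPS = {">": lambda a, b: a > b, "<": lambda a, b: a < b}

def evaluate(reading: dict):
    """Yield an event warning for every rule the reading violates."""
    for rule in RULES:
        value = reading.get(rule["metric"])
        if value is not None and OPS[rule["op"]](value, rule["limit"]):
            yield rule["warning"]

print(list(evaluate({"bearing_temp_c": 91.2, "vibration_mm_s": 3.0})))
```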

High-productivity architecture is designed to ensure that energy and investment are spent on deriving value for the business. It is designed for faster iterations and a sharper focus on outcomes. Finally, it has a shorter learning curve for technology teams and employs the highest quality technologies so that there is no compromise.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


July 17, 2019  10:39 AM

Machine learning tips to build a facial recognition tool

Jonathan Fries Profile: Jonathan Fries
ai, Algorithms, Data, Data scientist, Machine data, Machine learning, machine learning algorithms

Even just a decade ago, it was hard to believe that computers would be able to drive cars or easily recognize pictures of a cat. The programming and AI tools around at that time struggled with computers seeing the world around them and processing that information accurately.

In the last 10 years, machine learning has slowly grown from a field of research to a mature technology with real-world applications used by top organizations across multiple industries. At the most basic level, machine learning creates algorithms that solve some of the most complex and interesting problems that technology organizations face.

Many of these problems have moved from science fiction to established fact. Some have become positively easy, such as handwriting recognition, which is now the “Hello World” of machine learning. Through Exadel’s own experience developing machine learning programs for clients, we wanted to share some of the do’s and don’ts of using machine learning technology.

On the surface, the result of the machine learning process looks much like traditional programming because the end product is a programmatic algorithm that processes information. When it comes to the how of machine learning program creation, it actually looks quite different.

How to create a machine learning model

First, a few don’ts of creating a model: don’t define requirements, design a system or algorithm, write code, test or iterate as you would with traditional software development.

With machine learning you must characterize the problem in a way that makes it susceptible to machine learning, and then you must understand if you have the data that can help solve that problem. Next, you define a model, train the model with training data and test the model, hoping that your training resulted in a high probability of success. If it didn’t, you tweak your model and retrain.

Facial recognition app development

One of our clients came to us for help developing an app to simplify secure check-in to an office space. The client wanted to streamline the visitor check-in process and avoid duplicate data entry. When someone checks into the office, they must enter a few pieces of information, including name and phone number, into a tablet at the front desk. For reasons of privacy, this information needed to be re-entered every time, because we can’t simply display a list of all the people who have previously checked in. Re-entering this information was repetitious for visitors, but it was important for the client to know who was in the office and how to get hold of them.

In order to automate this process and keep the information secure, we decided to use facial recognition to identify visitors and determine whether they had been in the office before. If they had been in the building before, we would have a picture on file and could identify them when they took a picture again. We decided to use machine learning, with open source tools and projects as a baseline. Not surprisingly, we sought out existing tools to develop this application.

In the existing app, when a visitor first comes to the office, they fill in the information on a tablet and the tablet takes their picture. The check-in tool now has a profile and an image that can be used to recognize each individual.

To create this facial recognition system, we used some off-the-shelf machine learning and computer vision (CV) components:

  • Python: generally the language of choice for machine learning today.
  • TensorFlow: an open-source machine learning and neural network toolkit, and the go-to library for numerical computation and large-scale machine learning.
  • scikit-learn: simple and efficient tools for data mining and data analysis.
  • scipy: a free and open source library for scientific and technical computing.
  • numpy: a Python library supporting large, multi-dimensional arrays with a large library of functions for operating on these arrays.
  • OpenCV: an open-source library of functions aimed at real-time computer vision.

These are all very common tools used for machine learning projects. We’ve been working with and adapting open source code to tie all of these components together, including the face recognition using Tensorflow GitHub project.

Developing the code and tools to do facial recognition is important, but, as mentioned above, the core of machine learning is to train the model until the results on test data — which has never been evaluated during training — provide a high-enough level of success to say that the developed neural network algorithm can recognize people in the setting — in this case, checking in at the front desk.

Data is very important here as well. Best practices indicate that you should have training data, validation data and test data. Training data is what the model learns from. Machine learning specialists use validation data to review the trained model, and may then change or tweak inputs based on that review; this is part of the iterative process of developing the model. The model never sees test data until the final testing step. Test data is the gold standard, used only once the model is fully trained, and it can be used to compare the success of two different trained models.
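
Here is a short sketch of that three-way split using scikit-learn, which is already on this project's tool list. The 70/15/15 proportions are an assumption for illustration.

```python
# A sketch of a three-way train/validation/test split via two calls to
# scikit-learn's train_test_split. Proportions are illustrative.
from sklearn.model_selection import train_test_split

def three_way_split(X, y, seed=42):
    X_train, X_rest, y_train, y_rest = train_test_split(
        X, y, test_size=0.30, random_state=seed)     # 70% train
    X_val, X_test, y_val, y_test = train_test_split(
        X_rest, y_rest, test_size=0.50, random_state=seed)  # 15% / 15%
    return (X_train, y_train), (X_val, y_val), (X_test, y_test)
```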

The pre-processing and training processes look like this:

  1. Find the face: Find the face within the image. Real world images contain more than the face, so you first must isolate the pieces that comprise the face.
  2. Posing and projecting faces: Even the best computer algorithms work better if every image has the same proportions. We needed to align the face within the image frame to improve its use with the machine learning model.
  3. Calculate embedding from faces: A human describes the difference between faces using visual human-readable characteristics, such as nose size, face width or eye color. We use neural networks that automatically determine machine-readable features.
  4. Use embeddings to train the model: the step where we train the model on the embeddings from step 3, or run the already trained model at check-in time (see the sketch below).
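
Below is a compressed, hedged sketch of steps 1, 2 and 4 using OpenCV and scikit-learn from the tool list above. The Haar cascade stands in for face detection, and the step 3 embedding model (a FaceNet-style network trained elsewhere) is assumed rather than shown; this is not the project's exact code.

```python
# A compressed sketch of the pipeline: detect and align the face
# (steps 1-2), then fit a classifier on precomputed embeddings (step 4).
import cv2
import numpy as np
from sklearn.svm import SVC

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def find_and_align(image, size=(160, 160)):
    """Steps 1-2: isolate the face and normalize its proportions."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    return cv2.resize(image[y:y + h, x:x + w], size)

def train_classifier(embeddings: np.ndarray, labels: list) -> SVC:
    """Step 4: fit a classifier on the step 3 embeddings."""
    clf = SVC(kernel="linear", probability=True)
    clf.fit(embeddings, labels)
    return clf
```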

Once we have trained the model and tested it, we can deploy it so that it can be used by the tablet program to check newly created images to see if they match anyone who has visited the office before.

We created a web API that the tablet application uses to send in a photo and match the new image against the image database.
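
A minimal sketch of such an endpoint follows, using Flask (an assumption; the article does not name its web framework). The embed() and lookup() helpers are hypothetical stand-ins for the embedding step and the trained model.

```python
# A minimal sketch of a photo-matching endpoint. embed() and lookup()
# are hypothetical stand-ins, not part of the project's actual code.
from flask import Flask, request, jsonify

app = Flask(__name__)

def embed(photo_bytes: bytes):
    """Hypothetical: turn a photo into a face embedding (pipeline step 3)."""
    raise NotImplementedError

def lookup(embedding):
    """Hypothetical: return (visitor_id, confidence) from the trained model."""
    raise NotImplementedError

@app.route("/match", methods=["POST"])
def match_visitor():
    photo = request.files["photo"].read()
    visitor, confidence = lookup(embed(photo))
    if confidence < 0.8:                       # illustrative threshold
        return jsonify({"match": None})
    return jsonify({"match": visitor, "confidence": confidence})
```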

Machine learning is still a relatively nascent technology, but its applications are starting to become more pervasive. As we start to better understand the best practices and uses for machine learning, organizations must have the skills ready to keep up with the competition.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

