IoT Agenda


July 22, 2019  1:02 PM

Stimulating growth in China’s domestic IoT market

Yongjing Zhang Profile: Yongjing Zhang
Connected vehicles, Internet of Things, iot, IoT market, Smart cities

Yongjing Zhang represents Huawei at oneM2M’s technical plenary. In this Q&A, he describes China’s IoT market and the importance of interoperability in IoT solutions.

Please begin by describing your roles and responsibilities within oneM2M.

Zhang: I began working in oneM2M as chair of the management, abstraction and semantics working group, and recently took on a new role as a oneM2M technical plenary vice chair. My job now is to serve the oneM2M community from a broader perspective, which includes external cooperation with other IoT organizations.

Within Huawei, I work for the IoT platform product line in the cloud, AI product and service group as a senior standard manager. My team is responsible for standardization and research around the IoT platform. We also deal with supported service domains, such as connected vehicles and smart cities.

China is a large IoT market in the global context. What are your impressions of what is happening locally?

Zhang: I would say that China is among the fastest-developing regions in the world in terms of the IoT market. The government plays a very important and active role in forming national policies, innovation projects, and industry development guidelines and standards, all of which benefit the IoT industry. On top of this, there is a lot of internal innovation momentum from different enterprises. Together, these factors stimulate the domestic IoT market and will help it grow rapidly.

For example, more than 500 cities in China have smart city projects planned or under construction. The size of the market is projected to reach 1.87 trillion renminbi ($271 billion) by 2021. IoT technologies such as Narrowband IoT (NB-IoT) and city-level IoT platforms are key enablers of this growth. We are seeing these trends in many of the projects in which Huawei participates, including Yingtan, the first NB-IoT-enabled smart city in China.

Another recent growth story in China is the connected vehicle and smart transportation sector, thanks to maturing Cellular Vehicle-to-Everything (C-V2X) technology, such as LTE-V2X, and the release of 20MHz of vehicle-to-everything spectrum at 5.9GHz. Several provinces and cities, such as Wuxi, have built test fields in closed sites, on highways or on open roads for car-road coordinated autonomous driving and smart transportation. End-to-end technology, including the chipset, on-board unit, roadside unit, platform and applications, can now be provided together by industry partners including Huawei, Qualcomm and Datang — who are also members of oneM2M.

There are also many other IoT-driven or related initiatives happening in China across different domains, such as Industrial IoT and Internet+. I’d like to highlight two key success factors common to all these activities — the availability of a common-service IoT platform and the readiness of interoperable standards.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

July 19, 2019  2:37 PM

Beyond smart cities: Driving social change with IoT and analytics

Justin Bean Profile: Justin Bean
Data Analytics, Digital transformation, Internet of Things, iot, IoT analytics, IoT data, IoT data management, Smart cities

The world is deep in the throes of a state of transformation. The adoption of IoT technology is on the rise and the promise of our smart cities of the future is beginning to come to fruition. Our governments and municipalities are evolving past manually collecting and crunching data by looking to new, automated sources of data that can make cities safer, smarter, healthier and more efficient.

Thanks to computer vision and machine learning, existing cameras are used as sensors to deliver insights on traffic, parking, pedestrian patterns and bike lane usage to improve signal patterns and road safety. Water and electricity are monitored for conservation and efficiency, and parks are improving nighttime safety with lights and specialized cameras.

Here in the United States, data-driven approaches have already become commonplace in the private sector. Companies use massive data sets to better understand their retail or e-commerce customers, and to analyze manufacturing plants and logistics to improve routing and eliminate waste and unnecessary costs. But there are significant challenges to deploying IoT technology for the public good on a large scale. Many factors come into play to halt the progress of innovative programs, including costly infrastructure already firmly in place across most developed countries, taxpayer dollars allocated to other initiatives, and high public skepticism or “not in my backyard” attitudes.

To this end, much can be learned by looking overseas at the possibilities of a more connected future, one that encourages more agile, efficient and evidence-based decision-making to improve people’s well-being.

Setting an example

Andhra Pradesh, the eighth largest state in India, is home to approximately 53 million people and is a leading example of how this approach can achieve social change and improve sustainability. Like many states, it is tasked with solving complex challenges such as sustainable resource management, poverty, crime and responding to natural disasters. Unlike most states, Andhra Pradesh has one of the most innovative smart city and smart state programs in the world — an achievement typically attributed to more developed nations, and one that’s making a big difference to its people.

Andhra Pradesh has undertaken a journey of digital transformation to drive economic growth and innovation while creating a safer, more sustainable lifestyle for its people. The state’s ambitious, overarching objective is to achieve the 17 United Nations’ sustainable development goals set for 2030; Andhra Pradesh, however, aims to achieve them by 2022.

United Nations’ 17 Sustainable Development Goals

If that weren’t already enough, this progressive, data-driven government also wants its people to be happy. It has set about using data analytics to quantify what happiness means to its citizenry, what leads to happiness and what the state can do to make happiness possible for all of its people in a measurable way. To the government of Andhra Pradesh, the “common person” is its customer.

To achieve these lofty goals, Andhra Pradesh inaugurated its Real Time Governance Center, the hub for the state’s coordinated effort to provide real-time monitoring and reporting across its entire ecosystem of 13 districts with 33 departments, including more than 300 agencies offering 745 services, each with its own data systems and analytics.

Identifying areas for improvement, the center then uses data to track those improvements with the managers of those departments, agencies and services. The center integrates massive amounts of data gathered from IoT devices, security and traffic cameras that use video analytics, department databases and citizen surveys, along with real-time data from its various functions, to enable constructive real-time governance.

Emphasizing accountability and transparency for all of its citizens, the state’s real-time governance initiative takes this approach to enhance several key areas of government:

  • Government services, such as education, ration and pension distribution, agricultural subsidies, and management of infrastructure resources such as streetlights, water reservoirs, buildings and energy.
  • Disaster relief and management, with the goal of predicting dangerous weather events and providing timely warning to citizens prior to the onset of a natural disaster to save lives and reduce the negative impacts.
  • Public safety, with the goal of making Andhra Pradesh one of the safest places to live in the world.

Solving human problems with data analytics

Andhra Pradesh has set admirable goals to improve life for its large population and solve social and economic challenges — many of which are shared by other, similar communities throughout the world. Having an accurate understanding of the current situation and historical trends is key to measuring and solving these issues. However, many cities in both developing and developed nations lack good data and good data management provided through smart spaces technologies. Manual observation yields only a very small snapshot of what is taking place in any given area. Without the holistic view of a society that can be gained from using an electronic or automated approach to collecting, analyzing and drawing insights from data, it’s difficult for agencies to get apples-to-apples comparisons and collaborate with each other to achieve their goals.

Data analytics is foundational to Andhra Pradesh’s success in overcoming its challenges and achieving its objectives. For example, the elimination of waste and corruption are important steps to moving toward significant positive social and environmental change. By using data analytics, Andhra Pradesh can track goods or food rations to ensure that they’re not lost or stolen and that the people they are intended for don’t go hungry.

A role model for data-driven social transformation

By ensuring the program has put the right solutions in place, Andhra Pradesh’s leaders have the insights and tools they need to improve their citizens’ lives. These aren’t just hopeful utopian visions without a plan, however — they’re based on measurable information, holistic management and results. By taking a data-driven approach to creating a better life for its citizens, Andhra Pradesh is not only setting the bar for what positive social change means, it’s also providing a model for how other communities around the world can use pragmatic approaches to solve real human and sustainability challenges — to transition our global civilization to one that will not only survive the decades to come, but will help all people thrive along the way.

Andhra Pradesh can be a role model for states, cities and nations around the world taking a data-driven approach to improving the lives of their citizens while creating thriving, sustainable economies. To achieve these goals, other governments can gather data to:

  • Assess the needs of their people: What are the reported and measured challenges they face, and what is their scale? What factors create these challenges, and how can they be addressed?
  • Set measurable goals and track key performance indicators that are regularly reported on and used to evolve strategies to improve performance.
  • Create accountability by assigning clear roles, setting achievable expectations and specifying which goals each person or team will own, with defined rewards for good performance and learning mechanisms to correct poor performance.
  • Engage the public for feedback and ideas for improvement, while providing transparency into how the data is used and how the organization is performing.
  • Practice agility to change programs or approaches that aren’t working and learn from results regularly, allowing lessons learned to affect strategies, tactics and actions.
  • Share lessons and challenges with sister cities, states and nations to gain new perspectives, ideas and resources for improving life and sustainability for people today and in the future.

With these data-driven approaches to improving lives, economies and our environment, we can achieve the UN’s Sustainable Development Goals together and create a thriving, fair and sustainable society for future generations.



July 19, 2019  2:15 PM

How 5G, edge computing and IoT create the perfect storm

Bruce Kornfeld Profile: Bruce Kornfeld
5G, 5G and IoT, 5G technology, Edge computing, Internet of Things, iot, IoT edge computing, IoT strategy

In just three decades, cellular network advancements have introduced analog cell phones, text and voice messaging, GPS location capabilities and breakthrough speed improvements with each generation. First introduced in late 2018, 5G wireless technology promises speeds approximately 20 times faster than today’s 4G cellular networks.

With enhanced speed comes a natural influx of data creation at any location on earth at any time. Edge computing has been wildly helpful for quickly processing, storing and analyzing that influx of real-time data produced by a variety of devices. With sophisticated broadband now quite literally in our pockets — along with that calculator your algebra teacher said would never be there — every human with a smartphone could be considered an edge site.

Verizon estimates that about 8.4 billion connected devices are in use — up 31% from 2016 — and that the number will increase to more than 20.4 billion by 2020. As IoT continues to drive the data explosion at the edge, and 5G enters the scene with promises like lower latency, improved speeds and higher capacities, virtualization and powerful edge computing technology must keep up. Today’s edge computing was designed for today’s technology, but there are many innovations that will go mainstream in the next one to two years, including 5G, open source hypervisors and containers. How will the modern IT admin keep up, simplify and automate operations?

Edge-based 5G: Can you hear me now?

5G will make data processing dramatically faster and more efficient. More information than ever before will need to be analyzed quickly, thanks to millions of devices producing data while connected to the 5G network. Prior to 5G, edge computing sites had, of course, used wireless technology whenever feasible, but sites today typically require a traditional, costly wired Ethernet internet connection to access the WAN. This connection is in place to transmit a small portion of data back, such as daily sales transactions, to a home office, cloud or data center for monitoring, backup, archive and data analysis. In some cases, 5G connectivity could eliminate the need for a wired connection, so more edge sites will pop up and be not just affordable, but easier to build, deploy and manage. This will lead to new IoT use cases, data collection and monitoring applications never before imagined.

In a report from Gartner, researchers predicted that by 2024, 60% of countries will have 5G network service provisions available from at least one cloud service provider. As 5G adoption takes place at the edge — where IoT requirements are growing fast — the need for high availability, low cost and simple management becomes even more important because these sites are typically small and have no IT staff to help. Just as 5G needs the edge to be successful, the edge needs 5G to run more efficiently and cost effectively.

IoT reimagined and automated with edge-based 5G

IoT today focuses on how machines and connected devices communicate with each other. In the real world, IoT cannot stand on its own; the network always serves a specific purpose and delivers an operational improvement for an organization. While edge is strengthening the convergence of operational technology and IT, perhaps IoT is also morphing to assume a broader definition: infrastructure operational technology.

5G has greater implications for IoT than simply improving phone speeds; the latest network generation will deliver increased processing power to connected IoT devices and systems across various enterprises. With 5G embedded in IoT devices, new forms of operational technology and process control will be introduced, alongside levels of automation that some industries have never seen before. Edge computing, 5G speeds and endless IoT data combine to create automation. Real-world examples include:

  • Traffic control: In smart cities, real-time monitoring becomes more widely available due to faster speeds. Automated decision making improves traffic alerts and rerouting processes, to reduce collision rates and road rage, saving lives.
  • Factory operations: It will become more widespread for food processing plants to automatically monitor temperatures to avoid over- and under-heating, eliminating waste and spoilage, and to deliver higher quality products, saving labor costs and money.
  • Autonomous vehicles: Centralized monitoring could potentially track semi-driverless cars in real time, eliminating the need for a human co-pilot. Centralized co-pilot centers assume human responsibility to monitor directions, upcoming traffic patterns and emergency alerts, ensuring driver safety.

The perfect storm

While 5G is not quite ready for prime time, it is rolling out now. In a Gartner report, researchers note that 5G could become an important enabler of edge computing, but won’t reach its full potential until at least 2023. To prepare, IT administrators should start thinking about how they will incorporate 5G into their IoT strategy to speed transmission and create new applications. Organizations should research and shortlist partners now to ease the transition for their future ecosystems.

Edge has kept up with the technology races and there are many offerings that are simple to operate, not dependent on specific hardware and can encrypt at the software layer to alleviate security concerns in the software-defined 5G network of tomorrow. The future is almost here and promises to be cheaper, faster and better connected than ever before.



July 19, 2019  10:13 AM

High-productivity architecture creates a faster path to ROI for IIoT

Abhishek Tandon Profile: Abhishek Tandon
IIoT, industrial internet of things, Internet of Things, iot, IoT platforms

One of the biggest dampeners on industrial internet of things (IIoT) projects today is the lack of ROI. The journey from device to analytics to results can take years, leaving stakeholders with very little to show against millions of dollars of investment.

A finished IIoT system is a sum of many parts, such as sensor deployments, device-to-cloud integrations, communication protocols, security, analytics and application interfaces, all of which need to be designed and integrated. Unless all of the pieces come together, the business objective will not be fulfilled. What’s the point of having a cloud strategy and storing petabytes of data if the intended recipients never see the analysis? What good is a predictive model if the only people who can understand its output are data scientists?

High-productivity architecture is the answer to many problems

Another aspect that affects ROI is the time to first outcome. Because a large IoT project contains many mini-projects, the first outcome — one that decisions can be based on and tested against — can take several months, if not years, to observe. The large investments that go into IoT projects risk having nothing to show if the technology pieces do not fit together well. This is a major concern for executive sponsors, who may feel they are subscribing to a potential black hole.

Organizations can solve these problems by incorporating high-productivity platforms — which support straightforward management and automation of connected devices — in the architecture, which helps development teams prototype and create a product faster.

High-productivity architecture (HPA)

Rather than referring to one application platform, the idea of high-productivity architecture stems from the fact that a series of platforms need to be interconnected, with independent modules integrated in a way that solves the IoT problem in its totality. For example, the IoT reference architecture by Azure and the C3 Integrated Development Studio both use highly productive platforms to build high-productivity architectures. The modules for device integration, data storage, analytics and consumption integrate with a focus on result-oriented iterations rather than lengthy software development cycles. Modules automate repetitive tasks at scale to remove the complexity of DIY, while still ensuring that technology teams can tweak them to suit their use case and reach a shorter time to first outcome.

The key to the success of a high-productivity architecture is a modular, repeatable, automated, integrated and business-oriented approach to problem solving. This ensures faster turnaround, easier buy-in and the ability to integrate seamlessly with outside technology, saving the effort of building the entire architecture from the bottom up. Moreover, lean teams can deliver greater progress in a short period of time, creating savings. The focus needs to be on fast development centered on a defined problem statement and on using already available platforms to solve a business case, rather than hoping for an outcome from a DIY strategy. This is where high-productivity architecture can really help.

Stacking devices to data to analytics

The typical high-productivity architecture that organizations use to solve an IoT problem would look something like this:

  • No-code connectivity for sensor data and process systems, integrating into a data lake on the cloud.
  • Easy-to-configure business rules engines that create event warnings based on prior knowledge.
  • Solution-oriented machine learning approaches that identify and surface patterns with minimal rebuild and high reusability.
  • An integrated low-code application framework that connects to data sources and devices, provides secure authentication, and engages users through push notifications and integrated business logic.
  • Highly engaging web and mobile front ends that speak the language of field engineers and help them make faster, better decisions.

Each of these modules is essential for a working IoT system, and each has a level of automation that expedites the integration and prototyping of the entire technology stack.

High-productivity architecture is designed to ensure that energy and investment are spent on deriving value for the business. It is designed for faster iterations and a sharper focus on outcomes. Finally, it has a shorter learning curve for technology teams and employs high-quality technologies so that there is no compromise.



July 17, 2019  10:39 AM

Machine learning tips to build a facial recognition tool

Jonathan Fries Profile: Jonathan Fries
ai, Algorithms, Data, Data scientist, Machine data, Machine learning, machine learning algorithms

Even just a decade ago, it was hard to believe that computers would be able to drive cars or easily recognize pictures of a cat. The programming and AI tools available at that time struggled to let computers see the world around them and process that information accurately.

In the last 10 years, machine learning has steadily grown from a field of research into a mature technology with real-world applications, used by top organizations across multiple industries today. At the most basic level, machine learning creates algorithms that solve some of the most complex and interesting problems technology organizations face.

Many of these problems have moved from science fiction to established fact. Some have become positively easy, such as handwriting recognition, which is now the “Hello World” of machine learning. Through Exadel’s own experience developing machine learning programs for clients, we wanted to share some of the do’s and don’ts of using machine learning technology.

On the surface, the result of the machine learning process looks much like traditional programming, because the end product is a programmatic algorithm that processes information. The process of creating a machine learning program, however, looks quite different.

How to create a machine learning model

First, a few don’ts of creating a model: don’t define requirements, design a system or algorithm, write code, test or iterate as you would with traditional software development.

With machine learning, you must first characterize the problem in a way that makes it amenable to machine learning, and then determine whether you have data that can help solve that problem. Next, you define a model, train the model with training data and test the model, hoping that training resulted in a high probability of success. If it didn’t, you tweak your model and retrain.

Facial recognition app development

One of our clients came to us for help developing an app to simplify secure check-in to an office space. The client wanted to streamline the visitor check-in process and avoid duplicate data entry. When someone checks into the office, they must enter a few pieces of information, including name and phone number, into a tablet at the front desk. For privacy reasons, this information had to be re-entered every time, because we can’t simply display a list of everyone who has previously checked in. Re-entering it was repetitious for visitors, but important for the client, who needs to know who is in the office and how to reach them.

To automate this process while protecting the information, we decided to use facial recognition to identify visitors and determine whether they had been in the office before. If they had, we would have a picture on file and could identify them when they took a picture again. Not surprisingly, we sought out existing machine learning tools and open source projects as a baseline for this application.

In the existing app, when a visitor first comes to the office, they fill in the information on a tablet and the tablet takes their picture. The check-in tool now has a profile and an image that can be used to recognize each individual.

To create this facial recognition system, we used some off-the-shelf machine learning and computer vision (CV) components:

  • Python: generally the language of choice for machine learning today.
  • TensorFlow: an open-source machine learning and neural network toolkit, and the go-to library for numerical computation and large-scale machine learning.
  • scikit-learn: simple and efficient tools for data mining and data analysis.
  • scipy: a free and open source library for scientific and technical computing.
  • numpy: a Python library supporting large, multi-dimensional arrays with a large library of functions for operating on these arrays.
  • OpenCV: an open-source library of functions aimed at real-time computer vision.

These are all very common tools for machine learning projects. We’ve been working with and adapting open source code to tie all of these components together, including the “face recognition using TensorFlow” GitHub project.

Developing the code and tools to do facial recognition is important but, as mentioned above, the core of machine learning is training the model until its results on test data — data never evaluated during training — show a high enough success rate to say that the resulting neural network can recognize people in the target setting; in this case, checking in at the front desk.

Data is very important here as well. Best practices call for three data sets: training data, validation data and test data. Training data is the data the model learns from. Machine learning specialists use validation data to review the trained model and then change or tweak inputs based on what they see; this is part of the iterative process of developing the model. The model never sees test data except in the final testing step. Test data is the gold standard, used only once the model is fully trained, and it can also be used to compare the success of two different trained models.

The pre-processing and training processes look like this:

  1. Find the face: Find the face within the image. Real-world images contain more than the face, so you must first isolate the region that comprises the face.
  2. Posing and projecting faces: Even the best computer algorithms work better if every image has the same proportions. We needed to align the face within the image frame to improve its use with the machine learning model.
  3. Calculate embedding from faces: A human describes the difference between faces using visual human-readable characteristics, such as nose size, face width or eye color. We use neural networks that automatically determine machine-readable features.
  4. Use embeddings for training or recognition: The step where we train the model on the embeddings, or run a new embedding through the trained model.

Once we have trained the model and tested it, we can deploy it so that it can be used by the tablet program to check newly created images to see if they match anyone who has visited the office before.

We created a web API that the tablet application uses to send in a photo to potentially match the new image against the image database.

Machine learning is still a relatively nascent technology, but its applications are becoming more pervasive. As the best practices and uses for machine learning become better understood, organizations must have the skills ready to keep up with the competition.



July 17, 2019  9:30 AM

IoT is fuel for the fourth Industrial Revolution

Mohamed Kande Profile: Mohamed Kande
fourth Industrial Revolution, IIoT, IIoT strategy, industrial internet of things, Industrial IoT, Internet of Things, iot

I’ve been promoting the progress of the fourth Industrial Revolution for a few years and I’m excited to see the momentum of the 4IR conversation in the market. However, it’s still an enigma for many organizations that need help to figure out where to start.

The 4IR isn’t just about technology. Although the 4IR does encompass different types of technology, including artificial intelligence, IoT and robotics, the non-tech arena is equally important. We must remember to keep the workforce, cybersecurity, resiliency, trust or distrust of technology and business model innovation top of mind. Marrying the tech and non-tech areas is key to success in the 4IR.


Success can take many forms — such as more efficient operations, better customer engagement, more productive employees and the creation of new businesses — but the primary goals are to solve business problems and generate new opportunities.

What differentiates this revolution from the rest

Like the three previous industrial revolutions — steam, electricity and digital — the 4IR represents a giant leap forward for business. Unlike the first three revolutions, during which people connected to machines, in the 4IR, machines connect to other machines and people. That’s a profound difference.

Why? Because all those machines that are connecting to other machines create vast amounts of data, far more than people can produce on their own, leading to a crucial 4IR challenge: trust.

With all the breaches that have taken place, distrust seems more prevalent than ever. Banks, retailers, social media platforms and organizations that routinely ask people to trust them to protect their data face a lot of skepticism these days. The 4IR goes way beyond that; it basically asks humans to trust millions of devices and machines and all the information they produce and distribute. This is a critical issue for organizations to address if they don’t want to lose out on the opportunities 4IR offers.

Consider the economic challenges and prepare a strategy

Despite the need to overcome the trust gap, several economic challenges reinforce that now is an ideal time to move forward with a 4IR initiative. Nearly 30% of the CEOs in PwC’s 22nd Annual Global CEO Survey forecast a decline in economic growth in 2019, up from just 5% who felt that way in the prior year. During the same period, optimism among North American CEOs dropped from 63% to 37%. That’s unsettling.

How should your organization deal with these tough business challenges? Enterprises that build a technological and cultural framework that supports the 4IR will be far better prepared to ride out any seismic changes that may unfold.

Start to scope out the 4IR world, but move at your own pace

As you begin considering your 4IR strategy, be sure that you develop an understanding of the essential eight technologies. In particular, pay attention to IoT, the glue that binds all connected tools, technologies and systems in a 4IR organization. These connected devices go beyond merely boosting efficiency and cutting costs; they can also disrupt an organization or an entire industry.

But there’s no need to start by tackling every 4IR technology at once. Instead, begin with a manageable use case and build up and out from there. For example, asset trackers can instantly identify the location of lost hospital equipment and can reduce the number of lost, stolen and misplaced devices and machines. Other types of sensors can identify and count valued customers, so they can be given an exceptional experience and wait times can be reduced. Hotel occupancy sensors can determine when people are physically present in a room, so the lighting and temperature can adjust automatically.

These modest implementations do more than increase productivity, efficiency and customer satisfaction. Organizations can join the 4IR at their own pace, by automating tasks and then scaling up from there.

Don’t underestimate the importance of culture

Since IoT and the 4IR are about more than just technology, you must think about cultural considerations that are part of a connected business model. A vital factor here involves upskilling the workforce, yet approximately 60% of organizations in PwC’s digital IQ survey reported that their employees lacked the digital skills needed to move their organization into the future.

To be successful with the 4IR, organizations must establish partnerships that bring expertise and new ideas to the table, but they must also bring employees on the journey. It’s about more than tech training and bringing in tech expertise; it’s about reshaping digital mindsets and creating a collaborative culture.

Consider the potential

All of this may seem like a big investment in time and resources, but the potential ROI makes it worthwhile. Wouldn’t you applaud an investment that resulted in better service to customers, enhanced employee productivity, the development of new products and services and the creation of new, innovative business models? I would and do. Already, the effects of the 4IR on business and industry are clear and we’re just getting started.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


July 16, 2019  5:33 PM

IoT investment has to begin now

Nima Negahban Profile: Nima Negahban
Internet of Things, iot, IoT data, IoT market, IoT use cases

It’s been widely reported that the IoT market is expected to generate significant opportunity, upwards of $267 billion, by 2020. Numbers aside, one of the main drivers is the rapid growth of IoT and connected devices adopted at both the consumer and industrial levels, and an extreme side effect of that growth is a massive explosion of data. The amount of data created by all these devices is expected to reach 847 zettabytes per year by 2021.

The IoT potential can be limitless if businesses can identify and understand how these devices and the data they create affect people’s daily lives. Let’s look at a few examples of the opportunities ahead with IoT data.

Consider an energy exploration organization and its ability to uncover viable energy resources. What if it were able to visualize and analyze data from an unlimited source of sensor data points? It would make the process of finding natural resources much more efficient and equip scientists with highly accurate information. This would change the way exploration is done, reducing the risks to the environment and potentially cutting down the time it takes to complete the process.

We’re witnessing more and more examples of IoT at work. Another example is a large delivery company that uses thousands of delivery vehicles and staff to deliver packages to more than 150 million addresses. The organization can use near real-time location data from connected devices to streamline delivery routes and reduce inefficiencies in distribution methods.

These are two examples of current IoT use cases, and there are a lot of smart things out there; even a light bulb has an IP address these days. The time to monetize IoT applications is now. Collecting and storing IoT data is a good start, but it’s more meaningful to understand it, analyze it and use the insights to improve efficiency and effect meaningful change.

The unlimited potential of IoT carries across multiple industries that all stand to benefit from the technology and the data that comes with it, whether through energy savings, delivery route optimization or predictive maintenance. A focus on location intelligence, advanced predictive analytics and the rise of streaming data analysis will undoubtedly drive a return on IoT investments.



July 16, 2019  4:32 PM

Three Es of AI: First an intro to ethical artificial intelligence

Scott Zoldi Profile: Scott Zoldi
Artificial intelligence, Bias, ethical AI, ethics, Internet of Things, iot, Machine learning

This is the first piece in a three-part series.

There is a lot of controversy in business circles as to whether organizations use AI technology for unethical purposes. This blog isn’t about that at all.

From a data scientist’s point of view, ethical AI is achieved by taking precautions to expose what the underlying machine learning model might learn that could impute bias. At first glance, latent features of the model or relationships between data may not appear to be biased, but deeper inspection shows that the analytic results the model produces are biased toward a particular data class.

Bias can be imputed through confounding variables

One of the most common misperceptions I hear about bias is “If I don’t use age, gender, race or similar factors in my model, it’s not biased.” Well, that’s not true. Even though the people holding this opinion know that machine learning can learn relationships between data, they don’t realize that other captured features can act as proxies for biased data types. These proxies are called confounding variables and, as the term indicates, these unintended variables can confuse the model into producing biased results.

For example, if a model includes the brand and version of an individual’s mobile phone, that data can be related to the ability to afford an expensive cell phone, a characteristic that can impute income. If income is not a factor you want to use directly in the decision, imputing it from data such as the type of phone or the size of an individual’s purchases introduces bias into the model. Likewise, consistently high purchase amounts suggest what an individual can afford to spend over time, again imputing income bias.
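
The proxy effect is easy to demonstrate with synthetic data. In the sketch below, every number is invented: income never appears as a model feature, yet a score built only on phone price still tracks income through the proxy.

```python
import random

random.seed(0)

# Hypothetical population: income is withheld from the model,
# but phone price is a noisy function of income (a proxy).
incomes = [random.gauss(50_000, 15_000) for _ in range(1_000)]
phone_prices = [0.01 * inc + random.gauss(0, 50) for inc in incomes]

# A toy "model" that scores people using only the phone price feature.
scores = [1 if p > 500 else 0 for p in phone_prices]

def pearson(xs, ys):
    # Plain Pearson correlation coefficient.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Income was excluded, yet the model's output still correlates with it.
print(round(pearson(scores, incomes), 2))
```

Dropping the sensitive column is not enough; the correlation survives in the proxy, which is exactly the confounding-variable problem described above.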

Research into the effects of smoking provides another example of confounding variables. In decades past, research essentially made the correlation: if you smoke, your probability of dying in the next four years is fairly low, so smoking must be OK. The confounding variable in this conclusion was the age distribution of smokers. At the time, the smoking population contained many younger smokers whose cancers would develop later in life, while many of the older smokers were already deceased. Thus, the analytic model contained overwhelming bias and created a biased perception about the safety of smoking.
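
A toy calculation, with invented cohort sizes and mortality rates, shows how the age confounder flips the conclusion: smoking raises four-year mortality within every age band, yet the crude, unstratified comparison makes smokers look safer because the smoking cohort skews young.

```python
# Hypothetical four-year mortality: driven mostly by age,
# with smoking multiplying the risk within each age band.
def mortality(age, smoker):
    base = 0.02 if age == "young" else 0.30
    return base * (1.5 if smoker else 1.0)

# Invented cohort: smokers skew young, non-smokers skew old.
people = ([("young", True)] * 700 + [("old", True)] * 100 +
          [("young", False)] * 300 + [("old", False)] * 500)

def death_rate(group):
    return sum(mortality(a, s) for a, s in group) / len(group)

smokers = [p for p in people if p[1]]
non_smokers = [p for p in people if not p[1]]

# Crude comparison: smokers appear *safer*, purely because of age mix.
print(round(death_rate(smokers), 3), round(death_rate(non_smokers), 3))
```

Stratifying by age (comparing young smokers with young non-smokers, and old with old) recovers the true effect; the crude pooled rates hide it.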

In the 21st century, similar bias could be produced by a model concluding that, since far fewer young people smoke cigarettes than 50 years ago, nicotine addiction levels are down, too. However, youth use of e-cigarettes jumped 78% between 2017 and 2018, to one out of every five high-school students. E-cigarettes are potent nicotine delivery devices, fostering rapid nicotine addiction.

The challenge of delivering truly ethical AI requires closely examining each data class separately. As data scientists, we must demonstrate to ourselves and the world that AI and machine learning technologies are not subjecting specific populations to bias.



July 16, 2019  2:48 PM

IoT PCB microelectronics manufacturing calls for intricate tools

Zulki Khan Profile: Zulki Khan
electronics, Internet of Things, iot, IoT hardware, IoT PCB, PCB, wiring

Electronics manufacturing services providers that use conventional printed circuit board assembly and manufacturing models are scrambling to augment their production facilities to handle ever-smaller boards like those used for IoT devices.

At today’s advanced printed circuit board (PCB) houses, savvy leaders are homing in on newer technologies and merging conventional surface-mount technology manufacturing with the newer microelectronics manufacturing. Why? Because sophisticated IoT PCBs demand a different breed of inspection and calibration to comply with extremely miniature dimensions and that’s where PCB microelectronics lives.

Take, for example, the newer high-powered laser microscopes introduced on the microelectronics assembly and manufacturing floor. These tools are tailored to perform inspection and calibration tasks that legacy PCB manufacturing systems cannot handle because those systems cannot deal with the minutest details imaginable. Those minute details are the cornerstone of microelectronics manufacturing.

Capabilities of advanced laser microscopes

Tools such as laser microscopes perform die or chip, epoxy resin, solder mask bleeding and air bridge inspections. They also calculate Z-axis dimensions and create 3D renderings. Why are these manufacturing tasks so important? All fall under the umbrella of assuring IoT device reliability and operational integrity. These highly advanced laser microscopes check for die surface defects, such as extremely fine cracks or minuscule chipping at the corners of a die. The scopes also quickly spot corrosion, contamination or oxidation.

It’s also important to prevent a floating die, which occurs when miscalculating the amount of epoxy under the die during the die attach process results in epoxy resin bleeding. In other words, the die isn’t completely attached to the substrate. The microscopes verify that a poorly produced die attach doesn’t go any further in the IoT PCB manufacturing process.

Microscopes perform inspection for solder mask bleeding. This means the mask may bleed onto the pad where wire bonding is installed. The pad’s size may not be sufficient to perform the bonding. Again, these high-powered scopes inspect and verify this issue doesn’t exist.

The microscopes also inspect air bridges. An air bridge is the air distance created to bypass a component located between two other components. It connects a wire bond from one point to another and passes over the middle component.

Aside from inspections, microscopes calculate length and width, as well as height in the Z-axis, when dealing with height restrictions. Sometimes dies are attached in gold surface finish cavities, and they need to be precisely height controlled. This permits a perfect fit in those cavities before wires are attached via wire bonding after die attach. The high-powered scopes are perfect tools to view cavity length and depth and to perform height measurements in the Z-axis.

Finally, 3D renderings of the wire bond pads, substrate height or paste height give microelectronics manufacturing technicians a clear visual. Technicians can calculate the length of the wire bonds, their loop curvatures and the underfill thickness needed for proper die attach and accurate wire bonding.

When it comes to IoT PCB microelectronics manufacturing, there should not be any question about the tools needed to achieve accurate inspections and calibrations.



July 15, 2019  1:20 PM

How IoT empowers GPS tracking technology

Ekim Saribardak Profile: Ekim Saribardak
Fleet Management, gps, Internet of Things, iot, IoT devices, real-time location system, Smart sensors

IoT is truly a ground-breaking innovation that affects everything in our lives — from kitchen appliances to major business operations. Things like GPS trackers, sensors and even everyday objects can all be embedded with electronics and internet connectivity to make them smart. These smart objects can connect to the cloud, be remotely controlled and communicate with each other. IoT paved the way for tech companies to create tailor-made solutions for business operations and daily challenges. The radical changes in sophisticated mobile devices, computers and advanced machinery were somewhat expected because of the speed of their development in recent years, but controlling the temperature of your coffee cup via your smartphone was a pleasant surprise for everyone. The ability to create vast networks of connected devices has already deeply affected every industry in the world, leading to IoT being widely considered a new milestone in technological history.


What IoT means for the future of GPS

Although GPS technology has been around for decades, the emergence of IoT has reshaped the way we use GPS-based applications and devices. IoT technology enhances GPS devices to transmit data remotely and connect to other systems and sensors. Modern-day tracking devices can collect and transmit comprehensive vehicle data, including fuel monitoring, remote temperature monitoring and driver identification.

Take a look at some examples of how IoT connectivity can enhance the capabilities of GPS tracking technology:

Improving logistics and transport operations: The biggest challenge in the transport and logistics industry is keeping track of vehicles and where they are headed. Managing a fleet operation can be a logistical nightmare. With hundreds of moving assets and employees scattered across cities or even countries, it becomes almost impossible to track their movements, plan efficient routes and provide customers with accurate ETAs. The introduction of IoT-enabled tracking devices completely changed the way fleet companies handle their business operations. Field managers now have access to actionable real-time information about their vehicles, giving them total control over their assets to optimize the performance of their workforce. Organizations can use real-time location tracking to make immediate route changes in the event of road closures or traffic, and to respond promptly when customers require urgent services.
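
As a minimal illustration of that kind of re-routing decision (the route names and delay figures below are invented), a dispatcher simply picks the candidate route with the lowest total expected travel time once live delay estimates arrive:

```python
# Hypothetical candidate routes with live traffic delay estimates (minutes).
routes = {
    "A40 via Oxford St": {"base_min": 34, "traffic_delay_min": 22},
    "Ring road":         {"base_min": 41, "traffic_delay_min": 3},
    "Canal route":       {"base_min": 47, "traffic_delay_min": 0},
}

def best_route(routes):
    # Choose the route with the lowest total expected travel time.
    return min(routes,
               key=lambda r: routes[r]["base_min"] + routes[r]["traffic_delay_min"])

print(best_route(routes))  # "Ring road": 44 min total beats 56 and 47
```

Real fleet systems layer far more onto this (vehicle capacity, delivery windows, driver hours), but the core of a live re-route is exactly this comparison, re-run whenever fresh telemetry arrives.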

Providing solutions for people with disabilities: IoT has promising implications for the lives of people with disabilities. It can be frustrating to depend on other people to travel or take care of your daily needs. New innovations are developing rapidly to empower people with disabilities with the help of advanced sensors and IoT technology. Once a seemingly far-fetched idea, self-driving vehicles will soon be available to those unable to drive themselves. Personal GPS tracking devices offer a myriad of tools to assist people and ensure their safety, such as a panic button, event alerts and live tracking. GPS trackers with IoT-enabled sensors provide the necessary navigation tools when moving around and can scan the surroundings of people with visual impairments to direct their movements and keep them safe. Tracking devices also provide additional safeguards in the form of personal locators.


Tracking devices for parents: Losing a child is every parent’s worst nightmare, and parents have every right to be worried. The statistics on missing and abducted children show that our kids are very vulnerable to the outside world. IoT-enabled tracking devices are the best way to give parents peace of mind without restricting a child’s freedom. These devices create a protective barrier around children so parents can keep an eye on them at all times. By drawing virtual fences around specific locations, such as home and school, parents can know their children are where they should be without having to actively watch the tracker. Tools such as instant event alerts and an SOS button can be lifesavers in emergencies, giving parents the chance to rush to their child’s aid.
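
Under the hood, a virtual fence is often just a centre point and a radius. Here is a minimal sketch of the check, using the standard haversine great-circle distance; the coordinates and 200 m radius are made-up example values:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres between two lat/lon points.
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(pos, centre, radius_m):
    # True when the latest fix lies within the circular fence.
    return haversine_m(pos[0], pos[1], centre[0], centre[1]) <= radius_m

school = (51.5007, -0.1246)  # hypothetical fence centre
fix = (51.5010, -0.1250)     # latest GPS fix from the tracker
print(inside_geofence(fix, school, 200))  # True: within the 200 m fence
```

The tracker (or the cloud backend) runs this check on every fix and raises an alert on the transition from inside to outside, rather than on every out-of-fence sample.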

Safer driving with IoT sensors: Vehicle tracking devices reveal a surprising amount of information about our driving habits. IoT technology greatly improves driver-monitoring capabilities, with instant updates about behavior behind the wheel. With sensors onboard, tracking devices detect speeding, idling, harsh braking and other risky driving practices, and help prevent them. For businesses with a fleet of vehicles, fleet management systems integrated with IoT and GPS technologies have unprecedented benefits when it comes to safety and security. With the driver performance reports provided by the management system, unruly drivers can be identified and encouraged to adopt safer driving habits through training schemes, warnings or rewards for safe driving. In emergencies, managers can locate vehicles and drivers on demand to dispatch emergency services to the scene, potentially preventing serious injuries and drastically reducing vehicle downtime.
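
For instance, harsh braking can be flagged from nothing more than consecutive speed samples. The sketch below computes deceleration between one-second GPS fixes; the 3.5 m/s² threshold and the telemetry values are assumptions, and real systems tune such thresholds per fleet:

```python
# Hypothetical speed samples in km/h at 1-second intervals.
speeds_kmh = [62, 61, 60, 44, 30, 29, 28]

HARSH_BRAKE_MS2 = 3.5  # assumed threshold in m/s^2

def harsh_braking_events(samples, interval_s=1.0):
    # Flag every interval whose deceleration exceeds the threshold.
    events = []
    for i in range(1, len(samples)):
        decel = (samples[i - 1] - samples[i]) / 3.6 / interval_s  # m/s^2
        if decel >= HARSH_BRAKE_MS2:
            events.append((i, round(decel, 2)))
    return events

print(harsh_braking_events(speeds_kmh))
```

In this made-up trace, the drop from 60 to 30 km/h over two seconds triggers two events, which a fleet platform would roll into the driver's performance report.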


Superior asset monitoring: Transporting sensitive equipment and cargo is an extremely demanding undertaking. Perishable goods such as vegetables, baked goods and meat products must be kept in a carefully controlled environment when transported over long distances, which means they must be monitored around the clock. Traditionally, drivers had to complete this laborious process manually by inspecting the cargo at regular intervals. These days, field managers use GPS tracking devices containing IoT-enabled sensors installed into each vehicle at relatively low cost to monitor every vehicle and cargo hold simultaneously. Advanced sensors can detect any changes in the condition of the cargo hold and effortlessly transfer the data to the cloud. Field managers can check the state of their goods remotely — from anywhere in the world — and make necessary changes, such as adjusting the temperature and humidity settings or informing the driver about a malfunction.
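
The monitoring logic itself can be as simple as comparing each sensor's latest reading against a safe band. In this sketch, the sensor IDs, readings and the 2-8°C band are all illustrative values:

```python
# Hypothetical cold-chain readings: (sensor_id, temperature in Celsius).
readings = [("reefer-1", 3.8), ("reefer-2", 9.4), ("reefer-3", 4.1)]

SAFE_RANGE_C = (2.0, 8.0)  # assumed band for chilled perishables

def out_of_range(readings, low, high):
    # Return every sensor whose latest reading falls outside the safe band.
    return [(sid, t) for sid, t in readings if not (low <= t <= high)]

alerts = out_of_range(readings, *SAFE_RANGE_C)
print(alerts)  # reefer-2 is too warm and needs attention
```

A production system would add debouncing (only alert after several consecutive out-of-band readings) and push the alert to the field manager rather than printing it, but the core comparison is this simple.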

IoT technology is spreading like wildfire throughout the world. It’s already had a huge effect on many aspects of our lives, and it’s still evolving. GPS-based devices and tracking systems have been considerably enhanced by the emergence of IoT-enabled systems and connected sensors. With IoT’s superior connectivity and the vast network of connected devices, technological marvels like self-driving vehicles, smart homes and even smart cities have become possible.


