IoT Agenda


May 9, 2018  1:23 PM

Treating IoT security as a first-class citizen

Dipesh Patel Profile: Dipesh Patel
Communications, Encryption, Enterprise IoT, Firmware, Internet of Things, iot, iot security, PSA, security in IOT, TLS

As recent security breaches have shown, malicious actors are becoming increasingly clever and sophisticated. They break in through unlocked doors, such as weak or nonexistent passwords on IoT devices like security cameras. As billions of devices become connected (including many with a tremendous impact on business processes or consumer safety), security must be a primary concern. But are we really doing enough to mitigate threats? Are we learning from these painful experiences? According to NXC Group, nearly half of companies with annual revenue above $2 billion estimate that the potential cost of a single IoT breach is more than $20 million.

Organizations have historically weighed the cost and time of focusing on security as a deterrent, citing time-to-market delays, design complexity, fragmented ecosystems and more. But as the world moves towards a trillion connected devices over the next 20 years, we as an industry must change our behaviors and agree that security is no longer optional or an afterthought. Users need to know and trust that their IoT devices are born secure, upgradable and managed end to end.

As an industry, we have a social responsibility to build secure devices and maintain a high level of security throughout their lifecycle. We need to make it easy for organizations to securely build, deploy, connect, manage and update devices. This is why Arm has worked with our partners to develop Platform Security Architecture (PSA), the first industry framework for building secure connected devices. PSA provides a common set of ground rules and resources to reduce the cost, time and risk associated with IoT security today, simplifying the security consideration process for device manufacturers, vendors and service providers.

Strong IoT security starts by empowering developers with the tools and system-on-chip (SoC) designs needed to build security into devices from the very beginning of design, without slowing time to market. Devices should incorporate multiple layers of protection, scaling from software to silicon implementations, including protection against physical attacks, which can occur when an attacker has direct contact with the device SoC or is in close proximity to it. Organizations should build technologies into separate partitions that mutually distrust each other, so that an attacker who breaks into one part of the system cannot compromise the rest.

However, even the highest-integrity devices built with the latest security protocols need updating as both devices and attack techniques evolve. Whether a smart meter or smart streetlight was installed in the field five months or five years ago, it will need to be updated as attackers become more sophisticated and identify new attack vectors over the device lifecycle. When we consider that the number of devices deployed could range anywhere from hundreds to millions, over-the-air (OTA) firmware updates become a critical requirement for mitigating new threats. Key considerations when designing in OTA firmware updates for secure devices include:

  • Space requirements for storing the newly received firmware upgrade;
  • Verifying firmware signatures before installing an update;
  • Ensuring enough bandwidth to support a firmware download; and
  • Determining, for battery-powered devices, how many firmware updates the available power budget can support.
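
The signature check in the second bullet is the step that most often goes wrong in practice. Below is a minimal sketch using Python's standard library, with an HMAC as a stand-in for the asymmetric signature (for example, ECDSA verified against a public key baked into the bootloader) that a production device would actually use; the key and image contents are invented for illustration:

```python
import hashlib
import hmac

# Hypothetical key provisioned at manufacture time. Production devices
# would verify an asymmetric signature against a bootloader-held public
# key instead of sharing a symmetric secret.
DEVICE_KEY = b"example-provisioning-key"

def verify_firmware(image: bytes, signature: bytes) -> bool:
    """Check the received firmware image before installing it."""
    expected = hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()
    # Constant-time comparison avoids leaking timing information.
    return hmac.compare_digest(expected, signature)

def has_space_for(image: bytes, free_flash_bytes: int) -> bool:
    """Space check: the new image must fit alongside the running one."""
    return len(image) <= free_flash_bytes

image = b"\x7fELF...new firmware..."
sig = hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()
assert verify_firmware(image, sig)
assert not verify_firmware(image + b"tampered", sig)
```

The point of the sketch is that a single flipped bit anywhere in the image invalidates the signature, so a corrupted or malicious download is rejected before it ever reaches flash.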

True end-to-end security requires a comprehensive IoT device management technology for protecting connected devices throughout their lifecycle once they are deployed. This includes securely provisioning the device once it’s turned on in the field, managing the updates over the air and securing the communication between the device and the data store. Secure communications should be encrypted and based on widely deployed and tested security protocols such as transport layer security (TLS) and datagram transport layer security (DTLS).
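
As a rough sketch of what TLS-protected device-to-cloud communication looks like in code, here is a minimal Python client; the hostname is hypothetical, and a real device would typically pin a CA certificate provisioned during manufacture rather than rely on the system trust store:

```python
import socket
import ssl

# Hypothetical endpoint for the device's data store.
HOST, PORT = "data-store.example.com", 443

# Default context verifies the server certificate and hostname;
# require at least TLS 1.2 to rule out legacy protocol versions.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

def send_reading(payload: bytes) -> None:
    """Open a TLS session and transmit one encrypted sensor reading."""
    with socket.create_connection((HOST, PORT)) as raw:
        with context.wrap_socket(raw, server_hostname=HOST) as tls:
            tls.sendall(payload)
```

For constrained devices sending telemetry over UDP, the same idea applies with DTLS, though Python's standard library does not provide DTLS out of the box.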

Security is undoubtedly complex and expensive. The industry works relentlessly to consider and try to protect vast attack surfaces, while hackers need to find just one vulnerability to undo all that hard work. It’s time for us to bring security efforts out of the shadows and make it a first-class citizen in our companies. By working together, we can build a safer future for everyone as we move towards a trillion connected devices being deployed over the next 20 years.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

May 8, 2018  3:54 PM

CBRS, LAA and haven’t we been here before?

Greg Najjar Profile: Greg Najjar
Internet of Things, iot, IOT Network, Microsoft, Regulations, White space, Wireless, Wireless connection

There is a cutting-edge technology aiming to increase blanket connectivity, solve for the last mile and provide a new approach to delivering high-quality connections. It is being tested in underserved markets, and technology leaders such as Google and Microsoft are investing heavily in it.

Today, the statements above would be about the CBRS and LAA bands, but five short years ago, they would have referred to white spaces, the unused frequencies between channels allocated to various broadcasting services. The utilization of white spaces hasn’t yet materialized in the way its backers expected, but there are lessons and takeaways that companies in today’s IoT ecosystem should draw from the once-popular “next big thing.”

The regulatory battle

One of the reasons that white space initiatives have been slow to materialize has to do with regulations. Many TV broadcasters were loath to give up spectrum they controlled, leaving the FCC with the challenge of creating usable swaths of available spectrum. Similarly, LAA has run into a variety of political and regulatory issues. For instance, the New York City mayor’s office has expressed concerns over use of the LAA band due to the city’s heavy investment in Wi-Fi, fearing that there might be interference.

This isn’t to say that CBRS or LAA will ultimately be regulated out of use, but given the limited growth of white spaces, it is something that companies planning to implement these technologies should be aware of.

The last last mile

White spaces were hyped as a solution for providing rural connectivity and closing the digital divide in the U.S. In fact, Microsoft is still a believer in this approach, planning to provide broadband connectivity to 2 million people in rural America by 2022. One of the biggest challenges to overcome for the last mile has to do with the actual infrastructure to support connectivity. This can be in the form of fiber that provides backhaul for signals, or the physical towers necessary to broadcast a frequency.

Today, we are seeing some of the same hurdles. Small cell technology, which can be used to create mesh networks for CBRS or LAA, is running into challenges across the U.S. as well. For IoT companies looking to stay ahead of competitors, it’s important to be aware of which bands are gaining access to the infrastructure necessary to maintain connections and which are not.

All of the above

The lack of hype around white spaces shouldn’t be considered as a death knell for usage of the bands, nor should it be a harbinger of bad news for the CBRS or LAA frequencies. Microsoft is still moving forward with white space initiatives, while Charter is rolling out a wireless network based on CBRS. The realities of blanket wireless connectivity, however, will ultimately be more nuanced and inclusive than some sort of “either/or” answer. In fact, the better answer might just be “all of the above.” With more than 20 billion devices expected to have some sort of connection by 2020, creating a network that can support all of them will require a combination of approaches. While CBRS and LAA will be important, white spaces may still have a significant role to play.



May 8, 2018  1:50 PM

IoT-enabled transportation infrastructure: A step toward a smarter future

Matt Caywood Profile: Matt Caywood
Commuting, Consumer IoT, Internet of Things, iot, Mobility, smart city, Smart transit, Sustainability, transportation

In recent years, the “smart city” buzz phrase has become more prominent than ever. What that means is up to interpretation, but the first place we have to start is with how we get around. Public transportation and, increasingly, private mobility companies are the lifeblood of a city — without them, it cannot grow.

Our global transformation into an IoT-enabled society has to begin with transportation, for two reasons. First, it will make the biggest impact on the day-to-day lives of average citizens. As we have seen in New York City, the recent crumbling of its subway system has had a daily negative impact on nearly 6 million people. Second, public transportation is the foundation upon which other innovations can thrive. Time spent waiting for a train or a bus is time that isn’t spent on other endeavors, slowing down creativity and wasting more than $60 billion every year.

The possibilities for how IoT can improve how people move around cities are nearly limitless; we like to describe its eventual form as “frictionless mobility.” Picture yourself biking to a metro station, leaving the bike directly outside in an open dock, walking down to the platform having paid in advance on your phone, and walking up to the approaching train. No time wasted, fully without inconvenience or frustration.

We may be closer to this reality than you think. More and more transit agencies across the world are developing real-time feeds for their vehicles, a movement spearheaded by Google, which developed what is now known as the General Transit Feed Specification, or GTFS. With a common data format now in place, it is easier than ever for this data to be harnessed in ways that help the average person.
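
To give a flavor of why a common format matters, here is a small sketch that reads a fragment shaped like a static GTFS stop_times.txt file; the field names follow the GTFS spec, but the trip and stop values are made up (real-time feeds additionally use GTFS-realtime, a protocol-buffer format):

```python
import csv
import io

# A fragment in the shape of GTFS stop_times.txt; values are invented.
stop_times_txt = """trip_id,arrival_time,departure_time,stop_id,stop_sequence
A123,08:00:00,08:00:30,STN_01,1
A123,08:06:00,08:06:30,STN_02,2
"""

def next_departures(feed, stop_id):
    """List departure times at one stop, in feed order."""
    reader = csv.DictReader(io.StringIO(feed))
    return [row["departure_time"] for row in reader
            if row["stop_id"] == stop_id]

print(next_departures(stop_times_txt, "STN_01"))  # ['08:00:30']
```

Because every agency publishing GTFS uses these same column names, the same few lines of parsing work against any of their feeds, which is exactly what makes applications like commuter displays practical.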

At TransitScreen, for example, we use this data to create live displays at the places where people make decisions about how they’re going to commute that day, whether in their apartment building or at the office. These displays include public transit systems, bike-share, car-share and ride-hailing services, so the viewer is able to make an informed choice about which form of mobility is best for each individual trip.

This is just one of many smart improvements that can be made in the transportation space to make our lives and our cities more efficient. These same sensors providing real-time locations could also be analyzed to determine if a particular route needs more or fewer buses. This could avoid the problem of “bus bunching,” which leads to slower travel and lower capacity overall.

Ameliorating these frustrations is essential not just to avoid headaches, but to help our planet itself. The transportation industry is responsible for 27% of greenhouse gas emissions in the United States, according to the EPA. Creating a better experience overall for people choosing sustainable transportation makes them more likely to take it again in the future. The fewer people who drive alone as a main form of transportation, the better off we will all be. Taking advantage of the IoT-enabled infrastructure already in place is the first step to a smarter future.



May 8, 2018  12:48 PM

The IoT POC: A rock and a hard place!

Rajesh Devnani Profile: Rajesh Devnani
Internet of Things, iot, Machine learning, partners, Pilot, poc, proof of concept, ROI, Startup, Startups

“You’ll never find your limits unless you’ve gone too far.” — Aron Ralston, Between a Rock and a Hard Place

The grand vision of industrial IoT is trillions of dollars in economic value and hundreds of billions in needed investment. That alone is enough to make end-user industry segments and technology providers alike salivate at the prospect of what the IoT revolution will unleash. With tech vendors pushing hard to drive IoT adoption and end users all too willing to indulge on the tech provider’s dime, nothing seems quite as right as beginning with an IoT proof of concept (POC). The end game thus becomes navigating past the righteous POC phase before entering the territory of IoT at production scale. The IoT POC becomes the gateway we must transcend on our way to IoT heaven.

There remains, however, a small glitch! The POC purgatory is holding us captive. It’s akin to a chasm that can’t be crossed. The POC approach is emblematic of the “look before you leap” psyche that typically pervades any big-ticket technology initiative. It’s an eerie reminder of the onset of the ERP revolution three decades ago, when the initial approach was to begin with just the financial accounting functionality (despite all the ERP talk about integration capabilities!). It was a while before the course correction was applied by implementing all core modules together. The floodgates eventually opened, and we then saw the mass adoption of ERP.

A survey by McKinsey revealed that 71% of IIoT initiatives are stuck in POC purgatory (41% in limbo at pilot stage and another 30% in startup phase). Another study by Cisco indicated that 60% of all IoT initiatives stall at the POC stage and that only 26% of companies have an IoT project that they classify as a complete success. While there are definitely signs of some green shoots, those scalable yet limited IoT initiatives are clearly not enough for the trillions in economic value to be realized over the next decade.

Here’s why isolated IoT POCs are a bad idea, along with key takeaways for avoiding these pitfalls.

Missing the forest for the trees: Often, POCs are structured to work hard at proving that the technology works. What this eventually leads to is that the plumbing works just fine, but the mains remain off. Focusing the POC purely on technology aspects, without a business-metrics yardstick, is a surefire recipe for disaster. It gets the technology folks’ juices flowing, but the initiative doesn’t win management buy-in, owing to the lack of an associated business case justifying the ROI.

Takeaway: Conduct a detailed diagnostic assessment to pick the right business use case which addresses key business metrics and delivers the right business impact.

Taking a scatter-gun approach: Very often, IoT initiatives are launched by the diktat of an enthusiastic but largely hands-off management team. The folks entrusted with driving the initiative, lacking any leadership guidance, end up spawning multiple POCs (often with different vendors!). Quality and focus are sacrificed at the altar of quantity and giving everyone a shot at the freebie POC.

Takeaway: Ensure a holistic approach with a concerted focus (you can’t possibly take four different tech stacks to the production phase). Pick complementary partners wisely after a thorough evaluation and co-create with them.

Inhibiting the law of accelerating returns: Ray Kurzweil’s law of accelerating returns states that while we humans are linear by nature, technology is exponential. Constraining an IoT initiative within the bounds of a narrowly defined POC holds it captive, clipping its wings and not allowing it to take flight, which goes against that law. The result is an outcome with limited overall appeal, since it focuses on a very tiny dimension, works off a limited sliver of data and is poor on meaningful insights.

Takeaway: Ensure that even the POC has a reasonable scale (functionality and complexity), uses adequate data and is supplemented with the right context to be representative of real-life scenarios.

Taking the road to nowhere: Quite often, there is an excessive focus placed on the POC without defining a clear roadmap on how the POC will transcend into a full-blown production rollout. Once the POC is over, the metrics measurement is hazy, the next steps are unclear, the major stakeholders are not aligned and the nuances needed to scale the POC have not been addressed. The POC stays stuck in a quagmire.

Takeaway: Define the roadmap from POC to production and ensure adequate socialization with key stakeholders who will ensure successful transition to production.

The (false) lure of machine learning nirvana: Crawl, walk and run — that’s how life works. Expecting machine learning to give instant gratification is a blunder. Machine learning projects take at least a few months of arduous efforts to collect and curate the data and tweak the algorithms to eke out the relevant insights. Even then there is no silver bullet, no single big-ticket item. It’s all the small analytic victories that add up over time, so expecting a single isolated POC to magically deliver insights is a myth.

Takeaway: Have realistic expectations of what machine learning can deliver and in what time frame. Ensure that the use case has enough fuel (i.e., data) to fire the machine learning algorithms.

In closing, although the industrial IoT revolution is for real, we are bound to witness these early false starts and growth pangs. Significant barriers related to leadership, skills and expertise, ROI quantification, security and implementation still need to be surmounted to realize the true promise of industrial IoT. As with other tech adoption, we will eventually cross the chasm to scale the industrial IoT initiative — and hopefully quite soon. Until then, we are stuck between a rock and a hard place!



May 7, 2018  3:43 PM

Blockchain could hold the keys to the IoT’s future

Jamie Bennett Profile: Jamie Bennett
Bitcoin, Blockchain, IIoT, Internet of Things, iot, IoT analytics, IoT data, iot security, Logistics, Supply chain

The internet of things is facing a dilemma. While connected devices are expected to dominate every aspect of our lives in the coming years — and already outnumber humans in terms of a basic headcount — belief in the ecosystem’s security is still lacking, and has a long way to go before businesses and individuals fully invest in its potential.

In a recent poll (note: registration required) of IoT professionals, 54% expressed concerns around current standards of IoT security. Take into account that 56% of these same professionals expect every single device around us to be voice-enabled within 10 years, and you have a tension that’s only set to grow.

Connecting technologies, not just devices

The answer may lie in the complete opposite of the traditional notion of centralized data — the decentralization of data. In other words, blockchain.

Many see blockchain as the tool to bring scalability and privacy to the IoT universe, and 68% in the recent poll expect integration between the two technologies within five years. It has all the signs of a successful marriage too, with geographically distributed IoT devices suited to the similarly decentralized peer-to-peer ledger.

Individual sectors are already seeing blockchain disruption — for example, the music business with the protection of artists’ intellectual rights, and financial institutions in the handling of payments and guaranteeing the integrity of transactions. It’s not far-fetched to see the synergies of multiple technologies as the next step in the innovation timeline.

The specific question for IoT, and those businesses that see such great potential in connected devices, is what type of blockchain can wield the greatest rewards.

The other blockchain

Most people will now be familiar with public blockchains through the stardom of Bitcoin. The democratization of payments has seen blockchain technology become part of mainstream conversations, but private ledgers are largely unexplored.

Private blockchains, which allow only a preselected group of people to maintain the integrity of the ledger, can empower businesses to be more intuitive in the way they manage IoT devices. Most businesses do not operate in silos and have to abide by certain regulations or bodies of authority. It’s not practical to reinvent every industry in the mold of Bitcoin, so permission-based ledgers may hold the key to scaling IoT sensibly while maintaining compliance with external structures.

Take agriculture, for example. IoT is already helping optimize supply chains and deliver smart logistics — tracking assets from a farmer’s field all the way to the shop floor through a web of connected sensors. A partnership with blockchain, however, can take this one step further, guaranteeing food safety or ensuring the correct farm is given credit by hosting information on a living ledger.

What’s more, farmers must abide by strict codes when it comes to what they can and cannot deliver to consumers, whether that be hygiene standards or conditions of livestock. Uploading a wealth of IoT data to private ledgers provides a form of cryptographic auditing, broken down into simple inputs and outputs.

The joy of blockchain technology is that in coding the sensory data of machines directly onto the ledger, validity can be guaranteed from the first entry onward. In short, IoT and blockchain are self-reinforcing.
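
That "validity from the first entry onward" property comes from hash chaining. Here is a toy sketch (not a real blockchain implementation; the farm telemetry is invented) showing how each block commits to its predecessor, so any later tampering is detectable on audit:

```python
import hashlib
import json

def _block_hash(data, prev_hash):
    """Hash a block's contents together with the previous block's hash."""
    blob = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(blob.encode()).hexdigest()

def add_block(chain, sensor_reading):
    """Append a sensor reading, linking it to the previous block."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"data": sensor_reading, "prev": prev_hash,
                  "hash": _block_hash(sensor_reading, prev_hash)})

def is_valid(chain):
    """Audit: recompute every link; any tampering breaks the chain."""
    for i, block in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else "0" * 64
        if block["prev"] != prev or block["hash"] != _block_hash(block["data"], prev):
            return False
    return True

chain = []
add_block(chain, {"farm": "F-042", "temp_c": 3.9})
add_block(chain, {"farm": "F-042", "temp_c": 4.1})
assert is_valid(chain)
chain[0]["data"]["temp_c"] = 99       # tamper with history
assert not is_valid(chain)
```

A real permissioned ledger adds consensus among the preselected participants on top of this structure; the chaining alone is what makes retroactive edits visible.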

Preparing for the machines

Optimizing current systems is one half of the story; preparing for the future is the other. Industrial IoT is revolutionizing how we manage supply chains and logistics. IoT-enabled devices allow for real-time, predictive maintenance of machines, with each additional connected device contributing to a smarter system. This, however, also raises the threat level, as each new connected device is another potential entry point for attackers.

Connected devices have already altered the way businesses face up to security. There’s currently no uniform or overarching standard, so it’s up to individual businesses to take the lead. For those without the resources to design platforms with security in mind from the outset, it can often be an arduous task staying in control. Now imagine when AI becomes commonplace within manufacturing and the rate at which robots exchange data becomes exponential. Private blockchains may just prevent future issues when interactions go beyond single-thread conversations and into multi-threaded ones between machines themselves.

Both IoT and blockchain technologies are here to stay. Their convergence is largely expected, and the potential for them to benefit businesses is immeasurable. On one hand, security concerns can be addressed by distributing information across the ledger; on the other, IoT and blockchain can work together to disrupt many of the processes we have come to accept.

Private blockchains allow businesses to pick and choose the most favorable features of decentralized ledgers while maintaining control over what is contributed to them. The result, if managed correctly, could change the very way we understand and utilize the internet of things.



May 7, 2018  1:06 PM

How driverless cars will interact with each other (and us)

Haden Kirkpatrick Profile: Haden Kirkpatrick
gps, Internet of Things, iot, IoT data, IoT sensors, self-driving car, V2V

Across America, the autonomous vehicle trend continues to accelerate forward. Driverless cars are hitting the streets of California, and several other states are rolling out regulations about autonomous vehicle testing. Competition is fierce as automakers and tech companies race against the clock to create the perfect machine. But in March 2018, a self-driving vehicle operated by Uber hit and killed an Arizona pedestrian. The incident is thought to be the first pedestrian fatality within the autonomous vehicle space. Unfortunately, it’s a grim reminder that driverless tech is still in its early stages and there is much work left to be done.

However, some numbers paint a positive outlook for a self-driving future. One report claims that autonomous vehicles (AVs) could decrease accidents in the U.S. by a whopping 90%, saving thousands of lives and up to $190 billion annually. Recently, Esurance explored the ways in which data from today’s smart cars will help influence the development of tomorrow’s self-driving technology.

Once this tech proliferates throughout the mass market, we should expect to see a decrease in the number of accidents. That is just a starting point for a much brighter and more efficient future, however. Along the way, it is important to understand how the next generation of AVs will interact with each other (and us) to keep drivers safe on the roads.

A quick overview of AVs

Autonomous vehicles need three things in order to function properly:

  1. An internal GPS system;
  2. A sensor system that recognizes complex road conditions; and
  3. A computing system that reads information from the previous two systems and transforms it into action.
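
The three systems above can be caricatured as a tiny control loop; every field name and threshold here is invented for illustration only:

```python
# Toy decision step: the computing system reads the GPS fix and the
# perception (sensor) frame and turns them into a driving action.
def decide(gps_fix, sensor_frame):
    if sensor_frame["obstacle_distance_m"] < 10:
        return "brake"                      # sensor system wins: safety first
    if gps_fix["distance_to_turn_m"] < 50:
        return "slow_for_turn"              # GPS/route information
    return "cruise"

gps_fix = {"lat": 37.77, "lon": -122.42, "distance_to_turn_m": 220}
sensor_frame = {"obstacle_distance_m": 6.2, "lane_offset_m": 0.1}
print(decide(gps_fix, sensor_frame))  # brake
```

Real AV stacks make this decision many times per second, fusing far richer inputs, but the division of labor between positioning, perception and planning is the same.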

AVs come equipped with all sorts of cutting-edge technology to help these systems work together — including cameras to see their surroundings, radar to allow for advanced sight (for example, to navigate through unfavorable weather conditions or in the dark) and laser sensors that can detect objects down to the millimeter. These features, along with incredibly powerful internal computers, are what get you from point A to point B in an AV.

How AVs communicate

Along with being aware of their surroundings, AVs must be able to “talk” with other vehicles. Vehicle-to-vehicle (V2V) communication lets cars share data with other nearby cars, including overall status and direction, such as braking status, steering wheel position, speed, route (from GPS and navigation systems) and other information like lane changes. This tells neighboring vehicles what’s happening around them so that they can better anticipate hazards that even a careful driver or the best sensor system would miss. This data can also help an AV “see through” another car or obstruction by sharing the same sensor data between vehicles. Soon, your car will be able to see past the vehicle in the left lane that is blocking your view as you try to turn right onto a busy street.
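
As a sketch, the kind of payload described above might look like the following; the field names are invented for illustration, whereas real deployments standardize on formats such as the SAE J2735 Basic Safety Message:

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical V2V broadcast payload mirroring the fields in the text.
@dataclass
class V2VMessage:
    vehicle_id: str
    speed_mps: float
    heading_deg: float
    braking: bool
    steering_angle_deg: float
    lane_change: bool

    def to_wire(self):
        """Serialize for broadcast to nearby vehicles."""
        return json.dumps(asdict(self)).encode()

msg = V2VMessage("veh-17", 13.4, 92.0, braking=True,
                 steering_angle_deg=-2.5, lane_change=False)
nearby = json.loads(msg.to_wire())       # as decoded by a receiving car
if nearby["braking"]:
    print("hazard ahead: begin slowing")
```

The receiving vehicle reacts to the broadcast rather than to what its own sensors can see, which is how V2V extends awareness beyond line of sight.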

There’s also vehicle-to-infrastructure (V2I) communication, which allows cars to understand and connect with various road infrastructure. This includes traffic lights, lane markings, road signs, construction zones and school zones. Imagine if your car could alert you of a traffic jam or a sharp curve long before you came into contact with it — that will be a reality with V2I technology. In a fully autonomous world, these data sets will combine to help your car find the safest, most efficient route to your destination in real time.

V2V or V2I: An interim solution?

Although driverless vehicles could one day rule the roads, we’re still several years away from this reality. The technology isn’t quite there yet, and consumers are not completely ready to take their hands off the wheel. However, car-to-car communication could provide a more immediate, positive effect on road safety. And government officials realize this: in 2017, the Federal Highway Administration released V2I guidance aimed at improving mobility and safety by accelerating the deployment of V2I communication systems. It’s certainly a step in the right direction for emerging technologies.

How self-driving cars can become a reality

A fully autonomous future isn’t out of the question, but it will require extensive collaboration from many different parties to see it through to fruition. Automakers and tech companies must build safe and reliable products that are virtually fail-proof before consumers can begin to trust the technology. City officials should consider smart road infrastructure to help vehicles better anticipate issues and communicate with each other. Local and federal policymakers need to create laws and regulations to protect our safety. Most importantly, these parties will need to gain the trust and confidence of the American public.



May 7, 2018  11:00 AM

Analytics as a key enabler for a sustainable enterprise IoT ecosystem

Jojith Rajan Profile: Jojith Rajan
Internet of Things, iot, IoT analytics, IoT data, Machine learning, Predictive Analytics, Predictive maintenance

Analytics and intelligence play a key role in the overall IoT system, where collected data is turned into information and augments business decisions with actionable insights. Data collected from IoT devices can help enterprises reduce maintenance costs, avoid equipment failure, improve business operations and perform targeted marketing. Synchronizing an enterprise IoT ecosystem with core engines, such as artificial intelligence, machine learning, predictive analytics and digital counterparts, will be the key to unlocking the potential of IoT at enterprise level.

Artificial intelligence
IoT is all about the data flowing between devices, gateways and central platforms. According to Gartner research, there will be 25 billion things connected to the internet by 2020. The amount of data generated by these things will be unimaginably huge, and finding useful information in this pile of data will be nothing less than finding a needle in a haystack. This is where artificial intelligence will play a big role, filtering that huge mass of data into intelligent, business-friendly insights.

One of the top use cases for the IoT and AI combination is security. Artificial intelligence can be used to learn the regular access patterns in any vulnerable environment, helping security control systems avoid security failures.

Machine learning
It’s often necessary to identify correlations between a large number of sensor inputs and external factors that are rapidly producing millions of data points. Given the frequency at which IoT devices generate data, a computing technique that can make the best use of this information becomes essential. Machine learning works on huge amounts of historical data to produce cognitive decisions, which makes the combination of IoT and machine learning a great enabler of business optimization.
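
Identifying correlations between sensor inputs and external factors can be sketched with a plain Pearson correlation; the vibration and bearing-temperature readings below are invented for illustration:

```python
import math

def pearson(xs, ys):
    """Correlation between a sensor stream and an external factor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

vibration = [0.2, 0.3, 0.5, 0.9, 1.4]     # hypothetical readings
bearing_temp = [40, 42, 47, 55, 66]
r = pearson(vibration, bearing_temp)
print(round(r, 3))
```

A strong correlation like this one is the kind of signal a machine learning pipeline would surface automatically across thousands of sensor pairs, rather than two hand-picked streams.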

Predictive analytics
Predictive analytics and maintenance can create huge impacts on business economics using IoT data. Predictive analytics facilitates automated consumable replenishment in the consumer segment. Using predictive analytics on IoT data, failures and downtime of the machinery in manufacturing facilities can be prevented, letting enterprises mitigate the damaging economics of unplanned downtime. Industry estimates suggest that predictive maintenance can reduce maintenance costs by 30-40%.
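
As an illustration of the idea, a simple least-squares trend fit over hypothetical vibration readings can forecast when a machine will cross a failure threshold, so maintenance is scheduled before the breakdown:

```python
# Invented vibration trend, as (day, mm/s) samples from one machine.
readings = [(0, 0.20), (1, 0.26), (2, 0.33), (3, 0.41), (4, 0.47)]
FAILURE_THRESHOLD = 0.80  # hypothetical vibration level at failure

# Ordinary least-squares fit of value vs. day.
n = len(readings)
sum_t = sum(t for t, _ in readings)
sum_v = sum(v for _, v in readings)
sum_tv = sum(t * v for t, v in readings)
sum_tt = sum(t * t for t, _ in readings)
slope = (n * sum_tv - sum_t * sum_v) / (n * sum_tt - sum_t ** 2)
intercept = (sum_v - slope * sum_t) / n

# Extrapolate to the day the trend crosses the failure threshold.
days_to_failure = (FAILURE_THRESHOLD - intercept) / slope
print(f"schedule maintenance in about {days_to_failure:.1f} days")
```

Production systems use far richer models, but the economics are the same: the forecast converts an unplanned outage into a planned service window.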

Digital counterparts/twins
Using digital replicas of physical entities and systems as part of an IoT architecture allows organizations to run simulations and adjust their ecosystems as and when needed. Digital twins can generate predictions and insights into the operations of machines, which are especially useful when tied into existing business processes. Triggering appropriate remedial business processes and workflows is one of the main purposes of digital twin projections.
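
A minimal sketch of the digital twin idea: a software replica that mirrors telemetry from a (hypothetical) pump and lets you simulate a setpoint change before applying it to the physical machine. The linear thermal model is invented purely for illustration:

```python
class PumpTwin:
    """Software replica of one physical pump."""

    def __init__(self, device_id):
        self.device_id = device_id
        self.state = {"rpm": 0, "temp_c": 20.0}

    def sync(self, telemetry):
        """Mirror the latest telemetry from the physical pump."""
        self.state.update(telemetry)

    def simulate_rpm(self, rpm):
        """Predict temperature at a new setpoint without applying it."""
        # Invented linear thermal model, for illustration only.
        return 20.0 + 0.01 * rpm

twin = PumpTwin("pump-07")
twin.sync({"rpm": 3000, "temp_c": 48.5})
predicted = twin.simulate_rpm(4500)
if predicted > 60.0:
    print("projection exceeds safe temperature; trigger remedial workflow")
```

The last branch is the "trigger appropriate remedial business processes" step: the twin's projection, not the physical machine, is what kicks off the workflow.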

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


May 3, 2018  2:58 PM

Diversity of data: The key to secure 3D face authentication

George Brostoff Profile: George Brostoff
3D, Authentication, Biometrics, Data Management, Facial recognition, IoT data

3D face recognition systems are poised to deliver fast, accurate and secure authentication on mobile devices and other applications. By pairing today’s tiny, accurate 3D cameras with powerful AI software, we are entering a new era in providing secure access to smartphones, laptops and tablets. While these two elements are key to the adoption of 3D face recognition for authentication, there is another aspect that, in my opinion, too often gets overlooked.

The fact of the matter is that to deliver fast and accurate 3D face authentication, you need to have a great deal of very diverse data. A huge repository with lots of similar data is not going to cut it.

Big is not always best

One of the major factors that too often gets overlooked when tech companies are developing biometric-based technologies — especially ones that use face recognition for authentication — is the fact that we are all now operating in a global business community. A quite telling article in the New York Times not long ago by Steve Lohr basically decried the current state of mobile face recognition. The reporter’s key takeaway on the present situation was that it works well if you are a middle-aged white male.

Given that the world is flat and continues to get flatter (great book by Thomas Friedman, btw), this is not an acceptable situation. People on the planet have faces of all different shapes, sizes and colors. Plus, tech research firm Gartner predicts that by 2021, 40% of smartphones will be equipped with 3D cameras. Add to this the fact that the number of mobile phone users was forecast to reach 4.77 billion in 2017 and to pass the five billion mark by 2019. The net impact of inconsistent face-based authentication could be significant, as people increasingly expect to access their devices easily using this approach.

Lots of diverse data to the rescue

Having worked in this space since 2006, I have to say that the inconsistent performance of both standard 2D and the new 3D face authentication is too often the result of simplistically defined or poorly curated databases. This is one of the reasons why face authentication has struggled to achieve mainstream acceptance.

Delivering accurate, consistent and fast 3D face recognition requires the factors I mentioned above: exploiting today’s tiny, accurate cameras and using advanced AI algorithms to capture, manage and rationalize the torrent of data generated. For perspective, today’s state-of-the-art 3D cameras capture 30,000 to 40,000 data points every time they scan a face — but that is not all.

Image source: WLFDB

Many companies claim that because they have a large database, their technology is going to be accurate. That claim is, frankly, inaccurate. More important than sheer size is the diversity of the database for the specific intended use cases. Millions don’t matter: a database can in fact be statistically significant with as few as 250 persons if the appropriate factors are captured.

In order to deliver an acceptable level of 3D accuracy, the data needs to:

  • Include people of different races and different genders, with and without glasses;
  • Represent people in different positions;
  • Be acquired over long periods of time;
  • Be captured with the same types of cameras used for recognition on various end user devices; and
  • Represent expected use case environments — for example, will people be only sitting in a well-lit office or might they be lounging on the beach in bright sunlight?
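
A data team can audit a collection against requirements like these before training. The sketch below is illustrative (the metadata fields and required values are made up); it reports which combinations of capture factors a data set still lacks:

```python
from itertools import product

def coverage_gaps(samples, factors):
    """List the combinations of capture factors missing from a data
    set's metadata. `samples` is a list of dicts; `factors` maps each
    factor name to the values the use case must cover."""
    seen = {tuple(s[f] for f in factors) for s in samples}
    return [combo for combo in product(*factors.values())
            if combo not in seen]

# Made-up metadata for three captures.
dataset = [
    {"glasses": True,  "lighting": "office"},
    {"glasses": False, "lighting": "office"},
    {"glasses": True,  "lighting": "sunlight"},
]
required = {"glasses": [True, False], "lighting": ["office", "sunlight"]}
print(coverage_gaps(dataset, required))  # → [(False, 'sunlight')]
```

An empty result means every required combination is represented; anything else is a blind spot the recognition model will inherit.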

My recommendation has always been to focus on quality and diversity of data rather than just the size of the database. Over the past 12 years, our team has been able to generate and use a database of millions of people. More importantly, it includes images of users of different races from all over the world acquired with cameras and in environmental conditions and poses representing how they are actually using the technology in real-world situations.

I encourage organizations interested in exploiting the power of 2D or 3D face recognition and authentication to be sure their proposed approach has access to a meaningful and diverse data set. It will ensure consistent results and, in turn, help drive overall customer satisfaction when deploying a face authentication system.



May 3, 2018  2:08 PM

Finding the IoT sales experts of the world

Francisco Maroto Profile: Francisco Maroto
IIoT, Industrial IoT, Internet of Things, iot, Sales, selling

After years of evangelization and waiting for the promises of the internet of things to come true, it seems that we are finally close to reaching the trough of disillusionment, the phase when we begin to forget all the hype generated so far and focus on reality — a harsh reality that involves selling IoT, not selling smoke anymore.

The time to sell IoT is now

The sale of IoT is perhaps more complex than the sale of other disruptive technologies, such as big data, cloud or AI, and maybe as complex as blockchain today. In the article “Welcome to the first ‘selling IoT’ master class,” I commented on how M2M and IT technology vendors should evolve to sell IoT. However, many of these companies still have difficulty training and finding good IoT sellers.

The truth is that nowadays it does not make any sense to sell IoT as a technology. Enterprise buyers only want to buy technologies that provide measurable business outcomes while, on the other side, many IoT vendors want to sell their portfolio of products and services that have been categorized under the umbrella of IoT either as quickly as possible or at the lowest possible cost.

During the last five years, I have been analyzing how IoT companies sell their products and services. Some of my customers, including startups, device vendors, telco operators, platform vendors, distributors, industry application providers and system integrators, asked me to create IoT sales material to train their sales teams on selling their IoT solutions and services. Sometimes I also help headhunters or customers search for IoT sales experts.

Based on this varied experience, I have launched “IoT sales workshops” to help companies train their internal teams to sell IoT. Here are some of the lessons I learned:

  • There is a time to act as an IoT sales generalist, and a time to act as an IoT specialist.
  • You need to adapt IoT storytelling based on your audience.
  • Being an IoT expert is not synonymous with being successful in selling IoT.
  • You need to show how companies can get more out of IoT by solving a specific business problem.
  • Make it easy for the customer to see the benefits of your IoT product or IoT service, as well as the value it adds.
  • Given the complexity and specialization of IoT by vertical, explain the need to focus more closely on business cases and IoT business models, as well as the ROI over three to four years before jumping into technology.
  • You need to be patient; IoT selling is not easy and takes time to align strategy and business needs with the IoT products and services you are selling.
  • Make building a strong ecosystem and adopting end-to-end IoT systems easier by collaborating with partners.
  • Train IoT business and technical experts to get better at telling stories. Design a new marketing and sales communications playbook. Keep it simple. Build your narrative from the foundation up — one idea at a time.
  • If you want an IoT sales expert, you need to pay for one — don’t expect miracles from external sales agents working on commission.
  • IoT sales is a full-time job. You will not have time for other enterprise activities.
  • Selling IoT to large enterprises takes teamwork.
  • Be persistent. Do not expect big deals soon.
  • Be passionate, be ambitious and be disruptive to sell IoT.

Summary

I do not consider myself an IoT sales expert, nor a superman of sales. In fact, I have shied away from classifying myself as a salesperson, even though over time I have given weight and value to this work that once seemed derogatory to me.

Selling IoT is not easy. In a few years, we will forget about IoT and be selling newly hyped technologies. But in the meantime, you need to be prepared for disillusionment: long sales cycles and a lot of work with sometimes poor results. Who knows? Maybe by 2020, if you persevere, you will be recognized as the “best IoT sales expert” and will finally earn a lot of money.

Just remember: Be persistent, be passionate, be ambitious and be disruptive to sell IoT.

Thanks for your likes and shares.



May 3, 2018  10:37 AM

History-making technology: How IIoT can transform America’s trucking industry

Rob Bradenham Profile: Rob Bradenham
Fleet Management, Industrial IoT, Internet of Things, iot, IoT analytics, IoT data, Predictive maintenance, Service providers, SRM, Supply chain

The history of technology in the trucking industry has had many watershed moments. The advent of diesel engines, which had been mainly used in large stationary operations, brought a large increase in hauling power and fuel efficiency. In the 1970s, electronics began revolutionizing how those engines were controlled, again boosting their efficiency.

Today, this $700 billion industry is once more on the cusp of a technological revolution. This time it’s the industrial internet of things that is poised to drive dramatic improvement in asset performance and utilization.

One of the biggest promises of IIoT is predictive maintenance. But IIoT success requires more than just implementing technology like sensors and telematics — it requires that the technology be integrated into a connected process that aggregates, organizes, filters and prioritizes data into a usable format. There is also significant value in IIoT beyond predictive maintenance, ranging from improving the service process to maximizing the lifetime value of an asset and optimizing purchase and disposition decisions based on real data.

Connecting assets is as easy as installing vehicle telematics systems that report the condition and operation of the asset. But tying that data to the management of the asset means connecting assets, service locations, fleet management systems, diagnostic tools, OEM data, call centers and remote service providers. When all the related parties in the service supply chain can share information and collaborate, the result is dramatic gains in asset performance and uptime. Collectively using the data from vehicles to inform and enable an improved approach to service management ensures that the anticipated value of IIoT will be realized.

Today’s IIoT reality: Disconnected data, little service value

The reality for many fleets is that commercial vehicle service and repair operations are still hampered by disconnections. Management information systems are siloed and unable to communicate with each other. Stakeholders, including fleets, service providers and OEMs, are unconnected and forced to make do with data that is incomplete, incorrect, inconsistent, unstructured or out of date. This is the challenging reality of early IIoT implementations for many organizations — a flood of unusable data that hampers the service process rather than speeds it up.

How can fleets, dealers, service providers and OEMs enable IIoT to live up to its potential? They can start by fixing the disjointed ecosystem that currently exists and improving the information sharing and collaboration that is essential to enhancing efficiencies and uptime. This requires thinking about IIoT in the context of a service relationship management (SRM) tool. SRM links all the stakeholders involved in service management, ensuring the right people get the right information, in the right place and at the right time.

Realizing IIoT’s promise: Lower downtime, optimized service processes

There already is evidence that IIoT adoption can positively impact asset performance and service and repair operations. The diagnostic and analytic capabilities enabled by IIoT provide great intelligence to service teams when they are available at the point of service. In commercial vehicles, remote diagnostics allow for data collection, both as an issue is reported as well as prior to the issue occurring. This leads to greater accuracy and the ability to provide proactive support. Connected vehicle data also provides for more effective root-cause analysis and enables more robust case-based reasoning, helping address downtime, fuel efficiency, maintenance costs and service optimization issues.

One commercial truck manufacturer already equips all its vehicles with remote diagnostics that provide around-the-clock monitoring and detailed analyses of critical fault codes, enabling identification of emergent issues for action, improvements in planning for repairs, and streamlined diagnostic and service procedures. On top of this, the manufacturer has been investing time and resources into building a consistent, measurable process to improve uptime for its customers and dealers through a combination of IIoT deployment and SRM implementation — and has integrated its telematics and analytics tools into the SRM platform. The results speak for themselves: Service providers in the program have seen reductions in both diagnostic time and service event downtime (36% and 29% less, respectively).

SRM as the on-ramp to IIoT promise

A connected service process allows everyone involved in service events to effectively communicate and collaborate. Assets return to service more quickly, increasing profitability.

Incorporating asset data in an SRM platform can provide real-time transparency and visibility and enable connected service technology that maximizes uptime and reduces costs across an entire service ecosystem. Further, when advanced data analytics are layered on top of high-quality, consistent data delivered by IIoT, it can provide insights that inform service process improvement for stakeholders across entire service supply chains.

An SRM tool unifies those supply chains to improve uptime, ensure consistent network-wide service delivery, create and strengthen customer relationships, reduce warranty and support costs, and lower goodwill expenses. Using an SRM platform addresses the four C’s of effective service event management:

  1. Connectivity that facilitates seamless data flow between assets, service points, OEMs and fleets;
  2. Communication that enables contextual information sharing and collaboration at the point of service;
  3. Controls that provide tools to reduce risk, increase efficiency and improve decision-making; and
  4. Consistency across service networks and repair processes, including shared service histories for real-time decision support and post-event reporting that drives accountability and process improvement.
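
A rough sketch of how one shared record might tie the four C’s together follows; the class, field names and fault code are illustrative, not drawn from any real SRM product:

```python
from dataclasses import dataclass, field

@dataclass
class ServiceEvent:
    """Hypothetical SRM-style record for one service event, shared by
    every stakeholder instead of living in one party's silo."""
    vin: str
    fault_code: str
    history: list = field(default_factory=list)

    def notify(self, stakeholders):
        # Communication: the same contextual record goes to each party.
        for party in stakeholders:
            self.history.append(f"notified {party}")

    def record(self, step):
        # Consistency: one shared service history for real-time decision
        # support and post-event reporting.
        self.history.append(step)

# Connectivity: a telematics fault arrives and a single record fans out.
event = ServiceEvent(vin="TRUCK-0017", fault_code="P0520")
event.notify(["fleet", "dealer", "OEM call center"])
# Controls: the fleet approves the repair before work begins.
event.record("repair authorized")
print(event.history)
```

The point is not the code but the shape: one event, one history, visible to every stakeholder at the point of service.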

The commercial trucking industry is a critical part of the U.S. economy — the American Trucking Association estimates it moves about 71% of the nation’s freight by weight. Using IIoT to improve its efficiency promises to spread beneficial effects throughout supply chains for industries as diverse as food production, manufacturing and retailing — all while improving the bottom line and building strong customer relationships.


