As far back as 2015, in one of the earliest macro-assessments of IoT, the consultancy McKinsey quantified the market opportunity in nine categories. They ranged from consumer homes to wide open spaces and included home automation, factories, personal monitoring (health and fitness) and smart city applications. Each category, of course, can contain hundreds or thousands of individual applications.
When the IoT market began to take shape, the initial focus of solution providers was on connectivity. There is a raft of market studies on the billions of potentially new connected devices for technology providers and connectivity service businesses to target. However, the work involved in connecting physical and virtual things to the internet is just a start. An increasingly important objective is to handle the data that comes from connected assets. That, in turn, means finding ways to connect, manage, analyze and, eventually, share data among different organizations.
In addition to the novelty of large-scale and distributed data management, the characteristics of IoT as a heterogeneous system of systems introduce an additional layer of complexity. Industry attention is now shifting from connectivity to new approaches to understand and interpret data. Heterogeneity brings challenges of its own, chief among them the task of making data interoperable. This may be across application silos, between partners in a supply chain or between vendors of interchangeable devices and sensors.
The value of data models
A good data model can solve these issues. In other words, IoT data modeling offers an approach which could more efficiently describe, interpret, analyze and share data among heterogeneous IoT applications and devices.
Fortunately, several data models already exist. Many of them have been developed by different standards development organizations, and some are for specific IoT vertical applications or domains. For example, the Smart Appliances REFerence (SAREF) ontology provides a shared model for home appliances. Data models from the Open Geospatial Consortium serve the geosciences and environmental domains. The Open Connectivity Foundation specifies data models based on vertical industries such as automotive, healthcare, industrial and the smart home. The World Wide Web Consortium (W3C) Thing Description provides vocabularies to describe physical things, but places less focus on data.
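To make this concrete, here is a simplified, illustrative sketch of what such a shared vocabulary looks like in practice: a W3C WoT Thing Description for a hypothetical temperature sensor, built as a Python dictionary. A complete description would also carry security definitions and the protocol "forms" that tell a consumer how to reach each affordance; those are omitted here.

```python
import json

# A simplified, illustrative W3C Thing Description for a hypothetical
# temperature sensor. Field names follow the W3C WoT TD vocabulary, but a
# complete description would also include security metadata and protocol
# bindings for each affordance.
thing_description = {
    "@context": "https://www.w3.org/2019/wot/td/v1",
    "title": "RoomTemperatureSensor",
    "properties": {
        "temperature": {
            "type": "number",
            "unit": "celsius",
            "readOnly": True,
        }
    },
}

# Serialize the model for exchange between heterogeneous applications
serialized = json.dumps(thing_description, indent=2)
```

Because the vocabulary is shared, any consuming application can discover the sensor's properties and units without prior knowledge of the device vendor.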
In contrast, Schema.org operates as a collaborative community and it aims to provide more general and broad vocabularies and schemas for structured data on the internet. A collaborative approach to integrate and unify various data models is critical and necessary to work across IoT application and organizational boundaries. It requires cooperative efforts among different industry bodies.
OneM2M’s role in data model standardization
oneM2M, the global standardization body, was established to create a body of maximally reusable standards that enable IoT applications across different verticals. It focuses on common services, such as device management and security, that are needed in all IoT applications. The oneM2M standard takes the role of a horizontal IoT service layer between applications and the underlying connectivity networks. In doing so, it masks technology complexity for application developers and hardware providers.
Amongst its various standardization efforts, oneM2M takes a collaborative approach in developing its data model-related specifications. For example, one of its technical specifications, TS-0005 for management data model, is the result of a collaboration between oneM2M and the Open Mobile Alliance.
Another specification, TS-0023, laid the groundwork for a Smart Devices Template that was first applied to create a Home Appliances Information Model and Mapping. In the next release of oneM2M, Release 4, the underlying data model principles will be extended to support information modeling and mapping for vertical industries, such as smart cities.
New ideas for IoT data models
The focus on frameworks to manage data models across application verticals and domains sets the stage for the next phase of the IoT industry's growth. This topic will be on the agenda at the forthcoming Internet Engineering Task Force (IETF) 104 Meeting on March 23-29, 2019 in Prague, Czech Republic. Two IETF groups, the Constrained RESTful Environments (CoRE) working group and the Thing-to-Thing Research Group (T2TRG), will explore new ideas around IoT data models as they affect IoT application layer issues.
As one of oneM2M's representatives at the IETF meeting, I will be interested to hear about the latest progress of the CoRE and T2TRG activities. The meeting will also be an opportunity to explore new developments and possible collaboration opportunities around the use of data models for IoT management and security.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.
I had the opportunity to take an active role in this year's discussions at the World Economic Forum's (WEF) annual meeting at Davos. Having been selected as a WEF Technology Pioneer, I found it both exciting and validating to see the themes we evangelize come into focus with live debates and energetic conversations in the snowy, sunny Swiss Alps.
Consumer diversity in 2019
A lot of the discussion at Davos this year focused on the diversity of the environments in which consumer product brands and manufacturers are operating — for instance, balancing the need to respond to Gen Z’s values-driven purchasing behavior in contrast to the experiential needs of aging populations in developed markets. Or the tapestry of data management regulations across the world — Europe’s GDPR in one corner, China’s regime in another, and the relatively loose framework in the U.S. in another.
The fundamental message is we now live in a world where:
- consumers expect localized and personalized experiences;
- the effectiveness of data integration and data flow has to navigate national rules; and
- the competence of instrumentation and data management has moved from being a cost-center issue to being a value-driven determinant of competitive advantage.
Global impact of IoT
Specifically in 2019, we are seeing a couple of key drivers influence the global impact of IoT in the consumer products industry:
- consumer demographics and behavior; and
- digital-driven competition.
With this in mind, following are five trends to watch in 2019:
1. Transparency of product provenance and new models of localized and personalized retail and service delivery.
Meeting the divergent needs of Gen Z and Gen X is challenging the consumer products industry. Gen Z consumers are making values-based purchasing decisions, demanding transparency from brands. At the same time, Gen X and the aging population of retiring Baby Boomers are demanding ever more convenience. Brands that can deliver both transparency and personalized consumer experiences are going to rise to the top.
2. Increasing rules and competencies to gather and make use of real-time product data on a global scale — while protecting consumer trust.
When it comes to protecting and managing consumer data, global brands are faced with a tapestry of rules and regulations. For instance, Europe led the way in personal data protection and privacy with the GDPR, which is great if a brand is only operating in Europe. But if a brand is global, it not only has to worry about the GDPR in Europe, but also about a completely different set of rules in China, somewhat laissez-faire rules in the U.S. and so on. Despite the global differences, what is clear is that regulations to protect consumer data and trust are inevitable. There simply isn't going to be one consistent rule set. Global brands must navigate different rules in different geographies.
3. Real application of artificial intelligence to production and delivery processes.
We are increasingly seeing consumer product brands apply artificial intelligence to production and delivery processes to ensure supply chain integrity and to enhance speed to delivery. Not only are machine learning and AI transforming today’s supply chains and manufacturing plants, but the technology is being applied at scale.
For instance, machine learning is helping brands detect counterfeit products and protect against parallel trade at scale. Machine learning is also being applied in manufacturing plants to optimize the sourcing of raw materials and to predict movement in inventory.
We will see continued adoption of AI and machine learning as the technology becomes more accessible. This is important because the competitive advantage is exponential. There is a real sense that the countries, companies and platforms driving AI have a disproportionate amount of power in the emerging world.
4. Scaling up of new operating models for manufacturing to compete with agile and dynamic value chains.
At Davos, there was a lot of talk about the need for supply chains to operate in a much more dynamic fashion to meet real-time consumer demand. Not only do consumers expect real-time delivery of products and services, but they also want them customized to meet individual needs. The traditional model of high-scale manufacturing, churning out gazillions of the same widget, is no longer applicable. Manufacturing needs to adopt new models to close the gap between design and delivery in an era of instant gratification and mass customization.
5. Urgent and necessary high-impact response to sustainability and, specifically, to climate change.
We all need to be taking steps in a sustainable direction. Working together as a global economy to build consumer awareness and mandate supply chain transparency is the only way we will achieve a truly circular economy.
The good news is that brands are investing downstream in their sourcing, manufacturing and distribution processes to improve their capacity for sustainability. The challenge is making this information available to consumers. With strong consumer awareness, sustainability within the supply chain becomes a competitive asset for brands, strengthening brand trust and authenticity for consumers and winning market share. Today, consumers prioritize brands that promote sustainability as they seek ways to make an impact with their personal behavior. See my WEF blog post touching on this theme.
The future of IoT for consumer product brands
For my company’s mission to organize the world’s product data, it’s an encouraging picture. There are a lot of issues to work through for digital transformation in the consumer product industry to yield its full value potential, but the framework is coming into clear focus in 2019.
The ability to have traceability of every product item through the supply chain — into the hands of the consumer and beyond — is table stakes. Direct connection with the consumer is the only way to compete in an experience-driven consumer world. The fidelity and real-time application of data gathered during the product’s supply chain journey is the keystone to competitive advantage for consumer product brands — for efficiency, integrity protection, transparency, compliance and sustainability.
In the months ahead I’m looking forward to sharing more thoughts on the future of IoT for consumer product brands including:
- its impact on sustainability in manufacturing;
- blockchain in the supply chain; and
- consumer trends where these innovations are being applied.
Working with senior leadership teams across multiple market segments over the past six months, such as automotive, energy, transportation, manufacturing and government, has been nothing short of amazing. It is easy to see why the global industrial IoT market is expected to reach a value of $922.62 billion by 2025, according to a recent Million Insights report. This growth is due to the worldwide rise in IoT technology development and implementation in the past few years.
I have been able to personally witness some of this explosive growth of software development and industrial IoT development over the past six months at Deasil Cognitive. The core team has been together for over 20 years and has extensive experience in collecting, moving, buffering, queueing and processing data, and in building frameworks around bleeding-edge technologies in artificial intelligence, machine learning and blockchain. As a result, the team is able to rapidly develop applications, quickly integrate new sensors and data acquisition devices, and implement game-changing systems for these market segments.
I have watched the Deasil team significantly improve productivity and ROI across multiple client projects using Kubernetes, Docker, Golang, Cassandra, Kafka and Elastic, to name a few. The team is developing highly productive, stable, clean and fast applications, and the results are beautiful and innovative, including IoT management systems, IoT implementations, mobile applications, business intelligence and data management platforms.
Some of the projects include figuring out how to use the massive amounts of data being collected on vehicles to provide more personalized services to drivers. These types of systems are designed to give customers a more personalized experience while also driving customer loyalty and new revenue streams for the automotive industry. Applications for the energy sector have included streaming data from numerous sources to provide insights on where to allocate funds against potentially catastrophic risks, reducing product loss, property damage and even environmental harm.
As an authorized government software development partner, we have been able to identify ways to help the Department of Energy analyze synchrophasor data. Synchrophasors are time-synchronized measurements that represent both the magnitude and phase angle of the sine waves found in electricity. They are taken by high-speed monitors called phasor measurement units, or PMUs, which are 100 times faster than SCADA. These types of applications will help greatly mitigate risks, improve quality of service and reduce unforeseen downtime for utility companies.
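Conceptually, the phasor a PMU reports can be sketched as a single-bin discrete Fourier transform: correlate the sampled waveform with a complex exponential at the grid frequency to recover magnitude and phase angle. The sampling rate, amplitude and phase below are illustrative values, not parameters of any particular PMU.

```python
import cmath
import math

def estimate_phasor(samples, f, fs):
    """Single-bin DFT: correlate the samples with a complex exponential
    at frequency f (Hz) to recover the sinusoid's magnitude and phase."""
    n = len(samples)
    acc = sum(x * cmath.exp(-2j * math.pi * f * k / fs)
              for k, x in enumerate(samples))
    phasor = 2 * acc / n  # scale so |phasor| equals the peak amplitude
    return abs(phasor), cmath.phase(phasor)

# One second of a 60 Hz waveform with amplitude 120 and a 30-degree
# phase angle, sampled at 1.92 kHz (an illustrative sampling rate)
fs, f = 1920, 60
amp, phase = 120.0, math.radians(30)
wave = [amp * math.cos(2 * math.pi * f * k / fs + phase) for k in range(fs)]
mag, ang = estimate_phasor(wave, f, fs)
```

Because the window covers an integer number of cycles, the estimate recovers the amplitude and phase almost exactly; real PMUs add GPS time-stamping so phasors from different substations can be compared on a common time base.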
Industrial IoT and software development technologies don't always have to be so complicated, though. Figuring out ways to help companies digitize data that is being collected across the globe — where there are many mistakes and costly inefficiencies — is a great first step in helping companies begin their industrial IoT journey. Just by digitizing the data, customers can begin to see the real benefit of reducing waste. The second step is where the magic happens. This is where they can begin to easily understand whether vendors are meeting delivery deadlines for the thousands of components used in the field, or whether the company is using its buying power across its many projects and vendors to get the best pricing. Data automation platforms, like Acuity, help companies greatly improve operations and customer experience, with on-time delivery of projects, and notably improve their profits. Combining this kind of application with live operations is nirvana, especially where early warning signals can be tied to vendors to, for example, drop-ship replacement parts, engines or pumps to reduce downtime.
What do I see coming down the pipe for industrial IoT and blockchain? For one, we are merging machine learning with blockchain. Machine learning relies on vast quantities of data to build models for accurate prediction. A great deal of the overhead incurred in securing this data lies in collecting, organizing and auditing it for accuracy. This is an area that can be significantly improved using blockchain. By using smart contracts and digital signatures, data can be transferred directly and reliably from its place of origin, improving the whole process.
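As a minimal sketch of that idea, using only Python's standard library: each sensor record is signed at its origin and chained to the hash of the previous record, so an auditor can detect any later tampering before the data feeds a model. The secret key and readings are hypothetical placeholders; a production system would use per-device asymmetric keys and an actual ledger rather than an in-memory list.

```python
import hashlib
import hmac
import json

SECRET = b"device-provisioned-key"  # hypothetical per-device signing key

def sign_record(record, prev_hash):
    """Sign a sensor record at its point of origin, chaining it to the
    hash of the previous record so later tampering is detectable."""
    payload = json.dumps(record, sort_keys=True).encode() + prev_hash
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return {"record": record, "sig": sig}, hashlib.sha256(payload).digest()

def verify_chain(chain):
    """Re-derive every signature in order; any altered record breaks it."""
    prev = b"\x00" * 32
    for entry in chain:
        payload = json.dumps(entry["record"], sort_keys=True).encode() + prev
        expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, entry["sig"]):
            return False
        prev = hashlib.sha256(payload).digest()
    return True

# Sign a couple of illustrative readings as they leave the device
chain, prev = [], b"\x00" * 32
for reading in [{"t": 0, "temp": 21.5}, {"t": 1, "temp": 21.7}]:
    entry, prev = sign_record(reading, prev)
    chain.append(entry)
```

The auditing step then collapses to a single verification pass instead of a manual reconciliation of records against their source.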
I am looking forward to sharing more soon.
As technology vendors, cloud service providers, analytics firms, network equipment makers, industrial equipment OEMs and telcos compete for leadership in IoT markets, the next few years will be critical. Bain believes the market for IoT hardware, software, systems integration, and data and telecom services could grow to $520 billion in 2021, more than double the $235 billion spent in 2017.
To reach that scale, however, market leaders will need to continue to make gains while expanding their industry-specific offers. Incumbents across categories who cannot move quickly enough to address customers’ needs are likely to get leapfrogged by more nimble competitors. Device makers, in particular, run the risk of seeing software and analytics competitors capture the value of solutions, leaving them to deliver lower-profitability hardware components.
The right actions vary, of course, from one company’s situation to another. But three themes are nearly universal for IoT vendors.
Focus on getting a few industries right. Customizing use cases for each industry and packaging systems intelligently are emerging as keys for success. Leading vendors are targeting their technologies on fewer industries than before — a welcome change that will allow them to deliver offerings better suited to customers’ needs. They should continue to narrow their focus: More than 80% of vendors still target four to six industries — too many to build depth rapidly (see Figure 1). Focusing on two or three verticals allows vendors to gather significant industry expertise, which can help them maintain a competitive edge against the more generic offers by cloud service providers.
Develop end-to-end systems. As vendors gain experience implementing IoT systems in specific industries, they can develop cost-effective end-to-end packages with partners — something that buyers have been clamoring for. Many IoT deployments require customization, usually based on the industry. More than 60% of customers say the technologies they buy are at least 25% customized. When vendors explore the particular use cases of an industry, they learn about the different data sets required, the sensors measuring it and how to process it to glean valuable insights. From this, they discover what’s transferrable to the next customer. They can then create standard packages, reducing customization requirements, shortening sales cycles and increasing the likelihood of success.
Several companies are showing how to adapt technology to markets in this way through partnerships and acquisitions. IBM Watson, for example, develops proofs of concept with customers and applies those lessons to develop industry playbooks that include multiple use cases. IBM Watson’s concept work with Samsung, for example, informed later work with elevator company Kone and French railroad operator SNCF. Verizon chose acquisitions as a way to gain deep expertise in telematics, buying Hughes Telematics, Fleetmatics, Telogis and Movildata to boost its fleet management.
Prepare to scale by removing barriers to adoption. Customers and vendors alike believe that progress has been slower than they had expected in overcoming the main barriers to adoption (see the Bain Brief “How providers can succeed in the internet of things”). However, vendors have more experience with operational considerations today than they had two years ago, and customers have a better understanding of the necessary investments. Customers also have more realistic expectations about returns.
Leading vendors will translate that experience into repeatable playbooks that address their customers’ concerns about security, integration and returns on investment. Understanding these pain points is the first step; addressing them and baking them into end-to-end offerings will position technology providers to deliver cost-effective IoT systems that can scale.
Welcome to the final post in this three-part series where we’ve been looking at the disconnect between developers and car manufacturers, and the roles played by each group. In this post, I’ll discuss the challenges still left in this delicately balanced ecosystem and the future of connected vehicles as a whole.
Despite improvements in the relationship between carmakers and the development community, there are still a number of challenges ahead if the connected car industry is to flourish and meet customer demands on services.
Manufacturers remain cautious toward the open app ecosystem
Despite great advances being made in the areas of data security and user privacy, a major challenge that persists comes from the automakers themselves. Essentially, manufacturers still have privacy concerns about an open app ecosystem. This fear stems from historical security breaches in which malicious apps had access to sensitive data. These kinds of breaches damage the carmaker's brand and, in turn, erode customer trust, which takes considerable time to rebuild. Approaching an open app ecosystem with caution is certainly advisable; however, this approach can delay the process of bridging the gap between developers and carmakers.
But it’s not just malware or security breaches that carmakers are afraid of. A desire to maintain apps and services of a consistently high quality is also a strong motivator. Manufacturers need their apps to be a true reflection of their brand and to match the existing expectations of their customers. For these reasons, the carmakers themselves like to retain a great deal of power over app development for their connected vehicles.
Finally, and understandably, it is no quick process gaining trust in such a long-established and traditional industry. This is not necessarily a bad thing; it’s simply the way it is. Developers need to ensure they work closely with carmakers to protect customers and ensure that their apps are not only brand-compatible, but also of the highest quality.
Payment models for services need more deployment
In many ways, apps created for connected cars are hitting the sweet spot of the market: tech-addicted millennials who are buying their first cars. These young people won’t be satisfied with a run-of-the-mill vehicle that simply takes them from A to B; they will be demanding a car that provides them the entire connected package and a driving experience that connects seamlessly to the rest of their online lives. For many of these, driver apps are seen as the obvious way to receive, send and share content such as music, navigation, news and recommendations — exactly the sorts of content and services apps within a connected car could offer on the move.
If they're willing to build the right apps to target this particular group of young people — a group typically renowned for having disposable income, as well as the time to spend it — app developers with their ear to the ground could build long-lasting and profitable businesses. In fact, a 2014 Deloitte study found that 52% of millennials — those born roughly from 1984 to 1995 — say they want apps on their dashboard.
However, despite the huge potential from this still largely untapped corner of the market, for many developers and third-party services it is still unclear how a connected car app or service charges its customers and turns a profit, not to mention what kind of payment models a third-party service might be able to expect from a carmaker in the first place. With these financial arrangements still largely unclear to everyday app developers, it’s no wonder that some are put off engaging in connected car app development and sticking with smartphone applications — which are, in comparison, relatively easy to monetize — instead.
Another financial reason that may be holding back regular app developers from getting involved in connected car app development is the length of development and testing time. These processes can be significantly longer for connected car apps than for smartphone apps, and developers working on a budget simply don't have the time to invest.
Carmakers need to make transition smooth for developers
In a more general sense, when we think of how to connect the worlds of development and carmakers, carmakers need to think practically about what they can bring to the table in terms of technology. Discoveries and innovation from third parties which lead to new products and services can only be made if there are ample software tools available for developers, as well as those crucial open APIs.
These resources come directly from individual car manufacturers and their availability is, ultimately, what will dictate the success or failure of their individual connected car enterprise. The more open APIs that are accessible to developers, the more varied services they can offer their customers. The more carmakers that realize this and respond to it with the right tools and software, the quicker we will see changes in connected car app development software.
Conclusion: Bridging the gap
Although we have come a long way in the last few years, there is still some way to go before we have fully bridged the gap between carmakers and third-party services and developers. Despite huge advances in technology, in order to bring these two parties together there needs to be more than simply the technical environment. There has to be a strong desire on both sides for collaboration and compromise, with initiatives coming from developers and carmakers, as well as incentives for all involved. For optimum collaborations and for companies to invest the time to start working with car data, there has to be minimal friction.
As we have seen, data privacy and security is still a huge concern for carmakers. Gaining trust of carmakers and consumers by proving the technology is fully secure will make it easier going forward to collaborate. This is already happening and the case for an open ecosystem is getting stronger by the day.
As things currently stand, the future for connected cars and the services that work in them is looking extremely positive. With the additional support of accelerators, incubators and hackathons, small businesses are getting the support — financially and in terms of technology — that they need to build and test their applications, plus the exposure to major car companies to get their projects off the ground and into cars. Increasing customer expectations are also putting pressure on the carmakers to shorten development and testing time, which again prompts them to promote and support connected car app development.
Imagine a day when a driverless car is roaming the road with happy and contented passengers enjoying the seamless integration of entertainment, music, road safety, live traffic information and updates, and connectivity to other vehicles. It sounds too good to be true, but this once seemingly unrealistic vision of yesterday is fast becoming the reality of tomorrow.
In an era when connecting people via digital and social media has proved to be a boon, who would not be amazed if their cars were connected to them as well as to fellow vehicles on the road? Imagine a safe ride without the usual hassles of travel, like getting stuck for hours in traffic and arriving late to the office, because the car sends a health alert before the chaos actually happens. Motorists will be more than enthralled to incorporate this technical revolution into their vehicles.
Automotive giants across the globe are continuously striving to provide the above-mentioned amenities at affordable prices. An application like IoT in the automotive industry is going to have its own plan or course of action, a scheme to be implemented at consumer and commercial levels. But the basic attributes of these new ideas address the seven most eye-catching consumer needs, which the automobile industry should bank on.
1. Safety at its best
Whenever a car embarks on a drive, the first wish of travelers on board is safety. Connected cars using IoT technology will be intelligent enough to detect probable geographical spots of collision by combining the mechanics of wireless mesh-networked cars with other technologies.
Rearview cameras and proximity sensors also aid safe and judicious driving and parking. A composite technology comprising crash-avoidance features, like forward-collision warning or lane-departure warning, combined with the necessary backing of communication and damage-control alerts, will be vital in the worst-case scenario of a crash.
2. Crisis identification
How wonderful would it be if we could sense problems before they happen? Though life offers no such provision, today's IoT connectivity gives insights into vehicle health by providing informative inputs before a road journey begins. Be it the exposure of principal parts to wear and tear or the potential breakdown of a crucial component, sensors fitted in the most vital elements will report elementary parameters, like engine temperature and position, well ahead of time to make the owner aware of potential issues. It is advantageous, financially and practically, to have forewarning of events likely to occur so that they can be avoided.
3. Receptiveness and serviceability
In the unfortunate event of an accident, IoT-enabled cars with collision-recognition abilities can send life-saving messages, using cellular networks to alert emergency medical services. Moreover, manufacturers can improve customer service by providing relevant apps that are best operated on smartphones. Drivers can get updates and alerts, such as service due dates, the nearest fuel stations, general vehicle maintenance reminders and so forth.
Apart from end-user convenience, manufacturers can collect consumer data to improve manufacturing processes. This data, blended with dashboard tools, provides results like enhancing lifecycles, lowering the cost of maintenance, minimizing response times in case of an emergency and more.
4. Structuring traffic movement
By implementing IoT to monitor traffic control, movement on the roads can be modulated by getting live updates of traffic and area-wide congestion. Destination and in-transit parking spots can be monitored, thus making the driving experience less cumbersome in major cities.
5. Entertainment with information
Infotainment, the present-day technique of selling information with a hint of entertainment, has taken shape in mid-sector and high-end vehicles by means of a third-party device, like a smartphone connected to a cellular network. Google, Apple and others are providing services by pairing with many known giants of the automotive industry.
Features like maps, music and more are played or operated via integration with an external device, but in the years to come, external connections will become extinct as vehicles offer their own seamless software that will stream in-vehicle packages.
6. Rising need to track
Tracking is a need of every end user, whether commercial or consumer. Knowing where vehicles and their contents are will be critical to track fleets and goods, recover stolen vehicles, monitor parking lots, and even allow parents to track the speeding trends of reckless teenage drivers.
GPS-equipped systems can satisfy vehicle safety needs, as well as the regulations and legal requirements of traffic authorities for state roads, national highways and other major connecting roads. A self-driving car agency based in Delhi can locate its vehicle even in the uneven and rugged terrain of the Leh Ladakh region, and confirm that it abides by speeding regulations. The plugging in and unplugging of such tracking devices can also be reported.
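A minimal sketch of the speed-monitoring idea: compute the average speed between consecutive GPS fixes with the haversine formula and flag segments that exceed a posted limit. The coordinates, timestamps and limit below are made-up illustrative values, not real telemetry.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def speeding_alerts(fixes, limit_kmh):
    """Flag segments where the average speed between consecutive GPS
    fixes (lat, lon, unix_seconds) exceeds the posted limit."""
    alerts = []
    for (la1, lo1, t1), (la2, lo2, t2) in zip(fixes, fixes[1:]):
        hours = (t2 - t1) / 3600.0
        if hours > 0:
            kmh = haversine_km(la1, lo1, la2, lo2) / hours
            if kmh > limit_kmh:
                alerts.append((t2, round(kmh, 1)))
    return alerts
```

A fleet operator would feed each vehicle's fix stream through a check like this and report the alerts, alongside device plug-in and unplug events, to the relevant authority.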
7. Information and technical know-how
“Connected car” is a phrase that is selling like hotcakes nowadays. The concept is not based on a single parameter, but is the outcome of many technological factors intertwined with one another. The basic idea, however, is to connect a vehicle to everything possible. The modus operandi rests on two pillars: device to everything and device to device.
In device to device, vehicles talk to other vehicles, just like two friends chit-chatting in the workplace. In device to everything, vehicles also connect to infrastructure and pedestrians. Synchronizing traffic signals with pedestrian crossing times can minimize the chances of head-on collisions and maximize safety.
Major players in the automotive industry will try to revolutionize the world with these technologies, but it will take at least another two decades to build the necessary infrastructure and to win end-user acceptance of the idea that driverless cars can really roam the roads.
I recently purchased a Bluetooth speaker. All the prior Bluetooth speakers I owned had two connectivity modes: pairing with phones or tablets via Bluetooth, or physically plugging in an Aux cable.
This model of Bluetooth speaker was different — it demanded a Wi-Fi connection. I was curious and allowed a temporary connection to my home Wi-Fi. First, it downloaded and installed updated firmware, and then it suggested I install a mobile app to utilize its features fully.
An increasing number of IoT devices require this sort of connectivity — Bluetooth is too limited in range and thus does not allow truly remote operation, and of course, who doesn’t want the ability to unlock their smart lock from work to let a maintenance guy in?
Unfortunately, this convenience comes at the cost of security. Most of these devices will use RESTful API calls to communicate back to the cloud, and it is often much harder to secure this kind of communication than a simple user portal. The most common attacks against these devices fall into two categories:
- Attacks against the device itself
- Attacks against the API
Attacks against devices
The primary challenge with device attacks remains identity. How do you identify the device? After all, it’s not a person that utilizes a username/password combination and some sort of multifactor authentication to log into a portal.
A properly secured IoT device requires a robust public key infrastructure (PKI) with a private key unique to each device to authenticate to the cloud API. If a device lacks these PKI elements, it may be quite easy for a malicious user to reverse engineer the authentication schema used by the device, either by gaining access to internal storage or by inserting a proxy in the path and examining network flows.
The knowledge obtained in this step enables an attacker to gain access to other devices managed by the same vendor. While the ramifications may be minimal in the case of a single Bluetooth speaker, it’s concerning when it comes to other connected devices such as smart locks or connected baby monitors.
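As a sketch of what robust device authentication looks like, the challenge-response pattern below uses a factory-provisioned per-device secret. This is a simplified, symmetric stand-in: a production deployment would use asymmetric PKI, with a unique private key and X.509 certificate per device, so the cloud never holds the device's secret. All names here are hypothetical.

```python
import hashlib
import hmac
import secrets

# Hypothetical per-device secrets, provisioned at manufacture time.
# A real deployment would store a unique private key per device and have
# the cloud verify signatures against an X.509 certificate chain instead.
DEVICE_KEYS = {"speaker-001": b"factory-provisioned-unique-secret"}

def issue_challenge() -> bytes:
    """Cloud side: generate a fresh nonce so responses cannot be replayed."""
    return secrets.token_bytes(16)

def device_sign(device_id: str, nonce: bytes) -> bytes:
    """Device side: prove possession of the key without transmitting it."""
    return hmac.new(DEVICE_KEYS[device_id], nonce, hashlib.sha256).digest()

def cloud_verify(device_id: str, nonce: bytes, response: bytes) -> bool:
    """Cloud side: recompute the expected response and compare in constant time."""
    expected = hmac.new(DEVICE_KEYS[device_id], nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

nonce = issue_challenge()
assert cloud_verify("speaker-001", nonce, device_sign("speaker-001", nonce))
```

Because the key never crosses the wire and each challenge is single-use, a proxy in the path sees only nonces and digests it cannot reuse — the attack described above depends on the absence of exactly this kind of scheme.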
Attacks against APIs
On the other side of the coin, there are the attacks that target the API itself. To support users spread across multiple geographical regions, APIs must be exposed to the public internet and are thus available to attackers as well. Attackers can throw the same assortment of attacks against an API as they would against an actual web application or portal. SQL injections and stored cross-site scripting attacks work just as well in RESTful API JSON payloads as they do in the HTTP arguments of an old-school web application. Unfortunately, APIs historically do not receive the same level of security testing, and many web application firewalls do not support checking JSON payloads.
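The point about injection in JSON payloads can be demonstrated in a few lines: the attack works because a field parsed out of the request body is spliced into a SQL string, exactly as it would be with an HTTP argument. Here is a minimal sketch against an in-memory SQLite table (the schema and payload are hypothetical):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE devices (id TEXT, owner TEXT)")
conn.execute("INSERT INTO devices VALUES ('lock-1', 'alice'), ('lock-2', 'bob')")

# Attacker-controlled JSON body of a RESTful API call.
payload = json.loads('{"device_id": "x\' OR \'1\'=\'1"}')

# Vulnerable: the JSON field is concatenated straight into the SQL string,
# so the injected OR clause returns every row in the table.
vulnerable = conn.execute(
    "SELECT * FROM devices WHERE id = '" + payload["device_id"] + "'"
).fetchall()

# Safe: a parameterized query treats the field as data, never as SQL.
safe = conn.execute(
    "SELECT * FROM devices WHERE id = ?", (payload["device_id"],)
).fetchall()

print(len(vulnerable), len(safe))  # injection leaks both rows; the safe query returns none
```

A web application firewall that only inspects HTTP arguments never sees the malicious string, which is why JSON-aware inspection and parameterized queries on the API side both matter.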
Most vendors also let convenience benefits obscure security consequences. Looking at the webpages of smart lock vendors, I find statements like, “Our mission is to make our customers’ lives more simple and secure by offering unprecedented visibility and control over their front door.”
What these statements lack are details on how the vendors protect customers from attack — no mention of independent security testing, no SOC 2 reports, nothing. Without those assurances, you may as well be handing your house keys to the internet. You would never leave the keys to your physical front door lying around, and you should demand the same rigor from your connected version.
The internet of things presents enterprises with unprecedented business opportunities, but also major application performance challenges. The amount of data generated by large-scale IoT applications can overwhelm traditional application architectures. As a result, IoT platforms are often based on cloud platforms and are increasingly deployed in hybrid and multi-cloud environments. Still, the need for real-time response means data must be ingested and analyzed at massive scale as it arrives.
Consider a company manufacturing perishable food products that relies on a fleet of thousands of refrigerated delivery trucks for distribution. The company wants to optimize delivery routes, monitor the conditions inside the trucks to ensure product quality, and collect truck performance data to enable predictive maintenance. The IoT platform must continually ingest, process and analyze the data from all the various sensors and other data streams in real time.
For such an application to be successful, the company must build a high-performance data infrastructure. The most practical, cost-effective approach to doing this is with today’s innovative open source technologies for in-memory computing (IMC), stream processing and continuous machine learning.
As they do for other industries, open source offerings provide IoT platform and application developers the following key benefits:
- Innovation — The rapid evolution of IoT plus a highly competitive business environment require enterprise systems that can quickly incorporate the latest innovations. The large and active communities supporting the top open source projects ensure rapid innovation.
- Cost — Creating an IoT platform requires unavoidable expenditures on software development, storage and compute. Open source software systems provide enterprises with at least one development area where they can adopt proven, reliable software and rely on commodity hardware, avoiding the high upfront cost of proprietary technologies.
- Enterprise-grade support — Today, many open source systems are available as fully supported enterprise versions that enable companies to control development costs while still receiving the support they need for deploying in production environments.
Today, the top open source technologies that developers can use to achieve the performance and scalability they need for their IoT initiatives include the following:
To ensure application performance, the Apache Ignite in-memory data grid (IMDG) can be inserted between the application and data layers of new or existing applications without major changes to either. The Ignite IMDG distributes in-memory caching and compute across a cluster of commodity servers deployed on-premises, in private or public clouds, or in a hybrid environment. The recently released GridGain Community Edition is an open source, hardened version of Ignite that is ideal for production environments.
The memory and compute power of the Ignite cluster are available for massively parallel processing and in-memory data storage. The cluster can be scaled out simply by adding nodes, with automatic data rebalancing between nodes. The performance of this architecture can eliminate the need for separate transactional (OLTP) and analytical (OLAP) databases. Separate OLTP and OLAP databases require an extract, transform and load process to copy the data to the analytics database, which introduces unacceptable delays for many use cases.
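The scale-out behavior described above can be illustrated with a toy sketch. This is not Ignite's actual affinity function (Ignite uses a rendezvous-hashing affinity that minimizes the partitions moved on topology change), and all names are hypothetical, but it shows the core idea: keys hash to a fixed set of partitions, and partitions — not individual keys — are reassigned when a node joins the cluster.

```python
import hashlib
from collections import Counter

def partition(key: str, num_partitions: int = 1024) -> int:
    # A key always maps to the same partition, regardless of cluster size.
    return int(hashlib.sha256(key.encode()).hexdigest(), 16) % num_partitions

def assign(num_partitions: int, nodes: list[str]) -> dict[int, str]:
    # Toy assignment: partitions are dealt round-robin across nodes.
    return {p: nodes[p % len(nodes)] for p in range(num_partitions)}

keys = [f"sensor-{i}" for i in range(10_000)]

# Three-node cluster: each node holds roughly a third of the keys.
mapping = assign(1024, ["node-a", "node-b", "node-c"])
load = Counter(mapping[partition(k)] for k in keys)

# Add a fourth node: partitions rebalance, and each node now holds
# roughly a quarter of the data without any key changing its partition.
mapping = assign(1024, ["node-a", "node-b", "node-c", "node-d"])
load_after = Counter(mapping[partition(k)] for k in keys)
```

Because the key-to-partition mapping is stable, clients can always compute where a key lives, which is what makes the massively parallel, data-local processing described above possible.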
For greenfield applications, Ignite can be used as an in-memory database. The Ignite Persistent Store feature provides backup and recovery capabilities and allows companies to trade off infrastructure costs against application performance. With the Persistent Store feature, the active data set can be larger than the available RAM: the entire operational data set is kept on disk, while only a user-defined subset of data is maintained in RAM. The underlying storage layer can use spinning disks, solid-state drives, Flash, 3D XPoint or other storage-class memory technologies.
Many IoT use cases, such as self-driving cars and smart city traffic systems, also benefit from a continuous learning capability that enables a machine learning model to be automatically and continually updated without human intervention. Apache Ignite features integrated, distributed machine learning libraries that have been optimized for massively parallel processing. This enables each machine learning algorithm to run locally against the operational data residing on the nodes of the IMC cluster, which allows for the continuous updating of machine learning models without impacting system performance, even at petabyte scale.
IoT use cases produce a stream of data from multiple sources. Apache Kafka is an open source system for publishing and subscribing to streams of records, storing the streams of records in a durable way, and processing streams of records as they occur. Kafka can be used as a real-time streaming data pipeline that reliably moves data across all the systems and applications comprising the IoT platform. Like Apache Ignite, Kafka runs on a distributed compute cluster that can span multiple data centers. Apache Ignite users can utilize the native Kafka integration between the two products to ingest data streams from Kafka for processing and analysis. Users of GridGain, an enterprise-grade system based on Apache Ignite, can use the Confluent-certified GridGain Kafka connector, which offers more functionality than the native connector found in Ignite. These integrations make it easy to incorporate Kafka stream processing into an in-memory computing architecture, which provides high-performance data processing and analysis.
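Kafka's core abstraction — a durable, ordered log of records that independent consumers read at their own pace — can be sketched in a few lines. This is a conceptual toy, not the Kafka client API; the class and the sample sensor records are illustrative only.

```python
class Log:
    """Toy append-only record log, the abstraction behind a Kafka topic."""

    def __init__(self):
        self.records = []   # durable, ordered record storage
        self.offsets = {}   # independent read position per consumer

    def publish(self, record) -> int:
        """Append a record and return its offset in the log."""
        self.records.append(record)
        return len(self.records) - 1

    def poll(self, consumer: str, max_records: int = 10):
        # Each consumer tracks its own offset, so a slow analytics job
        # never blocks a fast real-time ingestion pipeline reading the
        # same stream.
        start = self.offsets.get(consumer, 0)
        batch = self.records[start:start + max_records]
        self.offsets[consumer] = start + len(batch)
        return batch

topic = Log()
for reading in [{"truck": 7, "temp_c": 3.9}, {"truck": 7, "temp_c": 4.4}]:
    topic.publish(reading)

alerts = topic.poll("alerting")        # both records
alerts_again = topic.poll("alerting")  # empty: this consumer's offset advanced
audit = topic.poll("audit")            # a new consumer re-reads from offset 0
```

This per-consumer offset model is what lets a single truck-telemetry stream feed real-time alerting, Ignite-based analytics and long-term auditing simultaneously, each at its own pace.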
Kubernetes is an open source system for automating the deployment, scaling and management of applications that have been containerized in Docker or other container technologies. Kubernetes ensures consistency across a server cluster that has been deployed in any location — on-premises, public or private cloud, or hybrid environment. Open APIs enable Kubernetes to manage Ignite and Kafka resources and automatically scale an IoT in-memory computing-based cluster. For IoT applications, this increased ease of management can dramatically reduce complexity and errors and save development time, enabling IT resources to keep their focus on more strategic activities.
Enterprises that want to ensure successful IoT initiatives and derive the maximum ROI from them should include open source technologies when designing their application architectures. Proven, reliable open source systems can lower development, deployment and maintenance costs, while also supporting increased innovation. And the open APIs on which these technologies are based make it easier for enterprises to integrate them with their existing systems, another important potential area of cost savings compared to proprietary systems. Specifically, the combination of Apache Ignite, Apache Kafka and Kubernetes enables enterprises to cost-effectively build and deploy a high-performance, massively scalable data platform that can support demanding IoT applications.
It’s abundantly clear that the internet has changed the world, making many aspects of our work and personal lives easier, healthier, more productive and satisfying. In fact, the internet is an essential part of daily life and an engine for economic growth around the world.
But these benefits come with drawbacks. They range from having our preferences tracked and used for profit, to distributing falsehoods that can influence the outcomes of elections, and, in the very worst cases, the exploitation of children and adults.
The question of whether a connected world is really a better world is very timely, especially in light of ongoing security and privacy breaches, and it’s time to give it a closer look.
The unregulated internet
In a very real sense, we live in the Wild West of the internet. From a legal perspective, the internet is almost totally unregulated. Perhaps more importantly, the effects of the internet on society are hardly understood — very few people grasp the big picture. As a society, we are all learning about the impact of the internet as we go along.
Here’s one example of this ongoing learning: I subscribe to a streaming video service that now knows my tastes and recommends movies that fit them. Initially, this seemed like a great feature. But I began to realize that the consequence of following the recommendations was that I was becoming more entrenched in my own bubble. What I thought was enrichment turned into a lack of variety and narrowed my horizons.
A similar example we are learning more about is the fact that social media on the internet creates bubbles, where like-minded people meet, interact and agree emphatically with each other. Our perceptions become our realities. Our critical thinking skills begin to erode, even about our own ideas, because we prefer to hang out in our bubble of “like minds.” What seemed like a good idea at the beginning has flipped in value and become negative.
Social media also sells information it “knows” about us for targeted advertising. This seems like a positive thing, as now we see only ads for products and services we care about. But where is the line between influence and manipulation?
Advertising tries to influence our behavior. We are used to that and have even built up a certain resistance to it.
However, there is a dark side of influence — manipulation — that happens when there is an imbalance in information between two parties, and one party exploits that imbalance to take advantage of the other.
Manipulation is nothing new, but the internet has made it more insidious. If data from social media enables entities to know our weaknesses, uncertainties and vulnerabilities, they can profit by selling this knowledge to advertisers. The imbalance of information is exploited by applying big data and artificial intelligence to the information we reveal about ourselves.
Manipulative advertising can be subtle, like phishing, or less subtle, like data theft. When the information of large groups of people is obtained, and those groups are then bombarded with very specific targeted messages, it can have huge impacts. As we now know, it can tilt the outcomes of elections.
The need for a code of conduct
The internet is a new frontier, with no law and order yet established. Invented by great minds and created by great engineers, certainly. But as a society, we are still in the early stages of learning about the impacts on our daily lives. And we still have a long way to go to deal with the consequences.
I recently wrote an email to a friend about a particular subject — buying a certain brand of car — and assumed that the email was completely private. However, to my surprise, that same day I received an advertisement on social media about that brand, and a deal especially for me. Coincidence?
In the physical world, the privacy of correspondence has been protected for centuries. Tampering with the mail is considered a felony crime in the U.S., for example. Why is this assumption of privacy broken when we move to writing an email and sending it over the internet?
Similarly, we know that in the physical world, our houses cannot be entered without a warrant. But if I bring in IoT devices and sensors, including cameras and microphones, can I still consider my home secure? More specifically, am I waiving all expectations of protected privacy? Can the data generated by these devices be freely used for targeted advertising? Is my voice assistant only listening when I call out the key word? Or can my personal situation now be exploited freely all the time?
Can we wait for legislation? Probably not. That’s why a code of conduct for the internet is so important.
What would an internet code of conduct include?
In the Wild West, the code of conduct was pretty straightforward: Let’s pretend we live in the civilized world and comply with the rules of the physical world as if enforcement was in place.
A code of conduct for the internet, the virtual world, should also mirror the rules of the physical world and adopt those rules without any enforcement being in place, while we wait for legislation to (maybe) catch up. I believe these three elements, each founded on the ideal of respect for one another, are key:
It seems to me that these rules will be implemented on the internet sooner or later, for the simple reason that they make sense. They have made sense in the physical world for hundreds of years; why would they not make sense in the virtual world?
Progress is not free
The internet is an amazing invention and we are dependent on it. But we must ask ourselves hard questions about our understanding of its power and impact — both the light and the dark side.
Should using something that is “free” be the same as knowingly waiving our privacy? Have we been too naïve as individuals and as a society? Are we okay with elections being manipulated? What can we do to protect ourselves? The answers may cost us some money, but maybe it’s worth it.
I was asked recently whether self-regulation or imposed legislation of the internet would stifle innovation. My first thought was that it probably would. That’s fair enough, because we have some cleaning up to do. We have gotten a little ahead of ourselves and assumed too much freedom in the Wild West of the internet without understanding the price to be paid.
But I also realized that making the internet a safe place and a fair environment is an innovation challenge in itself. So, in that sense, self-regulation or legislation does not stifle innovation at all. It steers the internet in the right direction. It is just a very important example of market feedback for engineers and innovators to develop the right products.
In my work with Qorvo, we are committed to the idea that a connected world is a better world. But this better world does not come for free. And it should not be dominated by a few large companies that are able to determine law and order as it suits them. It is something that we all must understand, believe in and fight for, to make it right. Having an internet code of conduct is a good starting point.
Change is inevitable. Heraclitus, a Greek philosopher born in 544 B.C., said, “You never step in the same river twice, for it’s not the same river and you’re not the same person.” Humanity has been striving to reshape the world for the better since the beginning of time. The earliest tools mankind invented were shaped out of stone. The latest ones are shaped from technology. Every innovation we create seems to inspire twin currents of fascination and fear — fascination with how it will advance our society and fear that it will render us obsolete. Consider these examples from a recent World Bank report:
- In the late 1500s, Queen Elizabeth I refused to grant a patent to the inventor of the knitting machine, fearing it would deprive her subjects of work. It was the first major stage in the mechanization of the textile industry which later led to the Industrial Revolution.
- In the 1880s, the Qing dynasty fiercely opposed constructing railways in China, arguing that the loss of luggage-carrying jobs might lead to social turmoil. Today, trains move 54% of China’s domestic trade, more than in any other major country.
- In 2018, robot density per worker was the highest in Germany, Korea and Singapore. Yet the employment rate in all of those countries remains high despite the prevalence of robots.
The only constant is reinvention
In the economic game of survival of the fittest, reinvention is a constant theme. Where would Samsung be today if it still sold dried fish? True reinvention impacts both an industry’s businesses and its individual participants. For example, Uber is often mentioned as an industry disruptor that displaced jobs for incumbent taxi drivers. However, a recent analysis of Uber’s impact on U.S. cities showed that Uber not only increased the number of driver jobs by 50% on average, but that wages for Uber drivers were also about 10% higher.
The same is true for technological progress — while it does disrupt the way things were done in the past, it also opens the door for new opportunities and skills. For example, instead of hiring traditional loan officers, a leading fintech platform in China created more than 3,000 data analysis jobs to sharpen algorithms for digitized lending, according to the World Bank Group report. In the manufacturing sector, GE is advancing its own digital transformation by enabling employees to learn the skills needed for future jobs through its “Brilliant Learning” program. In fact, ManpowerGroup reported that 87% of employers across 44 countries plan to upskill their workforce to fill talent gaps. And, for those willing to take the reinvention plunge, there’s a rich array of free education on AI from universities, corporations and massive open online courses, such as Udemy, Coursera and edX.
Augmenting human ability
While innovation often does require new skills, its primary goal is to augment human capabilities, enabling us to do bigger and better things. AI promises to do the same in this era of data explosion, where Cisco predicted that every person will generate 50 GB of traffic per month by 2022. That’s 10,000 times more data than the first hard disk held in 1956. While that seems like a giant leap, it’s not surprising given the proliferation of devices connecting to IoT, the latest contributor to the daily deluge of data. It’s simply not possible for the human mind to process this amount of data. So tasks that depend heavily on processing large amounts of information to achieve better outcomes will require AI help — for example, calculating insurance or financial risks, diagnosing diseases, detecting and pinpointing danger, identifying ideal job candidates and so on. The best outcomes, however, will come from AI+human collaboration as a recent experiment demonstrates. In the test, a clinical AI was pitted against human doctors on a medical exam, and while the AI outperformed the human doctors, the AI+human group performed best of all.
The power of platforms and crowds
Interconnected digital platforms are the underlying foundation of the information economy and AI+human collaboration. Today, virtually anyone with an internet connection can learn and develop new skills, start a company, trade goods and services, crowdsource app testing or solutions to complex problems and even build their own AI through digital platforms. By connecting customers, producers and providers in a one-to-many fashion, these digital platforms have become essential hubs of collaboration and innovation with their own inherent value. In a recent talk, Andrew McAfee, co-author of “Machine, Platform, Crowd: Harnessing our Digital Future,” shared an illustration of how quickly an ecosystem can develop its own value. He said when Apple first opened its app store to external developers, there were over 10 million downloads of those 552 apps in the first few days.
Since then, businesses have come to recognize the market value of these digital ecosystems as their customers increasingly choose products based on the services, content and intelligence they can provide. The market value of a smart home device, for example, is largely determined by the digital services its IoT ecosystem of connected providers can deliver. In many cases, embedded AI intelligence also enables the device to deliver a hyper-personalized experience of those services by learning and adapting to user preferences, voice, behavior and more. These smart devices, whether in the home or enterprise, are returning a treasure trove of insights on user preferences, needs and consumption behaviors that are creating new markets in areas like IoT data exchanges.
Society evolves and rebalances every time innovation occurs, and, despite our fears, the reality is that today a greater portion of humanity has access to better jobs, education and healthcare than ever before. Interconnected ecosystems are key enablers for these innovations and AI+human collaboration. Just as previous technological progress created new jobs that didn’t exist before, advancements in AI will create new jobs in app development, edge device operations, AI ethics compliance, data science and more. Broadly speaking, new opportunities will arise in these areas — framing the problems for AI to solve by providing contextual awareness, training algorithms, ensuring ethics compliance and responding to AI advice or output. And when the suggestions include new patterns or questions that hadn’t been thought of before, it will take human imagination to explore the uncharted waters and discover blue oceans of new markets.