IoT Agenda


December 16, 2016  1:48 PM

Real-time data management will be the keystone of IoT

David Rolfe Profile: David Rolfe
Data analysis, Data Analytics, Data Management, Internet of Things, iot

So you’ve generated a tsunami of data from your consumer-grade internet of things devices — now what?

One of the odder aspects of IoT is how the “things” get all the attention when we talk about it. The reality is that in order for IoT to deliver real value, the things have to coordinate their activities over distances far greater than Wi-Fi or ZigBee can cover. This implies IoT will involve millions of devices being utterly dependent on some form of controlling intelligence via the internet. So while the cute gadgetry gets the attention, the real value will reside in the back-end systems that conjure value out of bit streams in real time. This, in turn, brings up the question of latency.

Poor latency at mass scale will doom many IoT projects to failure. A consumer who turns on a lightbulb expects it to turn on when the switch is flipped — not a second later. Anyone who has tried to hold a conference call in which one of the participants is 1500 ms behind everyone else will be viscerally aware of how badly humans cope with latency issues, and the gadgetry in IoT will be no different.

To make matters worse, many IoT applications will involve optimizing the behavior of swarms of devices over very short timescales using real-world wide area networks, which are far more idiosyncratic in their behavior than a tame development environment.

Traditionally in IT we can make accurate decisions on small volumes of data quickly, or on very large volumes of data several minutes after the fact. Online transaction processing systems tend to think in terms of thousands of transactions per second. Conventional analytic technologies can deliver much higher volumes, but with much longer latencies. For IoT to actually work, we need to make millions of very detailed and accurate decisions over very short (< 5 ms) timescales — requiring real-time data management.
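
To make the latency point concrete, here is a minimal Python sketch of a per-event decision loop with a 5 ms budget. The rule, field names and synthetic readings are invented for illustration; a real pipeline would add network hops, state lookups and much stricter engineering.

```python
import time
from collections import deque

LATENCY_BUDGET_S = 0.005  # 5 ms per decision

def decide(event):
    # Placeholder rule: act if a reading exceeds its threshold.
    return "act" if event["reading"] > event["threshold"] else "no_action"

def process(stream):
    missed = 0
    latencies = deque(maxlen=100_000)  # keep only the most recent measurements
    for event in stream:
        start = time.perf_counter()
        decide(event)                              # the decision itself must fit in the budget
        elapsed = time.perf_counter() - start
        latencies.append(elapsed)
        if elapsed > LATENCY_BUDGET_S:
            missed += 1                            # arrived too late to be commercially useful
    return missed, max(latencies)

# One million synthetic meter readings
events = ({"reading": i % 100, "threshold": 90} for i in range(1_000_000))
print(process(events))
```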

Companies such as Push Technology are focusing on improving connectivity by providing reliable “reality-aware” network links, but mass-scale, accurate decision-making remains a challenge for the IoT industry. Over the last decade we’ve seen Hadoop and related technologies revolutionize the economics of slower batch systems, but there hasn’t been a similar, widespread change in our ability to make high-volume, accurate, compromise-free decisions, in real time, at the scales at which we expect IoT to perform.

When the first data warehouses were deployed by supermarket chains, there was a period of joy as the grocery industry realized it could measure things like exactly how many cans of baked beans were sold in Scotland the day before, followed by a period of frustration when they realized they didn’t have any complete ideas for turning this information into more money. We are now seeing something similar with IoT, with a new twist being the realization that the commercial usefulness of IoT sensor data arguably has a half-life of about 500 ms. Being able to act in real time — within milliseconds — will be key to making money in many IoT use cases.

A further wrinkle is that the actual logic that will be implemented will be a far cry from the “Dog & Pony Show” demos now being used to promote IoT technology. With a product development lifecycle of 12-18 months and consumer expectations that physical devices like fridges will work for decades, the average IoT play will end up supporting at least 7-10 different generations of the same product — and that’s before first movers acquire their competitors.

So what can we conclude from all this? Much of the noise and hype in IoT focuses on the part of the process the industry is comfortable with — high volumes and high latencies, with any decisions being either obvious (such as switching lights off) or made by humans after the fact. But these systems are generally useful for optimizing existing processes. The problem with such optimizations is that the benefits are small, finite and decrease over time due to competition and technological change.

The more radical kind of change, which involves totally new use cases that involve automated systems making millions of accurate and precise decisions per second, is only becoming possible now. We’re moving away from an era where maybe 50% of all IP addresses were ultimately connected to a human, to one in which people will be outnumbered 20:1 by devices.

The real money is going to be in emergent use cases that combine high volumes, single-millisecond latency and the ability to make millions of commercially useful decisions in the trillions of tiny windows of opportunity each day.

VoltDB is ready for the real-time data management challenges of IoT. Are you?

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

December 16, 2016  11:43 AM

Forget Rudolph, IoT is Santa’s new best friend

Sean Riley Profile: Sean Riley
Digital transformation, Holidays, IIoT, Internet of Things, Manufacturing, Supply chain, Supply Chain Management

In the digital age, even Santa Claus is under pressure to deliver an improved customer experience. So he will be grateful this year for a new tool to help him plan his deliveries across the globe. Santa’s 2016 world tour will be faster, more agile and more productive than ever thanks to the internet of things. IoT is helping supply chain managers across the globe to develop more efficient manufacturing and distribution processes than ever before.

An IoT-enabled supply chain provides critical information such as location, speed, product temperature, vibration and product condition. These data points can be used to calculate ETAs and to take action if products in transit aren’t maintained in the contracted manner. The same information can also be used to synchronize multiple assets, removing wait times and congestion; in short, it eliminates a category of supply chain waste that practitioners spent years working around because the granular visibility needed to eliminate it wasn’t available. An IoT-enabled supply chain has the ability to decrease costs, but it is only one aspect of a resilient supply chain.
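
As a rough sketch of the kind of calculation this enables, the snippet below estimates an ETA from remaining distance and speed and flags in-transit temperature readings that breach a contracted ceiling. The field names and thresholds are invented for illustration, not taken from any particular platform.

```python
from datetime import datetime, timedelta

def estimate_eta(remaining_km, speed_kmh, now=None):
    """Naive ETA from distance and speed; a real system would add traffic, stops and dwell times."""
    now = now or datetime.utcnow()
    if speed_kmh <= 0:
        return None  # asset is stopped, so no meaningful ETA
    return now + timedelta(hours=remaining_km / speed_kmh)

def check_contract(readings, max_temp_c=8.0):
    """Return every in-transit reading that breaches the contracted temperature ceiling."""
    return [r for r in readings if r["temp_c"] > max_temp_c]

telemetry = [
    {"timestamp": "2016-12-16T09:00Z", "temp_c": 6.5},
    {"timestamp": "2016-12-16T10:00Z", "temp_c": 9.2},  # excursion: should trigger an alert
]
print(estimate_eta(remaining_km=320, speed_kmh=80))
print(check_contract(telemetry))
```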

A specter of the Christmas-that-almost-wasn’t arose recently when a large shipping company, Hanjin Shipping Co., suddenly went bankrupt. Around 90 huge container ships were stranded at sea, during the peak pre-Christmas shipping period. These ships were carrying half a million containers of electronics, clothing and furniture worth about $14 billion. Few ports would let them in, creating logistical problems around the world and giving many retailers a good reason to worry in the run-up to the holidays.

IoT supply chain management

While many may think of supply chains as linear processes — where Christmas goods come directly from the supplier (the North Pole) to sit under their tree — these are actually complex systems that create, receive and distribute products to customers with many checkpoints along the way. In the modern supply network, IoT is the link that allows us to intelligently connect all the people, processes, data and things at each point. This collection of technologies — sensors, servers, analytics engines, etc. — is able to make sense of real-time data coming from each step in the supply chain, allowing managers to evaluate and predict events more accurately than ever before.

A disruptive event such as the Hanjin bankruptcy may be manageable after the fact, but the damage is already largely done — late shipments, lost revenue and unhappy customers. This is why it is important to prepare in advance and deploy technology that makes it possible to identify possible disruptive events before they happen.

In the case of Hanjin, there had been rumors about its financial instability prior to the bankruptcy. The issue is that it’s difficult for supply chain managers to act on rumors, and it’s even more difficult to separate truth from speculation in discussions with the vendor. If, instead, a shipper or customer receiving goods via Hanjin had received an actionable alert about Hanjin’s financial problems, it could have taken steps to mitigate the potential future risk, such as finding an alternate shipping company or ensuring that critical or time-sensitive inventory such as Christmas stock always traveled with a transportation company with a low-risk profile.

The holiday season is usually a merry time for manufacturers, retailers and consumers, and with increasing IoT adoption, happiness and peace of mind will abound. By connecting all the elements in their operating infrastructure, supply chain managers will enjoy lower costs and greater insight, with the ability to respond in real time to changing conditions and changing needs. Ensuring your supply chain has visibility into risks, regardless of their origin, and the ability to manage or mitigate those risks will ensure your company has a merry holiday season as well.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


December 15, 2016  1:17 PM

Beyond finance: Blockchain’s impact on the power sector

Isaac Brown Profile: Isaac Brown
Bitcoin, Blockchain, energy, Finance, power, utilities

Venture firms throw billions of dollars at blockchain startups, while both startups and enterprises prepare to capitalize on the potentially massive opportunities for blockchain beyond finance. Developers are applying the approach to solve problems in power, supply chain, health care, media and many other use cases. The technology is evolving rapidly and a rich ecosystem of players is assembling. Based on interviews with industry stakeholders and extensive secondary research, we find that the power sector has emerged as an early leader for blockchain development outside of finance, with the potential to cut costs and streamline processes in several types of power industry transactions.

Blockchain is an innovative database and transaction technology with broad potential

Outlining the basics of blockchain and dissecting its value proposition, we conclude that blockchain can be a great enabler but has its fair share of hurdles and limitations:

  • Blockchain is a distributed database technology that securely maintains a growing ledger of data transactions and other information among participants in the network. Developers are attracted to blockchain for its inherent security, data integrity and decentralized nature. The concept of “smart contracts” is a new innovation that furthers blockchain’s value.
  • Bitcoin was the first implementation of blockchain technology. Two other major blockchain infrastructure platforms have emerged: Ethereum and Hyperledger. A wide range of products and projects are being developed on top of bitcoin, Ethereum and Hyperledger, while some choose to build their own blockchain solutions from scratch.
  • Financial services is the most developed application of blockchain technology. Use cases in power, supply chain, agriculture and health care are beginning to take form, while the applicability in other sectors is less clear.
  • Blockchain still has several obstacles in its path to broad adoption, including a shortage of skilled programmers, the fallibility of smart contracts, aversion to using cryptocurrencies and an increasingly demanding regulatory environment.

The power industry will find blockchain to be a useful tool in the right applications

In the power industry, we examine the potential for blockchain in three key power applications, finding differences in value proposition, challenges to implementation and adoption timeline:

  • Blockchain has a lot of potential to modernize wholesale electricity transaction processes as well as the financial markets that support them. However, it faces an uphill battle against influential, conservative stakeholders, who will take a wait-and-see approach to the technology.
  • Peer-to-peer electricity trading is emerging as a high-profile application of blockchain, allowing neighbors to sell their excess power to each other. Blockchain can provide a high-fidelity, middleman-free settlement system for these trades, but poor economics and regulatory barriers will keep peer-to-peer networks niche in the near term. The application of blockchain to “transactive energy” — in which participants buy and sell power in small and frequent transactions in reaction to fluctuating electricity prices — closely matches peer-to-peer transactions in its value and challenges.
  • Meanwhile, environmental attribute markets, such as those that exchange renewable energy credits (RECs), are already implementing blockchain, but only in the least-regulated applications, where both the barriers and benefits of implementing blockchain are relatively low.

What is blockchain?

Blockchain is a distributed database technology that securely maintains a growing ledger of transactions (defined as transfers of data natively tracked by the system) and other information among participants in the network. Developers are attracted to blockchain for its inherent security, data integrity, decentralized nature, and its ability to simultaneously provide both public openness and effective anonymity. Bitcoin was the first implementation of blockchain, which proved its robustness and overall value.
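
To make the ledger idea concrete, here is a minimal Python sketch of a hash-linked chain of blocks: each block commits to the previous block’s hash, so silently rewriting history breaks every later link. It is illustrative only and omits consensus, signatures and everything else a production blockchain needs.

```python
import hashlib
import json
import time

def make_block(transactions, prev_hash):
    """Bundle transactions with the previous block's hash, then hash the whole block."""
    block = {"timestamp": time.time(), "transactions": transactions, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    """Recompute every hash; tampering with an earlier block invalidates all later links."""
    for prev, curr in zip(chain, chain[1:]):
        body = {k: v for k, v in curr.items() if k != "hash"}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if curr["prev_hash"] != prev["hash"] or curr["hash"] != recomputed:
            return False
    return True

genesis = make_block([], prev_hash="0" * 64)
chain = [genesis, make_block([{"from": "A", "to": "B", "amount": 5}], genesis["hash"])]
print(verify(chain))  # True until any block is altered
```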

Bitcoin, Ethereum and Hyperledger are the three main games in town

Some blockchain developers choose to build custom blockchain solutions with their own toolsets, but there are few true experts capable of building blockchain solutions from scratch. As such, most developers leverage the available tools from bitcoin, Ethereum and Hyperledger. Each of these three platforms has strengths and weaknesses depending on the specific requirements of a given project. Bitcoin does not offer native smart contract functionality, although companies like Counterparty and RSK are trying to extend that functionality into bitcoin; Ethereum and Hyperledger both offer native smart contract utility. Hyperledger is the only option of the three to provide permissioned blockchains, which offer advantages for users who only admit known participants into a blockchain project. Bitcoin is widely regarded as being incredibly secure, although concern is beginning to emerge over the consolidation of mining pools; Ethereum’s reputation is in question after The DAO incident and an ongoing series of denial-of-service attacks; the jury is out on Hyperledger, as there have not been enough at-scale deployments.

Beyond these three dominant core infrastructure providers, the blockchain value chain is rapidly becoming rich with a wide range of entities, many of whom build on the core three (bitcoin, Ethereum and Hyperledger) and many of whom build their own solutions from scratch. This ecosystem includes horizontal platform solution developers, vertical solution providers and enterprises that are developing products or contributing code to other projects.

Listen to a podcast on blockchain from Isaac Brown and Katrina Westerhof.

Figure: Blockchain participants

Developers are targeting multiple enterprise use cases beyond finance

While many startups and major ICT players are developing horizontal blockchain platforms, plenty of startups are developing industry-specific enterprise blockchain solutions; meanwhile major enterprises from specific verticals are exploring the impact blockchain can have on their industries. Blockchain will likely have an impact on every industry, but there are several industries where the specific relevance is beginning to take shape.

Figure: Blockchain in power, supply chain, agriculture and health care

We analyze three potential use cases for blockchain in power transactions

Power transactions come in many different flavors, as electricity changes hands several times between generation and consumption. Here, we use case studies to analyze three types of energy transactions — wholesale, peer-to-peer and environmental attributes — exploring the value of blockchain, the solution architecture, the barriers and problems to be solved, and the outlook. As an example, a blockchain solution for peer-to-peer transactions is outlined below:

Figure: Blockchain streamlines power transactions
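
As a rough illustration of the settlement step such a peer-to-peer design implies, the Python sketch below matches surplus offers to bids for a single metering interval. The participants and quantities are invented, the matching here is first-come and price-agnostic, and a real design would record each settlement on the shared ledger rather than in a local list.

```python
def settle_interval(offers, bids):
    """Match metered surplus to demand for one interval; each match is a candidate ledger entry."""
    settlements = []
    offers = [dict(o) for o in offers]  # remaining kWh available from each prosumer
    for bid in bids:
        needed = bid["kwh"]
        for offer in offers:
            if needed <= 0:
                break
            take = min(needed, offer["kwh"])
            if take > 0:
                settlements.append({"seller": offer["seller"], "buyer": bid["buyer"], "kwh": take})
                offer["kwh"] -= take
                needed -= take
    return settlements

offers = [{"seller": "house_3", "kwh": 2.0}, {"seller": "house_7", "kwh": 1.5}]
bids = [{"buyer": "house_5", "kwh": 2.5}, {"buyer": "house_9", "kwh": 0.5}]
print(settle_interval(offers, bids))
```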

Stakeholders should investigate blockchain, but be realistic about fit and readiness

Blockchain is an excellent tool for the power space, streamlining transactions that have long required manual work and enabling new types of transactions that would be impractical otherwise. Much of the compatibility between blockchain and power transactions is related to the ease of measuring electricity with metering infrastructure, which allows data and markets to coordinate in a robust way. However, power incumbents are generally conservative players that are difficult to woo or uproot.

Overall, peer-to-peer applications provide the greatest value over alternatives, but are held back by regulatory barriers and a limited market size for the time being. Blockchain has a lot of potential to modernize wholesale transaction systems, but it faces an uphill battle from influential, conservative stakeholders. Meanwhile, with low barriers to blockchain adoption but also relatively low value that can be gained, REC markets are already seeing some adoption, but are unlikely to feel much impact from the technology. Other applications like EV charging and retail have little to gain from blockchain. These findings are summarized below:

Figure: Blockchain in power transaction applications

Contact us for more information.

Lux Research analyst Katrina Westerhof contributed to this article.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


December 15, 2016  12:35 PM

Industry 4.0: Breathing life into the cyber-physical factory

Dean Hamilton Profile: Dean Hamilton
IIoT, industry, Industry 4.0, Internet of Things, iot, Manufacturing

Industry 4.0 is the wave of the future for the manufacturing of things.

Originally conceived as a vision of the German government in 2006 as part of its High-Tech Strategy 2020 Action Plan, Industry 4.0 is frequently lauded as the fourth Industrial Revolution. The impact could well surpass that of the first Industrial Revolution that began in Britain in the late 1700s and took us from an agrarian economy on to a path for mass-producing affordable goods using steam power, electricity and, eventually, computers and automation. In other words, we’ve progressed from the horse and buggy to the Model T, and now we’re on to self-driving cars!

What will power the smart factories of the Industry 4.0 era? The internet of things, cloud computing and cyber-physical systems (CPS) technologies. Cyber-physical systems are powered by enabling cloud technologies which allow intelligent objects and cloud-based programmatic modules to communicate and interact with each other. These new cyber-physical manufacturing facilities use robotics, sensors, big data, automation, artificial intelligence, virtual reality, augmented reality, additive manufacturing, cybersecurity systems and other cutting-edge technologies to deliver unprecedented flexibility, precision and efficiency to the manufacturing process.

Yet while the Industry 4.0 revolution is forming, it’s important for companies aiming to be at its forefront to carefully consider which platforms are best positioned to deliver the promise of this exciting future, and what capabilities those platforms should possess.

Developing products, business processes and apps within an Industry 4.0 framework requires thinking beyond what any single product or system can be expected to do. In fact, the most exciting aspect of the Industry 4.0 vision is open and evolving industrial systems that can rapidly take advantage of the latest technological innovations. Imagine what the future could hold for an IoT product that has a complementary ecosystem built around it.

Intelligence sharing for smart factories

Up until now, the manufacturing automation landscape has consisted of technology and data silos organized around hardware vendors. Companies with a global footprint of factories often end up with a heterogeneous and incompatible mix of automation technologies. And while these individual systems may each collect and transmit data, they are not designed to easily make that valuable data available to other manufacturing systems, whether within the same factory or located in another state or country.

IoT cloud platforms provide a powerful solution for harmonizing incompatible connected devices. On the factory floor, IoT-compatible gateways provide a mediation layer between the proprietary protocols used by many vendors’ automation systems and the open internet-based protocols that are the foundation of IoT. Data from disparate manufacturers can be normalized in the gateways before transmission to an ingestion queue in the IoT cloud, while edge logic can be pushed to the gateways for local control of connected devices.
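
As a sketch of that mediation step, the snippet below normalizes two invented vendor payload formats into one common schema before queueing them for ingestion. The vendor names, field names and in-memory queue are placeholders for whatever proprietary protocols and message broker a real gateway would use.

```python
import json
import queue

ingest_queue = queue.Queue()  # stand-in for the cloud ingestion queue

def normalize(vendor, raw):
    """Map each vendor's proprietary payload onto one common schema at the gateway."""
    if vendor == "vendor_a":   # hypothetical: temperature reported in tenths of a degree Celsius
        return {"device": raw["id"], "metric": "temp_c", "value": raw["t"] / 10.0}
    if vendor == "vendor_b":   # hypothetical: Fahrenheit under a different key
        return {"device": raw["serial"], "metric": "temp_c", "value": (raw["temp_f"] - 32) * 5 / 9}
    raise ValueError(f"unknown vendor: {vendor}")

for vendor, payload in [("vendor_a", {"id": "press-01", "t": 412}),
                        ("vendor_b", {"serial": "press-02", "temp_f": 104.0})]:
    ingest_queue.put(json.dumps(normalize(vendor, payload)))

while not ingest_queue.empty():
    print(ingest_queue.get())
```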

Mining the potential of the cloud for healthy complex systems

But the real potential lies in the cloud. When cloud-based cybernetic intelligence is linked to global manufacturing operations, machine learning algorithms can identify patterns and extract insight that can optimize operations. As a factory in one region creates more optimized workflows that improve efficiency, those benefits can be rapidly exposed and propagated throughout the global operation. As predictive algorithms identify signs of potential system or subsystem failure in one factory, other factories can act quickly to avoid catastrophic incidents that can ripple through the entire business.
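
A minimal sketch of the kind of pattern-spotting involved, using a rolling baseline in place of a trained model: a reading that drifts well outside the recent norm is flagged as a candidate failure sign. The vibration values are synthetic; a production platform would learn from fleet-wide history.

```python
import statistics

def flag_anomalies(readings, window=20, sigmas=3.0):
    """Flag readings far outside the rolling baseline built from the previous `window` samples."""
    alerts = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.pstdev(baseline) or 1e-9  # avoid division by zero on flat data
        if abs(readings[i] - mean) > sigmas * stdev:
            alerts.append((i, readings[i]))
    return alerts

vibration = [0.20 + 0.01 * (i % 5) for i in range(60)] + [0.55]  # sudden spike at the end
print(flag_anomalies(vibration))  # flags the final reading
```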

It’s useful to think of modern manufacturing environments as complex and interconnected living organisms, similar to the human body. Ensuring optimal health depends on the ability to:

  1. Rapidly identify exposure to pathogens
  2. Efficiently analyze root causes and potential secondary effects
  3. Develop effective remediation strategies

The first step requires diagnostic tools to visualize data across these interconnected systems; without them, steps two and three are very difficult to accomplish. And, if it is too costly or time-consuming to employ these tools, the patient is likely to get worse instead of better. So, what tools are necessary to realize an Industry 4.0 vision that will take smart factories to the next level?

The missing link: Programmer-less visual design

One of the most significant impediments to realizing the Industry 4.0 vision is implementing the necessary tools and applications to holistically visualize operations, identify opportunities for improvement and implement changes. Even in an IoT environment, applications must be created to take advantage of all of the data available in the cloud.

The traditional approach calls for hiring an army of programmers to build a “solution.” Not only is this approach costly and time-consuming, but it’s highly inflexible. Living organisms are constantly adapting to their environment; businesses and manufacturing operations are no different. As the environment changes, they must react rapidly. Business optimization is not a once-and-done event, but a constant battle to bring all the forces within a business into equilibrium. If every change to a cyber-physical system costs a million dollars and takes a year to implement, the promise of Industry 4.0 will never be fully realized.

But what if programmers can be removed from the equation? Okay, an entirely programmer-free approach to automation is still some distance off. But an approach that minimizes the use of programmers and maximizes the use of business analysts and subject matter experts is starting to emerge. The key is visual modeling, analytics and orchestration.

Visual modeling tools allow the elements of the system (and the data associated with those elements) to be modeled by the people who understand them best. Models can be refined and expanded without programming effort as the complexity of the system grows. The inter-relationships and dependencies of the component parts of any system are as important as modeling the components themselves. Any visual modeling tool should provide the capability to easily define these relationships.

Once a system has been modeled and the data is ingested to the cloud, that data should be exposed for decision making. Depending on roles, different stakeholders will see different slices of the data to inform their understanding. An Industry 4.0 development platform will allow easy extraction and normalization of data from a wide variety of sources (both real-time and non-real-time), with useful dashboard views and alerts using drag-and-drop technology. Platforms will allow data to be fed to machine-learning pipelines and emergent patterns easily viewed.

Next, orchestrations that influence the behavior of the entire system can be created. Visual orchestration tools will allow drag-and-drop workflow design, built from a palette of programmatic components. These workflows can be tested and deployed as microservices using a DevOps methodology.

Finally, with cloud platforms of the future, visual design services will be consumed as a service, and applications can be deployed on the public or private cloud of choice.

With tools such as these, the work of defining and evolving the efficiency of the system will be controlled by the people who know it best.

From IoT clouds to Industry 4.0 clouds

The vision of Industry 4.0 is a long way from the days of producing cloth by laboring over spinning wheels by the hearth. Modern IoT clouds are now beginning to deliver the tools needed to make the Industry 4.0 concept a reality. Platforms running on popular IoT clouds are already delivering on this vision of visually designed factory automation to catalyze the transformation of the manufacturing landscape. These new technologies promise to unleash a virtual tidal wave of change throughout the manufacturing industry. And that rising tide promises to lift all boats, as businesses become increasingly capable of rapidly adapting their manufacturing to meet the needs of an ever-changing competitive landscape.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


December 15, 2016  10:53 AM

Internet of People (IoP): The next frontier for IoT

Derek Peterson Profile: Derek Peterson
Data analysis, Data Analytics, Internet of Things, iot, IoT analytics, smart home

With all the talk of the internet of things, it seems everyone is scrambling to jump on the IoT bandwagon. Companies are connecting products to the internet because they can. The question is: why do that if there is no additional value in being connected?

Taking it one step further, we are seeing a huge play for the “end-all, be-all” IoT cloud platform. Significant money, time and energy are being spent on IoT for machine-to-machine (M2M) communications. While M2M communications are needed, I see the next big frontier of IoT as the Internet of People. Connecting people to the internet is crucial as this amazing technical resource, which has already connected everyone and everyone’s things, continues to evolve.

Let’s take a look at IoT today and how it is being used. Good examples of IoT solutions are smart home appliances such as home alarm systems, cable TV boxes (DVR) and smart home thermostats that can all be controlled by a mobile phone.

As you can see, there is great potential for IoT, and companies are racing to connect devices, gadgets, databases and applications. The internet has made it dramatically easier for “things” to get connected, and those connections are becoming ubiquitous. To enable these connections, there are dozens of companies creating IoT platforms.

What’s even more exciting is that the next phase of IoT is what I’m calling the Internet of People, or IoP. The Internet of People focuses on personal information collection and has a wide range of applications. Through this connectivity of people and things, we see devices getting smarter — or perhaps you can even say people are getting smarter.

As part of the evolution of IoT/IoP, businesses are creating all types of analytical engines to understand what some once called big data. In the hope of inspiring consumers to share more information and eventually make a purchase, IoP solutions will have access to more data and will leverage it at a new level.

However, organizations are missing a very key element in their study of IoT/IoP data, and that is human sentiment. Data is just data, and it does not necessarily report or communicate a feeling. Even the best written email can be misunderstood. So how do you derive a feeling from data? How do you rate the importance of an event triggered by a sensor without human sentiment?

For the first time in history, people are willing to put their lives online and communicate publicly how they are feeling (on Facebook, for example). This data reflecting actual human sentiment is found in people’s social media posts and their daily internet searches. The Holy Grail will belong to the companies that can understand this social media data, marry it with sensor data and produce real-time, actionable results informed by human sentiment.
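
A toy sketch of that marriage of sentiment and sensor data, using a crude word-list scorer in place of a real sentiment model: a sensor alert is escalated when related public posts suggest people are actually feeling the problem. The event, posts and rule are invented purely for illustration.

```python
def sentiment_score(text):
    """Toy lexicon scorer; a production system would use a trained sentiment model."""
    positive = {"great", "love", "happy"}
    negative = {"cold", "broken", "angry", "freezing"}
    words = text.lower().split()
    return sum(w in positive for w in words) - sum(w in negative for w in words)

def prioritize(sensor_event, related_posts):
    """Raise the priority of a sensor alert when public posts express negative sentiment about it."""
    mood = sum(sentiment_score(p) for p in related_posts)
    priority = "high" if sensor_event["severity"] >= 2 and mood < 0 else "normal"
    return {"event": sensor_event["type"], "public_mood": mood, "priority": priority}

event = {"type": "district_heating_pressure_drop", "severity": 2}
posts = ["Why is my apartment freezing again", "Heating broken again, so angry"]
print(prioritize(event, posts))  # escalated to high priority
```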

With the right predictive analytics, companies will be able to gain a deeper understanding of consumer buying patterns and better understand what triggers a purchase. This combined data is not only useful for retailers, but also for healthcare and public safety.

I anticipate we will all be hearing about the Internet of People more in the coming year. The key takeaway here is that IoP is not just more sensor data, but it is a learning process or a study of how people interact with sensor data.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


December 14, 2016  4:32 PM

Amazon Go seeks to break big-box retailers

Mark Bunger Profile: Mark Bunger
ai, Amazon, Internet of Things, iot, Machine learning, retail, Retail IT, retailers, Sensors

Early this week, Amazon announced Amazon Go, aiming to transform the age-old brick-and-mortar retail experience. The official news broke via multiple channels, including a well-produced YouTube video showing shoppers entering a stylish grocery store. Central to the concept is the absence of any physical checkout system; shoppers check in upon arrival and browse as they normally would. Amazon says it uses a combination of computer vision, machine learning and artificial intelligence (AI) to track users and items throughout the store. When a shopper picks a product, this array of in-store sensors and back-end analytics automatically tabulates the final bill (deducting it from their Amazon account, of course), allowing shoppers to be on their way — “Just Walk Out” technology. According to initial reports, Amazon has actually built a 1,800-square-foot test site in one of its Seattle buildings. While today it is only open to Amazon employees, the company says it may allow the public to shop in “early 2017.” To put the store’s scope in perspective, it is the size of a modest home, whereas most of the U.S.’s 38,000 supermarkets are 50,000 square feet or more — more than 25 times the size of Amazon’s store.

Amazon’s project is ambitious, but no one piece is remarkable — it’s the system integration that makes it compelling. The retail segment of the market has been by far the earliest adopter of advanced indoor sensing (AIS) — a toolbox of technologies which allow user-centric interaction with buildings (see the Lux Research report, “Advanced Indoor Sensing: The Next Frontier of the Built Environment“). This can range from people tracking to messaging, indoor location and more. Retail-focused startups like Aisle 411 and Point Inside have helped digitize thousands of retail stores, such as Walgreens and Toys “R” Us. They use a mix of beacons, electronic maps and sensor fusion to help guide shoppers through a store, and ultimately increase “basket size.” The former is even working with Google’s Project Tango to enable augmented reality capabilities in-store to help shoppers navigate to a particular item and display product information, with eerily high precision. Lowe’s is using it too — its “Holoroom” allows customers to visualize furniture in their own homes.

Amazon can use all of these technologies and more. However, there are some key technical details which are hugely relevant:

  • Sensor fusion is possible but largely unused. While the idea of pulling data from shoppers’ smartphones is often discussed, using it is difficult without running into privacy issues. Retailers that have used this capability to date rely on personally identifiable information, requiring shoppers to sign into a frequent shopper account, which allows them to track movement, purchasing history, etc. This “check-in” is core to tracking users as they move through a store. It is highly likely Amazon is using a mix of Bluetooth beacons coupled with visible light communication to enable location tracking throughout the store; both technologies are already available in the market today from lighting companies such as Acuity Brands. There is no mention of RFID, and while it is possible that Amazon uses it, this concept does not depend on it.
  • Computer vision is unproven in this type of application. While several startups such as PointGrab have leveraged low-resolution video for sensing, few have used high-end computer vision outside of the residential segment. Residential security cameras which use computer vision for functions like facial recognition are still in development, or work poorly. Costs are also high for the devices themselves, ranging from $250 to $400 per camera, as they require onboard processing capability. Assuming each camera covers roughly 50 square feet, costs would balloon to $1.5 million per retail site for cameras alone — excluding sensors in the shelves (which would also be required). One industry CEO to whom we spoke told us that there will be edge cases which will be very challenging, and another Silicon Valley veteran proposed fingerprint scanning as a way to deal with this hurdle.
  • AI must be applied to narrow use cases to make sense. And the new Amazon Go retail store embodies how AI can be implemented behind the scenes to solve specific problems and enable amazing innovations in the process. Amazon claims its new store uses “deep learning,” a method which depends on large, clean data sets (see semi-supervised learning). It is unclear what the extent of Amazon’s use of AI technology is at the moment, but the likening of its implementation to that of self-driving cars in its advertisement hints at some of the potential applications of deep learning in the Amazon Go store. For instance, ConvNets are widely applied today to process imagery for tasks such as facial and object recognition, which can ultimately be used to monitor an individual as they navigate within the store and help detect any items that the user picks up (a simple sketch of this attribution step follows this list). Moreover, one can easily see how tracking user purchase history can be used to supplement information obtained from sensors to better inform the system of what might have been picked up by the user. Other possible applications of deep neural networks include using user purchase history as input data for recommendation engines to help suggest items of potential interest to customers, or even for anomaly detection, whereby suspicious user behavior can be detected to prevent theft. Overall, though the use of the terms “artificial intelligence” and “deep learning” may ultimately be marketing ploys, there are some very interesting foreseeable applications of deep learning in the Amazon Go ecosystem. And given that this news comes shortly after the announcement of Amazon AI, it is highly likely that Amazon has the tech to back up its words and successfully apply AI to make the digital grocery store of the future a reality.
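
As a very rough sketch of the attribution step mentioned in the list above, the snippet below assigns a shelf pick to the nearest tracked shopper. The positions would come from whatever mix of vision, beacon and shelf sensing Amazon actually uses; everything here is invented for illustration.

```python
def nearest_shopper(shelf_xy, shoppers):
    """Attribute a shelf event to the tracked shopper closest to that shelf at that moment."""
    return min(shoppers, key=lambda s: (s["x"] - shelf_xy[0]) ** 2 + (s["y"] - shelf_xy[1]) ** 2)

def handle_pick(pick_event, shoppers, carts):
    """Add the picked item to the virtual cart of whichever shopper was closest to the shelf."""
    shopper = nearest_shopper(pick_event["shelf_xy"], shoppers)
    carts.setdefault(shopper["id"], []).append(pick_event["sku"])
    return carts

shoppers = [{"id": "u1", "x": 2.0, "y": 5.0}, {"id": "u2", "x": 9.0, "y": 1.0}]
carts = {}
handle_pick({"sku": "milk-1l", "shelf_xy": (2.5, 4.5)}, shoppers, carts)
print(carts)  # {'u1': ['milk-1l']}
```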

Retailers are inherently secretive about the economics of their operations, but they are experimenting with these technologies because they offer operational improvements coupled with a better user experience. Many AIS technologies do not have a clear ROI — and neither does the Amazon store. The key for Amazon will be to leverage these capabilities for adjacent use cases in the future, such as using robots to help restock shelves, delivering in-store personalized nutrition planning and other future use cases.

Lux Research analyst Alex Herceg contributed to this article.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


December 14, 2016  2:43 PM

The time has come to redefine the internet of things

Emil Berthelsen Profile: Emil Berthelsen
Data analysis, Data Analytics, Internet of Things, iot, IoT analytics, M2M, Machine data

Consider the various definitions of the internet of things, and then compare them to definitions of machine-to-machine solutions. There are substantial similarities between the two, and yet subtle differences have characterized the attributes of each. Figure 1 illustrates some of the differences. In reality the evolution is not quite as well defined as this would suggest; to a great extent terminology is less important than the idea that one environment, which can be defined as M2M, has given way to another which can be termed IoT. But perhaps the time has come to redefine the internet of things.

Adoption of IoT

In a recent survey of U.S. enterprises by Machina Research, over 81% of companies had either implemented IoT technology or planned to deploy it within the next two years.1 This clearly indicates a growing awareness of the benefits and opportunities in IoT, and what is very interesting is that 69% of these enterprises plan to develop the means to capture and use product-generated data, including 80% of manufacturers.


Figure 1: The changes from M2M to IoT. Source: Machina Research, 2014

This focus on data, and ultimately analytics as applied to the data, has become the real story in the internet of things. As enterprises explore new insights, opportunities and value from this emerging technology, data and analytics have become invaluable functions in IoT. Technology vendors and analytics providers are busy developing and enhancing their capabilities in data management and analytics, particularly artificial intelligence and machine learning, to meet these needs. And data and analytics have remained far from static — a key development, for example, has been the combination of analytical tools with domain expertise, pushing analytics to new levels in such diverse areas as predictive maintenance for oil and gas pumps.2 At this point, IoT has become more about data-centric solutions than the more traditional product-centric view of M2M and IoT. But what is a product-centric view?

Product-centric versus data-centric views

In a product-centric view, infrastructures and architectures are defined and described in terms of the connected products or components. The focus in M2M and IoT is the products, which include devices, networks, applications, platforms and solutions. Together they deliver product systems and, to date, enterprises and their IoT initiatives have focused on ensuring efficient and interoperable enablement technology layers to deliver such systems.

In a data-centric view, the focus shifts from the building blocks of infrastructures and architectures to the actual content or data of the system and, particularly, to how value is generated from applications through analytics. A data-centric view begins to look at a totally different set of building blocks: data ingestion, data storage, data processing and aggregation, visualization and data analysis tools. In addition, a data-centric view raises and addresses issues such as data governance, security and ownership significantly earlier in the design process.

There is an argument that the two are closely related, and that the product-centric and data-centric views are two sides of the same coin. This is true, but ultimately the two are different and produce inherently different approaches and priorities. More importantly, it has become clearer what limitations a product-centric view has, and how a data-centric view will help address these shortcomings.

Redefining IoT with a data-centric view

Why has the time come to redefine the internet of things? The product-centric view of M2M and IoT plays an important role in defining and raising the key infrastructure and architecture questions upfront when implementing IoT, but once these solutions are up and running, the next set of important questions emerges, questions that a product-centric view of IoT is ill-prepared and ill-equipped to answer. These relate to data governance, security and ownership. It is this ongoing attribute of IoT that enterprises and technology vendors are attempting to cope with, and one way to begin answering these questions is to start with a proper definition of IoT, namely a focus on data and analytics and, more importantly, on the building blocks that belong to the data-centric view.

The definitions of M2M and IoT are part of a continuum or wider landscape of upfront and ongoing activities, as illustrated in Figure 2. As noted, once the initial M2M implementation, i.e., the infrastructure and architecture, is in place, IoT becomes an ongoing enterprise task of data management and value creation through analytics and data monetization.


Figure 2: Components of M2M and IoT. Source: Machina Research, 2016

A premature farewell to product-centric M2M?

Have we said farewell to M2M too soon? The product-centric view of M2M (and IoT) remains a critically important upfront activity for many enterprises implementing the necessary infrastructures and architectures for M2M and IoT. This needs to continue. What we do need is for the market to stop repeating the product-centric discussion and move on to the next, ongoing stage of IoT, the one built around data and analytics and the crucial building blocks of this approach.3

1 For more information about the survey, read the Machina Research Strategy Report “Enterprise IoT Survey 2016,” December 6, 2016

2 For additional information, read Machina Research Research Note “Ambyint optimizes performance for oil wells with IoT,” November 3, 2016

3 For a longer and more detailed discussion of the new data-centric view of IoT, read Machina Research Research Note “Taking a closer look at the definition of IoT,” December 6, 2016

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


December 14, 2016  1:55 PM

Considering blockchain technology for IoT

Shanmugasundaram M Profile: Shanmugasundaram M
Blockchain, Internet of Things, iot, iot security

As with any new technology hogging the limelight today, blockchain has its own hype around it. Even though it is a revolutionary technology for finance, contracts and the like, many publications are proposing that blockchain be used for IoT. Blockchain technology is a secure distributed ledger for monitoring and storing transactions, and nothing more. If the transactions have to be unique, as most transactions need to be, then blockchain is the answer. This is the reason why blockchain is uniquely suitable for the financial world, where all kinds of transactions occur. It is theoretically breakable, but not practically with present technology. If quantum computers become a reality, the question will not be whether blockchain can be broken, but when. The whole edifice of blockchain rests on the majority of the participating nodes in a blockchain network not being compromised by a malicious attack. Transactions the network finds suspicious are rejected. But is this what we want in the IoT world? We need security against malicious attacks even on a single node; if a single node is compromised, the whole network is at risk. This is in direct contrast to what blockchain has to offer.

Here I will discuss why we need not force-fit blockchain into IoT, which is already complex enough.

What is blockchain?

When a physical money transaction happens, the physical bill is transferred from the payer to the payee, so there is no chance of duplicating the physical currency. But if a currency is in a digital format, it is easy to copy the currency digitally and use the copy for another transaction. To prevent copying of digital currency, a central authority is needed to monitor transactions and to keep track of who owns which currency value. This central authority is the weakest link of a digital currency or any other contract-based transaction. If the central authority is compromised for some reason, the whole currency and its value collapse. Hence, it is very important to protect the central certifying authority from malicious attacks.

If the central authority can be removed, the digital currency loses its weakest link. Blockchain technology is the answer: no central authority is involved in policing and certifying the transactions. Blockchain is a distributed ledger in which all the participating nodes hold the history of all transactions. If copying occurs at one node, all the other nodes will recognize the duplicate transaction and reject it. Blockchain cannot be broken unless an attacker can convince the majority of the nodes in the network that a fraudulent transaction is valid. To do that, one would need computing power greater than that of all the participating nodes combined, which is not practically feasible.
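
A toy sketch of that duplicate-rejection idea: every node votes on whether it has already seen a transaction, and the transaction is accepted only with a majority of votes. This deliberately ignores hashing, proof of work and real consensus protocols; it only illustrates why an attacker needs to subvert a majority of nodes.

```python
from collections import Counter

class Node:
    def __init__(self):
        self.seen = set()  # transaction ids this node has already recorded

    def vote(self, tx_id):
        return tx_id not in self.seen  # reject anything it has seen before (a double spend)

    def commit(self, tx_id):
        self.seen.add(tx_id)

def submit(tx_id, nodes):
    """Accept a transaction only if a majority of nodes consider it new, then commit it everywhere."""
    votes = Counter(node.vote(tx_id) for node in nodes)
    accepted = votes[True] > len(nodes) // 2
    if accepted:
        for node in nodes:
            node.commit(tx_id)
    return accepted

network = [Node() for _ in range(5)]
print(submit("tx-001", network))  # True: first time seen
print(submit("tx-001", network))  # False: rejected as a duplicate by every node
```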

Where blockchain might be considered in IoT

Blockchain can track any transaction in an atomic manner and hence can be deployed where an asset is being tracked. The asset might be transported from point A to point B, passing through innumerable intermediate points along the way. In such a scenario, the asset should not be tampered with, diverted or renamed without prior permission. Blockchain, if implemented in this scenario, can keep track of the movement of the asset, treating each atomic movement or registration as a transaction and preventing any duplicate entries.

In another use case, blockchain might be used to track sensor measurements and prevent a reading from being replaced with a malicious value carrying the same timestamp. But the same can be achieved in a variety of ways without involving blockchain. All we want is to secure sensor data from malicious attacks.

If the data from sensors has any associated monetary value and can be sold, blockchain can be used to prevent duplicate selling. In other words, only the owner of the data can sell it, not the receiver of the data. The receivers of the data can only use the data but cannot sell it. This scenario is remote in the IoT world at present, since sensors are not autonomous and fixing a price for sensor data is tricky.

Where blockchain technology is a misfit in IoT

Going by the first use case above, it looks as though blockchain acts as a keeper of unique records about a particular asset. In other words, blockchain acts as a glorified database, which, in this scenario, it is. To put it very simplistically, blockchain here acts as a kind of primary key in a database.

Blockchain’s main strength is the number of nodes participating in the network, which makes it unlikely that a majority of the nodes are compromised. In an IoT scenario, unless a sufficient number of nodes participate, the blockchain itself will be vulnerable to malicious attacks. Hence, blockchain in a wider network does not make sense unless that many nodes are interested in a single asset being transferred from point A to point B.

Can blockchain be used in a private blockchain network?

A private blockchain is a blockchain implementation in a private network. The private network obviously has a smaller chance of being attacked, and it has fewer nodes as well. However, a private blockchain is not a blockchain in the proper sense of the term, since a proper blockchain has many independent nodes operating to monitor and use transactions. A single blockchain running in a private network is a glorified primary key keeper: it is not of much use in preventing malicious attacks, and it is slower because of blockchain’s expensive computational requirements.

Conclusion

Blockchain technology should not be considered a panacea for the IoT world and should be used judiciously. Blockchain, useful in its own way, has to be weighed against other technologies for securing the IoT network instead of blindly following the crowd. Most of the time, what we need in the IoT world is security against attacks, not against duplicate entries.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


December 13, 2016  11:54 AM

Big data or bust: The geospatial data connection to IoT

Ron Bisio Profile: Ron Bisio
Big Data, Big Data analytics, Data Analytics, GEOSpatial, Geospatial data, Internet of Things, iot

People around the world touch intelligence powered by geospatial data every day. From enterprise-scale operations to consumer applications on a smartphone, we (businesses and consumers) are increasingly reliant on geospatial data — and we want it in real time. Today’s demand for data requires gathering information from every possible endpoint, analyzing the data and turning it into actionable intelligence. Geospatial intelligence is now a requirement for the internet of things because it exposes critical spatiotemporal information that businesses and end users need to make day-to-day decisions.

While there is proven value in geospatial intelligence, identifying the value of big data can be a complex practice. With the emergence of IoT and big data, industries that once operated in silos are now being connected. Big data is so large in volume and complexity that it often requires advanced tools and skills to manage, process and analyze. Additionally, while we often have visibility into geospatial data visualization, we don’t always see the sheer number of touchpoints on that data that were required to make it deployable.

A world of sensors

The number of connected sensors is rapidly increasing. There are billions of connected devices around the world. For example, the utilities market uses smart meters and rail monitoring systems use tilt sensors. On the consumer side, smartphone and wearable technology markets have become major economic drivers. All of these devices, industrial and consumer alike, are IoT devices, and their connected sensors produce data that is leveraged for intelligent decision-making.

In terms of geospatial big data, positioning systems such as GNSS and inertial measurement units will be part of a broader hierarchy network of sensors, control devices and user interaction. Positioning and visualization technologies, already in use for autonomous vehicles, will soon be expanded into other transportation applications. Freight haulers, for example, can use wireless, internet-connected sensors for position, temperature and other data to track the location and status of perishable cargo.

The intersection of social, business, marketing and volunteered data

As IoT adoption grows, so does our use of geospatial data. For example, if you are leaving your house to pick up a friend from the airport and you don’t want to be late, you can look at an app on your phone to determine the route with the fewest traffic delays and get an estimated arrival time down to the minute. As geospatial data becomes more prevalent in our personal lives, we are seeing it become more widespread in commercial and industrial settings as well. An early example of volunteered information, commonly referred to as crowdsourcing, was the use of a mobile app during the 2010 Gulf Coast oil spill. It was used by residents and the public to capture and chronicle what they were seeing happening to the land, sea and wildlife in their areas. Using geospatial technology, citizens took part in a mass data collection process that was used by scientists, government agencies and non-governmental organizations to assist in the clean-up efforts.

Business decision processes based on spatial information often extend beyond traditional geospatial professionals. Disciplines such as operations, finance, asset/facility management and construction use information on the location of assets and materials for day-to-day management and planning. For example, defining a corridor for an electricity transmission line brings together factors in engineering, environment, finance and land use regulations. Working from the same data set, each discipline can extract the information that it needs. Comprehensive geospatial information enables the project team to examine how changes in one factor can affect the others. While they understand the value of the geospatial information they rely on, non-geospatial professionals may not know — or care — how the information comes to their desktops.

How is geospatial big data collected?

The tools for collecting data include satellites; crewed and unmanned aircraft systems; mobile and stationary cameras and scanners; and a broad range of handheld and survey-grade GNSS and optical positioning devices. Decision makers want real-time information that reflects current conditions and they rely on technologies such as wireless communications, sensors and application-specific software to obtain field information that allows them to make more informed decisions quickly. Streamlined processes for data collection and analysis are essential to providing timely, accurate information.

Most real-time data is collected from a source via wireless communications connected to the cloud. Networks of GNSS reference stations stream data to a powerful server, where the information can be merged and analyzed. Customized data streams can then be sent to individual GNSS rovers for use in RTK positioning. Freed from the need to set up their own reference station, surveyors can work quickly and freely over large geographic areas.

The speed, ease and flexibility of this technology helped fuel a dramatic increase in the use of real-time GNSS positioning. Today, cloud-based positioning services support applications in surveying and engineering, construction, agriculture and more. For example, structural or geotechnical monitoring solutions utilize cloud positioning and web interfaces to deliver critical real-time information to stakeholders in remote locations.

Data visualization: Organizing geospatial big data for maximum intelligence

Attempting to utilize the enormous volume and diversity of geospatial big data is like drinking from a fire hose. To handle the flood of data, specialized solutions such as automated 3D modeling and feature recognition software further increase the value of big data by extracting targeted information from large images and point clouds.

Many organizations that can benefit from aerial and satellite data do not have the capabilities to gather it themselves. As a result, they often turn to service providers for airborne photography and image processing. Imagery from satellite systems such as Landsat is available at no cost, but may lack the resolution required for many GIS applications.

Today, through a data marketplace, users can view and select from an assortment of geospatial data available for a given location. The marketplace curates data from a variety of public and private sources, including government charts and terrain models, Landsat imagery, and high-resolution commercial satellite photos. Frequent updates to imagery enable users to conduct time-based analyses on natural or built features.

Geospatial data has moved far beyond the days of two-dimensional drawings and maps. Information can be produced and visualized to facilitate in-depth analysis and evaluation. Three-dimensional positions and attributes can be developed according to requirements for precision and detail. By combining multiple datasets, it's possible to develop 4D models that enable users to view conditions over time. This approach makes it possible to detect and measure changes, providing important benefits to applications such as construction, earthworks, agriculture and land administration.

A fifth dimension, cost, can also be included with spatial information. The resulting model enables users to improve efficiency and cost-effectiveness for asset deployment. A construction manager can use visualization tools to create a virtual site and examine options for moving equipment and materials during a project. Similarly, landfill operators can use 5D techniques to manage daily operations and ensure optimal long-term utilization of permitted airspace volumes.

Geospatial data and IoT

IoT and geospatial intelligence are increasingly connected as we rely more on geospatial data for our consumer and commercial needs. Geospatial big data is massive and complex, but with dedicated analysis it is extremely valuable to its users. As more sensors and connected technologies are deployed in our environment, accompanied by the growth of critical geospatial data collection and the visualization of that data, we will continue to see new IoT innovation and possibilities.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


December 13, 2016  10:27 AM

2016 called, it wants its software spaghetti back

Ruben van der Zwan Profile: Ruben van der Zwan
API, API management, Application Programming Interface, Container, Internet of Things, iot

Mobile! APIs! Tesla! Oops. I just gave away the outline of this entire article. Do not stop reading, though. Many people (like, a lot) ramble on about the importance of software integration, mobile friendliness and application programming interfaces. What they tend to forget, however, is explaining why these concepts are so important. We all seem to know we should do something with data collection and APIs, but only a few of us seem to know why. I have a theory, and it has everything to do with customer trends, IT developments and my favorite car manufacturer in the States. In this article, I gladly share my view on the future of IT, which will be filled with exciting new services for both techies and consumers.

“Big data is everywhere, but we need ways to capture and translate it into useful information.”

Upside-down

APIs have conquered the world and have turned many industries upside down, including those in the IoT space. Thermostats transformed into personal assistants; cars changed into data centers on wheels. The latter was mainly due to the many ideas of Elon Musk and his people. They equipped the new Tesla cars with CPUs, cameras, many sensors and an equal number of APIs. This way, drivers receive real-time information on the weather outside, the driving behavior of others, braking distances based on road conditions and the energy efficiency of their own vehicle. Today, cars do not just bring us from A to B; they are starting to become true life companions that safely lead us through traffic, autumn storms and busy work days. They can do this, but only because of APIs. Big data is everywhere, but we need ways to capture and translate it into useful information that can be properly analyzed. This is the job of the API, which speaks all the device languages in the world and makes sense of the input that sensors and cameras gather. This sense-making is the reason behind APIs' popularity, as they pave the way for better services, smarter devices and real-time decision making.
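
To illustrate that translation role, here is a small, hypothetical sketch of the kind of normalization an API layer performs: two imaginary devices report the same quantity in different formats, and the API maps both into one schema that downstream analytics can consume. The vendor and field names are made up for illustration.

# Hypothetical sketch of an API's "translation" role: normalize payloads from
# two imaginary devices into one common record for downstream analysis.
from datetime import datetime, timezone

def normalize(payload: dict) -> dict:
    """Map vendor-specific sensor payloads to a common temperature record."""
    if payload.get("vendor") == "thermo-x":        # imaginary device A, Celsius
        celsius = payload["temp_c"]
    elif payload.get("vendor") == "acme-sense":    # imaginary device B, Fahrenheit
        celsius = (payload["temp_f"] - 32) * 5.0 / 9.0
    else:
        raise ValueError("unknown device format")
    return {
        "metric": "temperature_c",
        "value": round(celsius, 2),
        "observed_at": datetime.now(timezone.utc).isoformat(),
    }

print(normalize({"vendor": "thermo-x", "temp_c": 21.5}))
print(normalize({"vendor": "acme-sense", "temp_f": 70.7}))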

“Why would we spend our evenings wandering around the city looking for that perfect pair of jeans, when Google already knows what we want and where to find it?”

Go where the people go

APIs have also set in motion another interesting development. Now that industry disrupters (Tesla, Uber, Netflix) use them to improve their services, suddenly everything seems possible. Customers carefully monitor all the innovations in IT land and are impatiently waiting for the next big invention. It has made them more critical, more demanding and more outspoken. Why would we use our precious time to walk to a bank and wait in line, when an online app can do the trick? Why would we spend our evenings wandering around the city looking for that perfect pair of jeans, when Google already knows what we want and where to find it? Today, consumers go their own way and all companies can do is follow them. I do not think this is a bad thing. I am a consumer myself and I believe IT should serve the end-user rather than trick him into buying stuff he does not need. At the same time, in a society where technological developments happen so rapidly, many companies are having a hard time surviving. Luckily, APIs have not only created a highly competitive environment; they are also the perfect weapon to gain a competitive advantage.

Getting APIs to work

APIs are just like revolving doors, meaning their success depends on many other factors. Thanks to recent developments in battery life, memory storage and digital antennas, they can now function to their full potential. This is where it gets very interesting. The internet of things is the perfect example (I recently launched an IoT book co-written by John Mathon, founder of Tibco). Now that IoT sensors (and their APIs) are equipped with powerful batteries, they can be deployed by pretty much everyone, meaning even small companies can join in the IoT and API revolution. This also means that a worldwide Wi-Fi network is within reach, just like faster networks, as devices can now choose among thousands of frequency bands (thank you, digital antennas!). These are all major changes that will most definitely change both the world and your business. Think about it. Now that you have APIs, you can integrate all of your internal processes, your communication with chain partners and even with the government. One of my favorite examples is a Swiss transport company that uses APIs for online goods declarations. Whenever its truck drivers pass a border, information about the company, the driver's personal details and the truck's cargo is sent to customs automatically. This is made possible by APIs and license plate recognition, and it saves truck drivers a lot of time. It is so simple, yet so efficient.
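
A hedged sketch of how such a border-crossing integration might look: a license plate recognized at the gate triggers an automatic goods declaration. The endpoint, payload fields and sample data are illustrative placeholders, not the company's actual system or a real customs API.

# Illustrative sketch only: file a goods declaration when a recognized plate
# crosses the border. The endpoint and fields below are hypothetical.
import requests

CUSTOMS_API = "https://customs.example.org/declarations"  # hypothetical endpoint

def declare_crossing(plate: str, manifest: dict) -> str:
    payload = {
        "plate": plate,
        "carrier": manifest["carrier"],
        "driver": manifest["driver"],
        "cargo": manifest["cargo"],
    }
    resp = requests.post(CUSTOMS_API, json=payload, timeout=10)
    resp.raise_for_status()
    return resp.json()["declaration_id"]

# Example trigger: a camera at the border gate recognizes the truck's plate.
declaration_id = declare_crossing(
    "ZH-123456",
    {"carrier": "Example Logistics AG", "driver": "J. Muster", "cargo": "machine parts"},
)
print("Declaration filed:", declaration_id)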

“If you really want to disrupt your sector, you have to bring it.”

Success formula

Best practices like Tesla, Uber and the Swiss transport company suggest that API usage leads to more efficient production processes, more innovative products and more ROI in the long run. But obviously, you need more than APIs. If you really want to disrupt your sector, you have to bring it. By this I mean that you should find ways to make sense of your software spaghetti and make it flexible, scalable and easy to integrate. To do this, you need three things:

  1. The cloud, so that you can easily store and exchange data
  2. API management, plus identity and access management, to monitor and secure everything you own
  3. Full stack automation, to automate your software development (because why spend time building software by hand when you should put all your time and energy into optimizing every single element?)

These three things are not nice-to-haves; they are essential to your success formula. Another concept I highly recommend is container technology, such as Docker's, which packages everything you have so that you can build and run every application anywhere, at any time. You will need it. Partners, customers and end users do not wait for you to finalize your applications; they want them now and they want them to be perfect. So if you want to shorten your time to market and still deliver perfect quality, you need to automate wherever you can. Yet again, APIs are your only option to do so. Our VP of Strategy John Mathon wrote a nice blog about containers and Docker: Containers are becoming the Lingua Franca of the cloud.
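
As a rough illustration of that "build anywhere, run anywhere" idea, here is a short sketch using the Docker SDK for Python. It assumes a Dockerfile in the current directory and a running local Docker daemon; the image name and port mapping are example values, not a prescribed setup.

# Sketch: build an application image and run it as a container anywhere a
# Docker daemon is available. Image tag and port mapping are example values.
import docker  # pip install docker

client = docker.from_env()

# Build the image from the Dockerfile in the current directory.
image, build_logs = client.images.build(path=".", tag="myapp:latest")

# Run the same image on any host with Docker, exposing the app on port 8080.
container = client.containers.run("myapp:latest", detach=True,
                                   ports={"8080/tcp": 8080})
print("Running container:", container.short_id)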

“When you do not delight your customers with new and better products, you should consider doing something else.”

Big ideas

Quite a challenge, this digital era! I personally do not mind it; I love the current innovation pace and I am impatiently waiting for new exciting things to happen. It inspires me to come up with better products and better services so our own customers can disrupt their sectors (and who knows, the entire world). To be honest, I do not think there is another way of doing business. If you do not innovate, fail to integrate your services and wait for your competitor's next move, you will not survive another five years. What was good in 2016 is outdated by the time you read this article. It is the new way or nothing at all. When you do not delight your customers with new and better products that make their lives easier, you should consider doing something else. If you do have big ideas about your sector, the world and your end-user, on the other hand, you have a chance of success. All that is left for you to do is connect them to an API!

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

