IoT Agenda


February 14, 2018  4:36 PM

Embedded vision in IoT

Sachin Kurlekar
Camera, Internet of Things, iot, IoT analytics, IoT data, Sensor, Sensor data, vision

From computers to robots to artificial intelligence, multiple key technology advances have been inspired by the need to replicate or emulate human intelligence, sensing capabilities and behavior.

Various sensors, such as acoustic, vision and pressure sensors, have been inspired by human hearing, vision and pressure-sensing abilities.

Undoubtedly, one of the most important human sensing abilities is vision. Vision allows humans to see their environment and to interpret, analyze and act on it.

Human vision is a remarkably complex, intelligent “machine” that occupies a significant part of the brain. Neurons dedicated to vision processing take up close to 30% of the cortex area.

Making devices, objects and things “see” the environment visually, as well as analyze and interpret, has been a key research area for a number of years.

Technological complexity, massive computation power requirements and prohibitive costs previously restricted vision sensing and intelligence to security and surveillance applications using surveillance cameras. However, the situation has changed dramatically today; the vision sensor market has exploded. Cameras are getting embedded everywhere and in all sorts of devices, objects and things, both moving and static. In addition, computation power available on the edge and in the cloud has increased dramatically. This has triggered the embedded vision revolution.

The right sensor/camera price points, technological advances in vision sensor resolution and dynamic range, and the computation power available to process video and imaging are leading to mind-boggling growth and diverse applications.

Vision intelligence, enabled through a combination of classical image processing and deep learning, has become possible in today’s world of connected embedded systems, devices and objects, which take advantage of both edge computing power on the device itself and cloud computing power.

This has triggered rapid growth in self-driving vehicles, drones, robots, industrial applications, retail, transportation, security and surveillance, home appliances, medical/healthcare, sports and entertainment, consumer augmented and virtual reality, and, of course, the ubiquitous mobile phone. Vision and vision intelligence have taken the IoT world by storm, and their impact is only going to increase.

In fact, no other sensor has had such a dramatic impact. So prevalent has video become in day-to-day life that most people take it for granted. From live video streaming to video on demand to video calling, it’s easy to forget the dramatic impact vision sensors have had in a world of internet-connected environments and devices; video is truly the unsung hero sensor of IoT. Combine it with vision intelligence and the whole market takes on a new dimension.

The growing prevalence of embedded vision has its roots in the explosive growth of mobile phones with embedded cameras. Prior to the mobile phone revolution, video cameras and intelligence remained associated with security and surveillance. But then mobile phones with embedded cameras arrived, aligning with simultaneous massive growth in computation power for video analytics and intelligence, both on the edge and in the cloud. This explosive combination has led to remarkable growth, and vision sensors are now embedded everywhere, from robots and drones to vehicles, industrial machine vision applications, appliances and more.

There are various types of vision sensors, but complementary metal-oxide semiconductor, or CMOS, sensors by far have had the largest impact and have led this explosive growth of vision sensors in various embedded systems and smartphones.

Sensors are everywhere — and are numerous. Self-driving cars today have more than 10 video cameras, drones have three to four video cameras, security surveillance cameras are everywhere, mobile phones are streaming live video. Video data from these sources is streamed for further intelligence in the cloud, while real-time edge processing is happening on devices and things themselves.

Vision sensor resolution, dynamic range and the number of vision sensors continue to scale up with no end in sight. With the massive amounts of video data these sensors produce, the computation power required is naturally huge, as are the transmission and storage requirements.

Previously, there was a rush to stream video to the cloud for real-time or stored vision analytics. The cloud offered immense computation power, but the bandwidth necessary for transmission, even after compression, was high. The huge amounts of storage, the latencies involved, and security and privacy concerns are making customers rethink the cloud and instead consider vision analytics at the device/object level, with offline video processing done in the cloud.

With the promise of low-latency, high-speed 5G connectivity, there is interest in distributing real-time video processing between the edge and the cloud. However, it remains to be seen how much of this is possible, if it is at all, and whether it makes sense for millions of endpoints to hog transmission bandwidth by streaming real-time compressed video to the cloud.

The importance of edge analytics has enabled a market of systems-on-a-chip (SoCs), graphics processing units (GPUs) and vision accelerators. The cloud, with GPU acceleration, is being used for non-real-time video analytics and for training neural networks on large amounts of data, while real-time inference happens on the edge with these accelerators.

With deep learning and optimized SoCs now available, along with vision accelerators for classic image processing, the edge analytics trend is likely to continue, with additional events, parameters and intelligence pushed to the cloud for further analysis and correlation. The cloud will remain important for offline analysis of stored video, while some systems can still do real-time analysis there.
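
To make the edge-versus-cloud split concrete, here is a minimal sketch, not from the article; the detector, threshold and event schema are all hypothetical stand-ins. The pattern is the one described above: analyze frames locally and forward only compact event records, never raw video.

```python
import json
import time

import numpy as np

MOTION_THRESHOLD = 12.0  # hypothetical tuning value for this toy detector

def motion_score(previous: np.ndarray, current: np.ndarray) -> float:
    """Stand-in for an on-device model: mean absolute frame difference."""
    return float(np.abs(current.astype(np.int16) - previous.astype(np.int16)).mean())

def edge_loop(frames, publish) -> None:
    """Run analytics on the device; publish only small JSON events to the cloud."""
    previous = None
    for frame in frames:
        if previous is not None:
            score = motion_score(previous, frame)
            if score > MOTION_THRESHOLD:  # only interesting moments leave the device
                publish(json.dumps({"event": "motion",
                                    "score": round(score, 2),
                                    "ts": time.time()}))
        previous = frame

# Simulated 8-bit grayscale frames standing in for a camera feed.
rng = np.random.default_rng(seed=0)
frames = [rng.integers(0, 256, (120, 160), dtype=np.uint8) for _ in range(5)]
edge_loop(frames, publish=print)  # in production, publish might wrap an MQTT client
```

The design choice is the point: the device sends a few hundred bytes per event instead of megabits per second of compressed video.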

Vision applications in the real world

Vision and the vision intelligence market continue to evolve rapidly. There are some striking technology trends happening, and they are expected to fuel the next massive growth of vision over the years. Here are a few examples:

3D cameras and 3D sensing. 3D cameras, or more generally 3D sensing technology, allow depth calculation in a scene and the construction of 3D maps of a scene. This technology has been around for a while, popularly used in gaming devices such as Microsoft Kinect, and more recently in the iPhone X’s 3D sensing for biometrics. Here again we see the market on the cusp of taking off, with smartphones providing the needed acceleration for a much wider set of applications. In addition, robots, drones and self-driving cars with 3D cameras can recognize the shape and size of objects for navigation, mapping and obstacle detection. Likewise, 3D cameras and stereoscopic cameras are the backbone of augmented, virtual and mixed reality.
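
For stereoscopic cameras specifically, depth comes from a standard triangulation relation (a textbook formula, not one given in this article): Z = f * B / d, where f is the focal length in pixels, B the baseline between the two cameras and d the disparity of a point between the two views. A minimal sketch with hypothetical rig parameters:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance in meters to a point seen by both cameras of a calibrated stereo rig."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 700 px focal length, 12 cm baseline, 35 px disparity -> 2.4 m
print(depth_from_disparity(700.0, 0.12, 35.0))
```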

Deep learning on the edge and in the cloud. Neural network-based AI has taken the world by storm, and again it’s the computation power available today that is making deep learning networks practical. Other contributing factors include the massive amount of data (videos, photos, text) available for training and cutting-edge R&D by universities and tier 1 companies, along with their contributions to open source. This in turn has triggered many practical applications for neural networks. In fact, for robots, autonomous vehicles and drones, deep learning inference running on GPUs/SoCs at the edge has become the norm. The cloud will continue to be used to train deep learning networks, as well as for video processing of offline stored data. Split-architecture processing between the edge and cloud is also possible, as long as network latencies and video pipeline delays are considered acceptable.

SLAM in automotive, robots, drones. Simultaneous localization and mapping, or SLAM, is a key component of self-driving vehicles, robots and drones fitted with various types of cameras and sensors such as radar, Lidar, ultrasonic and more.

AR/VR and perceptual computing. Consider Microsoft HoloLens; what’s behind it? Six cameras with a combination of depth sensors. Microsoft even announced the opening of a computer vision research center for HoloLens in Cambridge, U.K.

Security/surveillance. This article does not focus on this area, which video and video analytics have traditionally dominated; it is a large market by itself.

Mobile phone- and embedded device-based biometric authentication. Biometric authentication can trigger the next wave of mobile apps, and again it’s the camera sensor, combined with video analytics at the edge and in the cloud, triggering this. As the technology matures, it will spread into various embedded devices.

Retail. The Amazon Go store is an example of using cameras and high-end video analytics. Soon we are going to have robots in aisles assisting humans, all outfitted with multiple cameras and vision intelligence along with other sensors.

Media. Video-based intelligence is already used heavily in the media industry. Video analytics can allow you to search through large video files for a specific topic, scene, object or face.

Sports. Real-time 3D video, video analytics and virtual reality are going to enable the next generation of personalized sports and entertainment systems.

Road ahead, challenges, motivations and concerns

A need for ever-increasing video resolution, wide dynamic range, high frame rate and video intelligence has created an ever-growing appetite for computation power, transmission bandwidth and storage capacity, and it is hard to keep up continuously.

A few companies are taking a different path to solve this problem. In the same way that neural networks are biologically inspired, bio-inspired vision sensors, which respond to changes in a scene and output a small stream of events rather than a sequence of images, are now being researched and commercialized. This can result in a large reduction in both video data acquisition and processing needs.

This approach is promising and can fundamentally change the way we acquire and process video. It has high potential to reduce power consumption as a result of much-reduced processing requirements.
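
As a rough illustration (a toy simulation, not a real sensor interface; the threshold and frame sizes are hypothetical), an event-based sensor can be thought of as reporting only the pixels whose brightness changed, which is why the data volume shrinks so dramatically:

```python
import numpy as np

CHANGE_THRESHOLD = 20  # hypothetical per-pixel brightness delta

def frame_to_events(previous: np.ndarray, current: np.ndarray):
    """Emit (row, col, polarity) only for pixels that changed enough."""
    delta = current.astype(np.int16) - previous.astype(np.int16)
    rows, cols = np.nonzero(np.abs(delta) > CHANGE_THRESHOLD)
    return [(int(r), int(c), 1 if delta[r, c] > 0 else -1) for r, c in zip(rows, cols)]

# Two mostly identical frames: only a small moving patch differs.
prev = np.full((240, 320), 100, dtype=np.uint8)
curr = prev.copy()
curr[100:110, 150:160] += 50  # the "moving object"

events = frame_to_events(prev, curr)
print(f"full frame: {curr.size} pixels; events: {len(events)}")  # 76800 vs 100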

Vision will remain the key sensor fueling the IoT revolution. Likewise, edge video intelligence will continue to drive the SoC/semiconductor industry along its path of video acceleration using GPUs, application-specific integrated circuits (ASICs), programmable SoCs for inference, field-programmable gate arrays (FPGAs) and digital signal processors (DSPs), accelerating classic image processing and deep learning while giving developers room for programmability.

This is a battlefield today, and various large established players and startups are aggressively going after this opportunity.

Low-power embedded vision

With the growth of vision sensors and embedded intelligence in millions of battery-powered objects, low-power embedded vision remains one of the prime factors for the industry’s growth in the next era, yet also one of the key problems to solve. Building products and systems with embedded vision and intelligence is also going to raise privacy and security concerns that need to be handled properly from the design stage.

Despite the challenges, the future of embedded vision in IoT is bright and the market opportunity huge; the companies solving these challenges are going to reap huge rewards.


February 14, 2018  11:15 AM

IoT and IT can be friends

Ofer Amitai
Data Management, information technology, Internet of Things, iot, IoT data, IoT devices, iot security, IT, Machine learning

While the common belief is that the internet of things is inherently insecure, with the right security measures in place, the technology can be used to enhance network security. Indeed, with IoT visibility technology, safe authentication mechanisms and firmware updates in place, IoT is well-suited for a fruitful friendship with IT security professionals.

Let IoT take the reins

IoT devices are expert data collectors. Their capability to mine process data can streamline mechanical and manual workflows in a way that not only automates them, but improves performance over time with machine learning. Indeed, IoT sits at an intersection of the rapidly evolving worlds of automation, artificial intelligence and machine learning, which are key in helping reduce overhead for IT staff. If an issue arises with a mechanical arm on the factory floor, or the office HVAC system is bugging out, IT staff no longer need to address the issue directly; they can use data from the connected device that controls it to better understand the issue and reach an appropriate plan of action. Being able to address these issues from “afar” not only saves time that could be devoted to more productive tasks, it helps solve issues before they affect immediate productivity. In addition, IoT devices can help paint a picture of potentially looming issues and vulnerabilities on the network, which can be factored into larger policy decisions and security strategy. While IoT security concerns should be considered, and steady awareness of potential vulnerabilities maintained, the devices can be used for IT’s benefit by saving precious time and reducing unnecessary effort.

Gain actionable intelligence

As IoT devices contain an array of valuable data, they can be used in an IT context to better understand the network and optimize security strategy. For instance, through an IoT device, it is possible to monitor network activity — which devices or appliances are taking up the most bandwidth, or where there may be a loophole in the network architecture. By analyzing and understanding the data provided by IoT devices, it’s possible to control for vulnerabilities before they are ever exploited, transforming that data into actionable intelligence. IoT can be used to provide intelligence all the way to the periphery of the network, as observed by Jim Gray of Microsoft, who predicted that IoT devices would start working as mini-databases, collecting information on everything from servers to sensors. Though there is the immediate difficulty of processing all of this data, with the right data management tools in place, early adopters of IoT-IT technologies could see drastic improvements in the areas of network and connectivity management.
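
As a toy example of turning raw device data into this kind of intelligence (the traffic records and device names below are invented; a real deployment would pull from flow logs or a monitoring API), ranking devices by bandwidth use takes only a few lines:

```python
from collections import defaultdict

# Hypothetical traffic records: (device, bytes transferred).
records = [
    ("smart-camera-01", 48_000_000),
    ("hvac-controller", 1_200_000),
    ("smart-camera-01", 52_000_000),
    ("coffee-machine", 30_000),
]

usage = defaultdict(int)
for device, nbytes in records:
    usage[device] += nbytes

# Rank devices by total bandwidth to spot outliers worth investigating.
for device, total in sorted(usage.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{device:>16}: {total / 1e6:.1f} MB")
```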

Understand IoT’s potential and limitations

Any good friendship is based on a mutual understanding of value — the value that you bring to the friendship and the value that they bring to the friendship. While it may seem utilitarian to view friendship in this way, there is a subconscious understanding between good friends that their immediate and long-term value is what is hanging in the balance.

The same holds true for IoT devices and IT professionals. IoT needs IT to function correctly and safely, and in some cases to have all the data it collects understood. IT needs IoT to increase efficiency, reduce overhead and provide actionable intelligence. However, an IT professional who blindly embraces IoT without taking its limitations or security risks into account is bound to be let down (as an IoT device is an inanimate object, it doesn’t really work the other way around). Therefore, a clear understanding of IoT security risks, such as low computing power, weak memory, special-purpose operating systems and limited patch/firmware updates, is essential for the friendship to work and bear fruit.

The first step for IT professionals looking to appropriately embrace IoT technology is to create a detailed implementation plan. It may involve a slow “phasing in” of simple IoT devices, like a smart coffee machine or connected security cameras, which can help IT pros better understand what needs to be in place before full-force deployment. In addition, it provides ample opportunity to assess the company’s network security stack, ensuring that the right technologies are in place. Visibility, control and management systems are essential if IoT is to work and live up to its end of the bargain. Following a small-scale deployment, and with the proper device discovery tools in place, IT teams are likely to be ready for a full-fledged friendship with IoT devices.

When it comes down to it, IT professionals don’t have much of a choice when it comes to embracing IoT technology. According to the majority of projections, there will be at least 30 billion IoT devices connected by the year 2020. Any IT professional who cares about their organization’s culture of innovation and security would be wise to befriend IoT now, before they are engulfed by a wave of devices to integrate, see, control and manage. At this stage, IoT is essentially asking IT, “Why can’t we be friends?”



February 13, 2018  2:43 PM

Five key factors for building killer IoT visualizations and reports

Dave McCarthy
Data integration, data visualization, IIoT, Industrial IoT, Internet of Things, iot, IoT analytics, IoT data, Reporting, Reports, Visualization

IoT is one of the most talked-about digital transformation initiatives impacting industrial enterprises today. Its ability to significantly improve decision-making has exceeded expectations. In fact, we recently found (note: registration required) that 95% of respondents believe IoT has a significant or tremendous impact on their industry at a global level, and 73% plan to increase their investments over the next 12 months. But without killer reporting and visualization tools, this transformative experience wouldn’t be possible; teams would drown in their data and miss critical insights. So one caveat I always share with businesses new to IoT is to start planning, early in the process, the reports and visuals they anticipate carrying the most meaning for the business. Waiting until the last minute could result in frustration or missing the business objective altogether.

To avoid unfavorable results, here are five key factors to consider when building reports and visualizations that make it easy to understand what you — and your peers — want to learn from your connected enterprise.

#1. Pinpoint what needs to be communicated. IoT deployments collect more data than can possibly be utilized. You don’t have to look at it all, either; you can prioritize data based on your particular business objective(s). Matching the output report to your initial goal, such as improving asset uptime or reducing costs, is critical in extracting the most value from your data. Anything else can be a distraction. So ask your stakeholders a) what they want to learn from the data, and b) what decisions they want to make from it. Incorporating only those data sets that align with the desired outcome can help you achieve success.

This report on downtime helps an asset manager determine the estimated time to failure as well as identify the affected component.

#2. Choose the right reporting style for your audience (and there will be more than one). There is a plethora of reporting options available natively in most platforms or applications. With some customization, you should be able to get up and running quickly. Reporting style preferences will vary depending on the audience. For example, dashboards and email alerts may be more fitting for the day-to-day needs of asset managers, whereas less frequent, higher-level reporting may be preferred across the C-suite.

But be warned that complex representations of data can easily lead to confusion, which I’ll go into next.

#3. Edit the story. Simplicity is the key to effective data visualization. However, it is very easy to create chaos in reports by underestimating the complexity of your assets and their associated variables.

Let’s consider a simple example from the transportation industry. For a semi-trailer rig, how is fuel consumption (e.g., mileage) related to engine horsepower? Generally, you can picture what the relationship would look like — as engine horsepower goes up, so too would the consumption of fuel (i.e., a positive correlation between a single independent and dependent variable).

Now let’s make this simple example more complex: How would you visualize the relationship between fuel consumption, type of route traveled (in-city, long haul, short haul), engine horsepower, truck configuration, time of day and driver experience level, all in the same visual — that’s not easy. But, each of these variables could be analyzed and characterized separately in a very meaningful way.

So, when the usage scenario is complex, the best approach is to break the problem into multiple layers of analysis, each with a maximum of two or three variables, as in the sketch below.
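
A minimal sketch of that layered approach, using a hypothetical trip-level data set (the columns and values are invented for illustration), examines each relationship on its own rather than in one overloaded visual:

```python
import pandas as pd

# Hypothetical trip records for a semi-trailer fleet.
trips = pd.DataFrame({
    "fuel_l_per_100km": [32.0, 38.5, 35.1, 41.2, 30.8, 39.9],
    "horsepower":       [400, 500, 450, 550, 380, 520],
    "driver_years":     [12, 2, 7, 1, 15, 3],
})

# One simple layer per question: each pairwise correlation is easy to read
# and explain, unlike a single chart with six interacting variables.
for col in ["horsepower", "driver_years"]:
    r = trips["fuel_l_per_100km"].corr(trips[col])
    print(f"fuel vs {col}: r = {r:+.2f}")
```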

#4. Consider enterprise data integrations. Cross-system reports can extend the value teams can get out of IIoT data sets. IoT applications can plug into popular reporting platforms, such as Amazon QuickSight, Power BI from Microsoft and Tableau. Here, asset managers can connect IoT asset information with other enterprise system data to create new data views that can be critical for business success.

For example, a quick-serve restaurant may want to forecast food demand in order to reduce inventory costs and waste. Sensors in entryways and the parking lot provide part of the picture, but it’s also necessary to tie that to historical point-of-sale data, as well as weather and traffic information. This additional context improves the accuracy of the predictions.
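
A minimal sketch of such an integration, assuming hypothetical sensor and point-of-sale tables keyed by date, joins the two sources into a single view that a demand model could consume:

```python
import pandas as pd

# Hypothetical daily foot-traffic counts from entryway sensors.
sensors = pd.DataFrame({
    "date": pd.to_datetime(["2018-02-01", "2018-02-02", "2018-02-03"]),
    "entry_count": [412, 388, 510],
})

# Hypothetical point-of-sale totals from the enterprise system.
pos = pd.DataFrame({
    "date": pd.to_datetime(["2018-02-01", "2018-02-02", "2018-02-03"]),
    "meals_sold": [390, 365, 488],
})

# One merged view: foot traffic alongside sales, ready for a forecast.
combined = sensors.merge(pos, on="date")
combined["conversion"] = combined["meals_sold"] / combined["entry_count"]
print(combined)
```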

#5. Set best practices. Because a variety of teams may need to communicate IoT data to the same audiences, it’s important to set templates where possible so reporting is streamlined and cohesive. Publishing to a shared location with templates will help save teams time and allow them to standardize on what works well.

Regular tweaking of reports will also be required to keep them aligned with your maturing IIoT deployment. Some other tips for creating compelling visuals: orient text horizontally, clearly label data and use some color (but not too much). The key is to let the visual do the talking. If you find that you need to explain every detail, it’s time to fine-tune.

Effective reporting is one of the biggest drivers behind IIoT maturity. It enables industrial organizations to build on success and quickly spot and fix issues. As a result, businesses are able to reach the latter stages of IIoT maturity — analytics, orchestration and true edge computing — faster. It is also essential for getting the broad number of stakeholders involved in an IIoT deployment on the same page to ensure maximum value from your investments.



February 13, 2018  2:33 PM

Four tips to prototype your IoT product on a shoestring budget

Tony Scherba
Enterprise IoT, Internet of Things, iot, IoT applications, IoT devices, prototype, Testing

For every hit IoT product, there are dozens that met an unnecessary demise.

Take Juicero, the creator of an unquestionably cool connected juice machine. Complete with a custom-built scanner, microprocessor and wireless chip, Silicon Valley’s darling juicer crushed sealed bags of fruit and vegetables to produce a drink.

Investors poured millions into the startup only to find that customers could squeeze the bags themselves to accomplish the same result more quickly. In his defense, Juicero founder Doug Evans was quick to point out that he spent years developing a dozen prototypes of the machine.

Some who hear Juicero’s story might conclude that IoT product prototyping just isn’t worth the trouble. True, prototyping an IoT product tends to be more costly and complex than prototyping other tech products. If anything, though, it’s more important for IoT products than for standalone software. Prototyping is creators’ best tool for judging technical feasibility while also validating the product’s market fit — two areas in which IoT products are particularly likely to be tripped up.

Same results, lower costs

To be sure, IoT prototyping is well worth the money. But if you’re concerned about costs, rest assured that there are plenty of ways to do it inexpensively and effectively:

  1. Build a simpler mousetrap. Your prototype does not have to be perfect; in fact, it shouldn’t be perfect. Identify and test only your product’s riskiest assumptions. For Juicero, for example, this should have involved testing whether users even needed the device to squeeze their juice. Think of your prototype as a test to see whether your product warrants further investment, not a production run.
  2. Test with a minimum of users. Don’t skip user testing because you think you need a big research department to do it. Find just five users to put your prototype in front of. Use their insights to cheaply tweak your design. For hardware components, in particular, post- or mid-production tweaks could cost millions. Let your prototype testers save you from your own blind spots.
  3. Focus on the service, not the features. Young IoT companies often try to load as many features as possible into their products. Instead, focus your prototype on the service your product will provide. Ultimately, you’ll wind up with a product that’s cleaner, easier to use, and cheaper to produce. For example, why did Juicero bother to embed a scanner and blocking system into its app? Couldn’t it have just used a QR code scanner to log the packets and check their expiration dates? Save money and reduce the risk of being scooped by a competitor by building a no-nonsense prototype.
  4. Manage the device from an app. Another common mistake greenhorn IoT creators make is trying to do too much on the device’s firmware. By their very nature, IoT devices are connected to a mobile or web app. Why not let that software handle interface control and processing instead? Trust me; it’s much easier to update an app than it is to push a firmware update to a hardware device. Taking this route will drop development costs and cycle times down the road.

Yes, prototyping takes time and money. But the alternative is to let the market test your IoT device’s most daring assumptions. Considering the costs of hardware production — not to mention the value of your company’s reputation — that’s one lemon that you don’t want to wait to squeeze.



February 12, 2018  1:51 PM

Digital twins can be the intelligent edge for IoT

Sukamal Banerjee
CAD, Collaboration, Data scientist, Internet of Things, iot, IoT applications, IoT data, Visualization

If you are involved with IoT, you have witnessed a surge of activity around the idea of the digital twin. The digital twin concept is not a new one — the term has been around since 2003, and you can see an example of its use in NASA’s experiments with pairing technologies for its Apollo missions. However, until recently, technology barriers made it difficult to create a true digital twin. Today, asset-heavy enterprises and others are using breakthroughs in technology to plan or implement digital twin products or manufacturing processes. We can expect this interest and growth to continue: Gartner predicts that by 2021 half of all industrial companies will use digital twins and see an average efficiency gain of 10% from them.

The simplest definition of a digital twin is a digital image connected to a physical object by a steady stream of data flowing from sensors, thereby reflecting the real-time status of the object. The data stream connecting the physical and the digital is called a digital thread. In some implementations, the digital twin not only reflects the current state, but also stores the historical digital profile of the object.
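
In code, that definition reduces to something like the following sketch (a bare-bones illustration; the message format and asset name are hypothetical): a twin object that mirrors live state arriving over the digital thread while retaining the historical profile.

```python
import time
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Mirrors one physical asset: current state plus its historical profile."""
    asset_id: str
    state: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

    def on_reading(self, reading: dict) -> None:
        """Called for every message arriving over the digital thread."""
        self.state.update(reading)
        self.history.append({"ts": time.time(), **reading})

# Hypothetical readings as they might arrive from the asset's sensors.
twin = DigitalTwin("press-line-7")
twin.on_reading({"temperature_c": 71.4, "vibration_mm_s": 2.1})
twin.on_reading({"temperature_c": 73.0})

print(twin.state)         # latest known status of the physical object
print(len(twin.history))  # 2 snapshots retained for later analysis
```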

We cannot overstate the importance digital twins will have for a number of industries, especially for manufacturing installations and processes that need close interaction between machines and humans. There are two key reasons for this: visualization and collaboration.

Visualization

If you were to measure human senses in bandwidth, sight would rank highest. As a result, human decision-making relies on being able to see the situation in full and take necessary action. This is why factory floor managers usually had an office overlooking the factory floor. Today, with manufacturing installations and machines becoming more complex, that advantage of being able to see the processes has largely disappeared. Instead, computerized systems feed data to shop-floor managers to enable decision-making through data sheets or basic charts.

A digital twin can combine the best of both worlds by presenting data to decision-makers in real time within an exact visual replica, including information previously not as easily available, such as temperature or internal wear and tear. In fact, a digital twin greatly enhances the efficiency of the visual bandwidth by removing non-critical information, processing basic information into a format that is much more easily absorbed and providing a more flexible (e.g., 360-degree or micro/macro) view of the system.

Finally, the visual aspect also helps users immediately benchmark and compare against historical or best-in-class data in real time. The potential of this aspect is tremendous, as it identifies areas of improvement, shows areas of immediate concern and enables fast decision-making — all in real time.

Collaboration

The second critical aspect of a digital twin is the ability to share this digital view of machines irrespective of the viewer’s physical location. This allows a large number of individuals to see, track and benchmark manufacturing installations globally. It also removes delays in reporting alerts to management, removes single points of failure due to human error and makes seeking expert help easier.

A digital twin expands the horizon of access to the shop floor for product managers, designers and data scientists. Armed with this new understanding of how processes and machines are working or not working, they can design better products and more efficient processes, as well as foresee problems much earlier than before, saving time and reducing materials wasted on building physical models. They can also see the gaps between desired and actual performance, and perform root cause analysis.

Digital twins differ from traditional 2D or 3D CAD images in scope and use. While CAD images and simulations consist mainly of dimensional data for a single piece of equipment or its subparts, digital twins focus on capturing more holistic data about the equipment, in terms of how it interacts with other equipment and the environment. This entails measuring the data and configuration of the installation (including space and other dimensions between different pieces of equipment) and data about the ambient environment (temperature, pressure, vibration, etc.). This data is fed on a continual basis from the physical object to the digital twin through the digital thread. In terms of use, while CAD drawings are primarily used early in the product lifecycle to influence design decisions, digital twins are used primarily for manufacturing and service operations.

So, what should a business considering digital twins examine? First ask, “What do I need to know about my manufacturing operations that will allow me to drive decisions?” This forms the basis of what kind of data to capture and what kind of visualizations to implement. The follow-up question is, “What are the top three to five roles in my business for which I primarily want the digital twin?” The answer to this question can effectively clarify what views to create from the captured data. Digital twins, by definition, are customized to roles to ensure only relevant data is shown, thereby reducing visual clutter. The final step is to create an incremental roadmap to make the digital twin richer over time. This can be done either by adding more relevant data sets to the existing imagery or by providing access to a wider set of roles within the business. A great example of how to build an incremental digital twin is Google Maps, which today emulates location and traffic data in much more detail and more accurately than it did a decade ago. It has constantly evolved in terms of richness of data, and hence utility.

The benefits will be worth the preplanning that a digital twin requires. Industrial companies that have digital twins will be able to create sustainable competitive advantage through better products, higher efficiency and faster release cycles (from product ideation to market). The key, therefore, is to start, even with smaller projects, and keep reinvesting the benefits/ROI to create better and more complete systems in the near future.



February 12, 2018  12:53 PM

Six challenges facing blockchain and IoT convergence

Jessica Groopman
Bitcoin, Blockchain, Collaboration, DLT, Internet of Things, Interoperability, iot, iot security, regulation, Scalability, Security, trust

Although the technology born of Bitcoin has received tremendous attention, investment and development activity in the last few years, a range of challenges faces broader enterprise adoption. What follows is a very brief summary of those most relevant to adoption in IoT contexts.

Scale

The paramount technical challenge facing distributed ledger technology, or DLT, and IoT convergence is the ability to scale to meet service and security requirements across a dynamic network of devices. These requirements aren’t just precautions; they are foundational to running IoT in mission-critical, high-risk and high [data] volume (sometimes low-bandwidth) environments, such as healthcare, energy, transportation and beyond. This is rapidly pushing IoT data processing, management and analytics to the “edge,” where compute occurs locally, instead of relying on cloud connectivity.

A decentralized consensus mechanism may offer myriad benefits — information neutrality, authenticity, fault tolerance, security, etc. — but today that comes at the price of scalability, especially in an IoT context:

  • It is simply untenable for vast networks of nodes to process every transaction
  • Many, perhaps most, “blockchain as a service” architectures are cloud-based
  • Bandwidth is too limited to support real-time transaction processing
  • Transaction and microtransaction fees can thwart project economics
  • Traditional data storage structures are untenable using DLT
  • Energy waste remains a massive barrier, with environmental costs
  • The costs and risks of potential downtime are too high

The silver lining of these constraints is that they are heavily influencing the architectural development and design of blockchains today: what data live on versus off the chain; what type of consensus defines distributed verification; even a shift away from chains of blocks to other (still decentralized) record-keeping architectures (e.g., IOTA, Zilliqa and the Lightning Network are all novel DLT approaches). DLT is not a solution for IoT’s scalability issues, but scalability will define when, how and in what scenarios IoT and DLT converge.

Security

While DLT architectures offer promising security benefits, security remains a significant challenge in the design and deployment of shared architectures. Businesses must not only protect data, contracts, files, devices and networks, but also maintain privacy, authenticate identity, prevent theft/spoofing, and develop governance for autonomous device coordination and settlement. Adding IoT into the equation merely extends these decisions across the topology of the network, whether a large-scale factory environment, a remote field with low bandwidth for connectivity, or a smart home or retail context. DLT is not a silver-bullet solution for IoT security; it merely introduces new design considerations across the stack.

Interoperability

The ability to securely and reliably interconnect multiple networks isn’t just a challenge in the IoT realm, but in the DLT space as well. Although blockchain is not a data integration tool per se, distributed ledgers are inherently designed to offer shared visibility of data. That said, interoperability takes on new complexities in the blockchain realm:

  • Ability to integrate private and public blockchains
  • Design of permissioning and data access across multiple “chains”
  • Ability to integrate across multiple open source platforms
  • Ensuring common standards for compliance adherence
  • Ability to integrate with devices, existing data sets and incumbent systems

Requirements for customization can further fragment the technology’s ability to function harmoniously with other counterparties and architectures. Although standardization is a central objective of numerous consortia, including the International Organization for Standardization, and of many development efforts, it is lacking at most layers of the stack, not to mention at the process level or across sectors and geographies.

Multiparty collaboration

While interoperability is typically viewed as a technical/standards hurdle — between data sets, devices, networks, etc. — it is also a deeply ingrained cultural hurdle. Traditional business instinct is competitive, proprietary and walled off, not open, interdependent or shared. This friction has challenged the current IoT market as traditional product-based business models are forced toward data-driven, service-based business models, which inherently require an ecosystem to deliver.

The level of collaboration required for the successful and sustainable deployment of DLT is significant and will, in many instances, be entirely unprecedented. The range of intimate interactions between hitherto “strange bedfellows” is vast:

  • Multiparty integration with incumbent systems
  • Security and permissions testing across parties
  • Designing and implementing shared operational and technical frameworks
  • Encoding and implementing shared frameworks that adhere to regulatory compliance

Blockchain will not just require that estranged participants come together; it will also demand multidisciplinary integration to define new laws, rules, liability frameworks, standards, processes, ontologies and definitions. Like IoT, the potential value of any DLT configuration is a function of the “network effect” — needing the network to prove the value of the network. If DLT can bring greater control, granularity and trust to data and asset sharing, interoperability also represents a core catalyst for IoT business models involving broader ecosystems (note: registration required).

Regulation

Designing regulations and compliance into transaction execution is no simple feat. Enterprise-grade blockchain deployments will face numerous policy and legal questions — some such structures have precedents that are hundreds of years old, while others are all but entirely uncharted. Chief among these hurdles is the lack of clear monetary regulation and policy associated with digital currencies and cryptocurrencies. Although certain countries and regulatory regimes are leaning into — or out of — the blockchain market, the IoT space is already foggy with legal uncertainties in data ownership, access, privacy and far beyond. DLT is not a replacement for governance; it merely introduces new ways to encode rules and process consensus.

Reputation and nascence

Both IoT and distributed ledger technologies are emerging. Blockchain is particularly nascent in its implementation — virtually all enterprise activity in the blockchain space is in proof-of-concept or pilot phase, and there are few, if any, at-scale deployments using private or permissioned blockchain configurations. While there are many examples of enterprise (particularly industrial) IoT deployed at scale, the simple fact that enterprise IT environments tend to evolve at a glacial pace means that both categories suffer from market dynamics that pose challenges to those exploring their convergence:

  • Shady reputations (e.g., Bitcoin’s role in the dark web; IoT’s security vulnerabilities)
  • Prone to media hype and hyperbole
  • Fragmented marketplace (enterprise vendors and SIs versus startups; public versus private DLT; growth concentrated in certain industries)

Finally, the nature of a technology in which many distributed parties share financial and operational stake means that a wide range of constituencies influences its development. In addition to innovators small and large, governments and regulators, investors, financial institutions and merchants, consortia, miners, developers and engineers, and even consumers all play an untold role in the development and reputation of blockchain. DLT is not a maturing vector for IoT per se; rather, its own evolution will impact the architectural evolution of IoT.

While the intersection of DLT and IoT is a critical technological convergence to watch, and one that may provide an architecture for trust in autonomous products and services, businesses must take a sober approach. Blockchain is not a panacea or a solution to all IoT (nor IT) inefficiencies. Rather it is a series of modules and sub-technologies that companies must evaluate against current solutions and the needs and risks associated with integration. The challenges facing both IoT and DLT are vast, and in many cases unprecedented, but if past is prologue, such barriers have a way of pushing development forward.



February 9, 2018  1:53 PM

Five IoT trends to remain relevant in 2018

Serge Koba
Consumer IoT, Enterprise IoT, IIoT, Industrial IoT, Internet of Things, Interoperability, iot, IoT applications, IoT data, iot security

If we distance ourselves from all the hype surrounding the internet of things, we are left with a well-established field that successfully works for businesses and consumers around the world.

Over the last couple of years, I have had increasingly active experience developing software for the internet of things — from consumer products for the smart home to integrated enterprise IoT systems. What’s most important for me as a developer is that I love projects related to IoT. I love being a part of it, and so do my teammates.

I believe that the next two years will be defining for the mainstream adoption of the internet of things, and many more products and technologies will be created to facilitate businesses from a variety of spheres.

Trend #1: Growth of industry-driven IoT

The latest tech and business conferences I attended show that IoT is seen as part of the strategies that help businesses — or will help them — face competition effectively. IoT dominated the recent CES 2018, with a focus on the smart home and the industrial internet of things. The expo fostered the idea that innovation that brings changes to industry and manufacturing will help early adopters secure a share of the market. By contrast, companies that overlook these changes might end up far behind the competition. Big players use innovation to find new, drastically more effective ways of running facilities, growing performance and keeping humans safe. What’s more, companies focus on business-driven technologies as opposed to general ideas promoted by the overall hype. They need to manage data in a way that’s less complex operationally.

Let’s take an example: a steelworks company that is a major player in the U.S. market. For a large factory that belongs to this company, one hour of downtime costs about $150 million, and potential equipment issues may pose risks to human health. Any innovation must fully correspond with the existing processes and must be compatible with the overall environment. The business owner looks for ways to improve performance, and here comes the idea: Engineers should use hands-free devices to monitor operations and gather data in real time. But how should it be done? This is where we come into play: consulting, suggesting tech stacks, implementing and delivering the system on an ongoing basis.

Trend #2: Growth of consumer IoT

According to a Gartner prediction, there will be 20.9 billion connected things by 2020. This impressive number owes much to smart home products developed and delivered to consumers. A good example of this is Cujo, a smart and stylish firewall device announced as one of the best IoT products at CES 2018.

Hardware manufacturers and their business partners shape this market. They provide consumers with highly usable, reliable, easily configurable products. Although the current market can be called rather scattered, the third trend marks a step towards unification.

Trend #3: Ever-increasing interoperability

In the near future, interoperability will revolve around key protocols and unifying standards, such as Bluetooth Low Energy, applied in a line of IoT products for the automotive industry. Creators of these standards and owners of product companies have the chance to shape the market in 2018. Meanwhile, software developers are ready to embrace new technologies and make product architecture and code flexible and easily supported. The team must be able to rapidly create or customize APIs to integrate products on the software level.

Trend #4: Evolving technologies to address security issues

Business Insider says that businesses and governments will be the top adopters of IoT over the next five years, and predicts they will spend about $6 trillion on it. All of them will face the same issue: the vulnerabilities of connected devices.

Data security requires an integrated approach from the development company — healthcare is an obvious example of a business domain where privacy is protected legally, namely by HIPAA. Any software developer will say that security does not exist as some vague notion; a block of project documentation must be dedicated to the main identified security threats, as well as the security standards and requirements that will be applied to the product.

Trend #5: Software and data remain the heart of IoT

Data can be seen, and data can be retrieved. This is easy. But it is much harder to apply data for business impact, as well as to preserve its accuracy and integrity, and this is where we approach data science — yet another buzzword in the tech world. Data scientists are essential to software development companies, creating models used for prediction and for facilitating sales. At the same time, businesses receive sources of information that accelerate business decisions and corresponding actions, with all the agility and flexibility required by modern business.



February 8, 2018  3:03 PM

Changing the IoT mindset to take a business-first approach

Jason Kay
Business model, Digital transformation, Internet of Things, iot

The internet of things is taking the world by storm, with analysts predicting huge things for the uptake of the technology, particularly across the enterprise. Persistence Market Research forecasts the worldwide IoT market will be worth more than $260 billion by 2025, highlighting that IoT is showing no signs of slowing. Yet, despite the huge business opportunity that IoT presents, recent research from Cisco illustrates how decision-makers must remain objectives-focused when joining the movement; 60% of IoT initiatives stall at the proof-of-concept stage and only 26% of companies have had an IoT initiative that they considered a complete success.

For some in the industry, this could make worrying reading. IoT technology was conceived with the idea of improving business efficiency and driving value for adopters; however, the research seems to indicate that this intention is not yet being fully realized by all. If IoT deployments are not providing businesses with the value they set out to, the growth rate of IoT is surely at risk, with enterprises becoming increasingly wary of investing in IoT as a result.

The reality is that businesses should not fear IoT investment. All the promised benefits, from increased efficiency to additional insight, are there, and are something that businesses should absolutely take advantage of. However, it is critical that when venturing into the world of IoT, businesses and the technology vendors they partner with take the right approach to ensure the long-term success of their project. This means taking a “business first” rather than “technology first” approach.

There is a requirement for a mindset shift across the entire IoT industry that makes “why?” the first and most important question when defining an IoT deployment. Too often the purpose of an IoT project is defined by the capabilities of the technology; however, just because a business can achieve something with the project doesn’t necessarily mean it should.

Take the IoT systems that involve a rip-and-replace model as an example. For industries such as food retail, there is the opportunity to reap significant benefits and efficiencies from adopting IoT, but a fast-moving environment, low-margin consumer goods and a high cost of infrastructure means a rip-and-replace solution to extracting data is simply not tenable.

Closing stores or suspending operations runs the risk of jeopardizing customer experience, customer loyalty and brand image, undermining the potential business value of the technology. As demonstrated by the Cisco findings, the value of IoT must become apparent quickly, with minimal downtime, or the project will quickly fall by the wayside. This ability to release value at a fast pace is fundamental to taking a business-first approach and accelerating the transition into digitization with low capital investment and high return on investment.

Rather than asking, “This is the technology; what problems can it address?”, stakeholders should consider, “This is the greatest issue; how can technology help?” Businesses continuously face multiple challenges, but it takes real insight into an organization and an industry to understand which technology applied to which area will have the biggest impact on the core purpose and, if addressed, offer the greatest reward.

The IoT market is predicted to continue to explode over the next 10 years, and for the industry to remain on this trajectory, it is essential that businesses and vendors alike consider this business-first approach. Vendors must work with businesses to deliver sustainable, truly valuable IoT deployments that offer scalability, rapid deployment and quick ROI. These proven, repeatable projects that are able to make it beyond pilot stage will allow the internet of things to reach its full potential.



February 8, 2018  1:50 PM

How IoT botnets affect the ‘internet of money,’ cryptocurrency

James Warner
Big Data, Bitcoin, Blockchain, Bot, Botnet, Bots, Cryptocurrency, Internet of Things, iot, iot security

In case you haven’t heard the term “cryptocurrency” yet, I would like to tell you about this “internet of money,” or IoM. Cryptocurrency is an encrypted, decentralized digital currency that is exchanged through a blockchain and created through a process called mining. Bitcoin, for instance, was the first successful cryptocurrency. Digital currency is ideal for traders who don’t want to rely on a central authority.

If someone owns bitcoins, he can make transactions with them by broadcasting a signed record of where the coins are being sent. The transaction is then added to a pool of unverified transactions. Miners validate transactions onto the blockchain through constant hashing, which requires a lot of computational power.

While an individual can mine the currency, it is rarely profitable with less efficient hardware. Cybercriminals circumvent these costs by using hacked devices and computers for mining. The job cannot be completed with a single computer, as it requires a lot of processing power. Therefore, cybercriminals cluster computers together to form a botnet. Botmasters, those who run botnets, can target IoT devices with their malware.

Mining cryptocurrencies

Different cryptocurrencies are obtained in different ways. Take bitcoins, for example. You can obtain them in two ways:

  1. Buying bitcoins online via vendors, or
  2. Getting bitcoins through mining.

Buying from vendors is the simple, straightforward method. Once you pay the seller, the coins will be transferred to a personal wallet.

Mining involves a blockchain, a shared public ledger that includes all the information about past transactions. A computer can be used to locate the next block on the chain; this process requires hashing. Whenever a new block is located, a few bitcoins are generated and added to the finder’s wallet, along with the transaction fees paid by other users.

Components that make up a cryptocurrency

Coins — Different cryptocurrencies use different names for their coins. For example, Bitcoin uses bitcoins, Ethereum uses ether and Ripple uses XRP.

Wallet — Every cryptocurrency uses some type of wallet where coins are stored. These wallets have a public address and a private key. The public address is used to receive coins, while the private key is used to sign transactions made from that address using asymmetric encryption.

Blockchain — Blockchain is the technology or mechanism most cryptocurrencies depend on. It is a public ledger that records all past transaction data. Finding the next block of the blockchain takes a great deal of luck and hashing power.

Complexity — The Litecoin and Bitcoin blockchains adjust the complexity of locating the next block. The complexity is set in such a manner that finding the next block takes a roughly fixed time (about 10 minutes for Bitcoin). A transaction is not counted until several new blocks have been added to the chain, which slows down the trading of bitcoin.

Hashing — Hashing is the method used to find the next block of a blockchain, and it varies per currency; every digital currency applies some sort of hashing algorithm to locate the next block on the blockchain. Bitcoin applies the SHA-256 hashing algorithm, while Litecoin uses scrypt. Mining these cryptocurrencies therefore requires different amounts of processing power, along with different hardware.
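
To make the hashing step concrete, here is a toy proof-of-work sketch. It uses a simplified difficulty rule (a required number of leading zero hex digits), not Bitcoin's actual target encoding or block format:

```python
import hashlib

def mine(block_data: str, difficulty: int = 4):
    """Search for a nonce whose SHA-256 digest starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}|{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

# Each extra zero digit multiplies the expected work by 16; real networks tune
# difficulty so the whole network finds a block on a fixed schedule.
nonce, digest = mine("prev_hash|tx_pool_root", difficulty=4)
print(nonce, digest)
```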

How IoT botnets influence cryptocurrency

Hashing power is needed to mine cryptocurrencies, and a user with more hashing power can mine more. In other words, a stronger computer with good hardware will hash more than a weaker or older one. At the same time, more computers will increase the number of hashes done per unit of time. This is why botmasters want botnets composed of personal computers to contribute their processing power to mining.

The price of cryptocurrencies will be boosted if more coins are being mined. However, older cryptocurrencies, like Bitcoin, are hardly affected by IoT botnets, as they already require a huge amount of processing power. Younger coins, however, have higher chances of being affected by botnets.

Botmasters can program their IoT botnets to perform distributed denial-of-service (DDoS) attacks, send spam email and steal personal data. All of these can create good income for the botmaster; however, not all of them are equally effective methods:

  • DDoS — IoT botnets are well suited to executing DDoS attacks. Since they comprise a large number of different devices and hide among legitimate devices, they are difficult to deal with. These attacks require minimal processing power, as the devices are only used to send data over the internet.
  • Spam — The amount of spam email today is continuously declining due to spam protection. This is why botmasters find spam email less attractive than other attack tools.
  • Stealing personal data — Regular botnets are expert at stealing data and credentials. However, IoT botnets are not, as they are not connected to users’ hard drives.

So what do IoT botnets mean for cryptocurrencies? In the end, there are valid reasons why botmasters will rarely apply their IoT botnets to mining cryptocurrencies.



February 8, 2018  11:17 AM

For IIoT success, manufacturing IT must overcome cloud resistance

Sudhir Arni
Big Data analytics, cloud, Cloud analytics, Cloud Security, Data Analytics, IIoT, Industrial IoT, Internet of Things, iot, IoT analytics, manufacturers, Manufacturing

When manufacturers embark on digital transformation projects to improve factory performance, they often run straight into an IT hurdle: Most big data analytics software is cloud-based, but few factories have begun using the cloud to manage production data.

Most factory IT leaders understand that applications running on a cloud platform align with IT priorities, such as minimizing support requirements, minimizing cost and increasing scalability. However, factories, especially in Asia, have been slow to embrace the cloud. Many factory managers remain wary of the perceived security risks of cloud connectivity. But a close examination reveals that these security fears are not only excessive, but actually crippling: rejecting all uses of the cloud could severely hinder factories’ ability to gain the benefits of digital manufacturing.

Low security risk from cloud-based analytics software

Manufacturers have a well-justified fear of anything that could disrupt production. Especially in a just-in-time environment, the impact of unintended downtime ranges from damaging to catastrophic. If production machinery is connected to a network at all today, it is usually a local network. Stories of hackers turning home DVRs into botnet nodes and home security cameras into spy cams are bad enough; a Stuxnet virus that actually destroys machines is the nightmare scenario.

So, where is the safest place to combine and analyze data from factories distributed around the world? The major cloud platforms are far more secure than even well-managed corporate networks, through the efforts of the world’s best security experts and the most sophisticated technology available anywhere. The technology includes multiple levels of encryption, private global fiber networks and, in some cases, custom hardware built around custom microchips.

The way individual applications handle data also impacts security. A distinction must be made between IIoT applications that can control devices and those that use the data for analytics and visibility. Also, the analytics applications in most cases aren’t pulling data directly from machines, but rather from a local database or historian that is already set up and is probably already connected to the internet.

Finally, consider the data itself. In the unlikely event that hackers penetrated one of the major cloud platforms hosting a manufacturing analytics application, the machine data would be of little value to them. Unlike the data belonging to the giant financial institutions that have widely embraced the cloud, the strings of numbers representing factory sensor readings would likely be incomprehensible to nearly anyone who intercepted them, and of little use even to direct competitors.

Multi-factory optimization needs data access, not data silos

Cloud-based analytics systems mesh far better with digital manufacturing trends than do locally isolated applications. Chief among these trends is a drive to create unified global applications built off a single source of truth of real-time data.


Many early manufacturing IIoT projects have been focused on individual factories, with applications custom-developed for each problem they’re attempting to solve. Once this happens in a few factories, a manufacturing IT or corporate IT department finds itself responsible for managing a series of resource-intensive applications, multiple large sets of siloed data and potentially heavy demands on corporate hardware, networking and personnel resources.

A single-factory approach to manufacturing visibility and analytics also fails to take advantage of the promise of big data analytics. These bespoke applications aren’t scalable and often consider only a limited snapshot of potentially available data.

Large manufacturers typically use the same machines and processes in many globally distributed factories. Efforts to make those machines perform better and identify the cause of defects are far more likely to succeed if companies can compare the performance of many machines in many environments.

Walling off the cloud will wall off innovation

Manufacturers should harness the momentum of the cloud. The economic and technical advantages of the cloud versus on-premises networks mean that the vast majority of software innovation is happening in cloud-based software. This is true not just for new technologies, but also for stalwart data center applications like office productivity and ERP.

Walling off a factory from the outside world will stunt innovation, cutting it off from the rapid development in data-driven technologies that live almost exclusively in the cloud. Before long, refusing to use cloud technologies will feel like swimming upstream in a fast-flowing river.


