IoT Agenda


August 4, 2017  12:54 PM

Deep learning technologies evolving beyond human capacities

Scott Amyx Profile: Scott Amyx
ai, Artificial intelligence, Deep learning, IIoT, Industrial IoT, Internet of Things, iot, Neural network

Adoption of artificial intelligence in different fields is growing at a rapid pace. AI-based systems are going well beyond the usual expectations of machines, as they can rival, and even surpass, human capabilities in certain areas. AI can now outwit and outperform humans in various comprehension and image-recognition tasks. Beyond a robot’s ability to survive deadly environments like deep space, deep learning has been widely used to teach AI-based systems fine motor skills for tasks such as removing a nail or placing caps on bottles.

AI is also helping machines develop their reasoning skills, with the potential to match those of a PhD scholar. Biologists at Tufts University built a system that combined genetic algorithms with simulations of genetic pathways. The system enabled an AI to devise a scientific theory of how planaria (flatworms) regenerate body parts.

Transforming images into art

The Google Brain team has also advanced AI’s capabilities in art. Google’s DeepDream program uses a machine learning algorithm to produce its own artwork. The images resemble paintings from the surrealism movement, mixed-media works or colorful renditions of abstract art.

But how was the program able to render such artistic impressions? It began by scanning millions of photos to learn to distinguish among various shades and colors. It then proceeded to differentiate the objects from one another. Eventually the program built itself a catalog of objects from the scanned images and recreated various combinations of these items. A prompt lets the AI place these object composites onto a landscape, producing a work of art that appears to have been made by a human being.

Deep learning technologies: Getting better than humans

Deep learning is the AI field responsible for these progressive leaps in image interpretation. The technologies employ a convolutional neural network (CNN) to instantly recognize specific image features. This capability has led to CNNs finding applications in facial identification programs, self-driving cars, measurable predictions in agriculture, such as crop yield, and machines diagnosing diseases. CNNs aren’t your typical AI programs. The deep learning approach combines improved algorithms, greater processing power and increased data availability. The internet feeds in the necessary high volume of data, particularly through the tagging and labeling functions of Facebook and Google. These companies use the massive collective uploads of users all over the world to provide the data needed to improve their deep learning networks.

CNNs don’t rely on explicit programming; instead they are trained to recognize the distinctions and nuances among images. Let’s say you want a CNN to spot dog breeds. Training would begin with providing the system thousands of animal images along with specific examples of their breeds. The CNN learns to decipher the breeds through its layer-based organization. When training itself to recognize dog breeds, the CNN begins by learning the distinctions among basic shapes. It then gradually moves on to features particular to individual breeds, such as fur textures, tails and ears. The network gradually aggregates these recognized characteristics to conclude which breed it is looking at.
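To make the layer-based idea concrete, here is a minimal sketch of such a network in PyTorch. It is illustrative only: the layer sizes, the 120-breed output and the comments mapping layers to shapes, textures and body parts are assumptions for the example, not the architecture of any production system described here.

    import torch
    import torch.nn as nn

    class BreedClassifier(nn.Module):
        def __init__(self, num_breeds=120):  # 120 breeds is an arbitrary example
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1),   # early layers: edges, basic shapes
                nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1),  # mid layers: textures such as fur
                nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(32, 64, kernel_size=3, padding=1),  # deeper layers: parts such as ears, tails
                nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(64, num_breeds)       # final breed decision

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    model = BreedClassifier()
    logits = model(torch.randn(1, 3, 224, 224))  # one 224x224 RGB image
    print(logits.shape)                          # torch.Size([1, 120])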

CNNs’ complex processing capabilities enable deep learning algorithms employed in IoT technologies to identify not just images, but also speech, behaviors and patterns. Better recognition of pedestrians through deep learning is improving self-driving cars. The insurance industry uses deep learning for better assessment of car damage. Crowd control can improve through behavioral recognition in security cameras.

Bringing deep learning to everyday living

The industrial internet of things is witnessing a myriad of deep learning applications. Companies such as Facebook even plan to build systems “better than people in perception,” showcasing an image-recognition technology that can describe a photo for the blind. Other IIoT applications are also enriching gaming, bioinformatics and natural language processing. The computer vision field is improving vastly through deep learning technologies that also offer user-friendly programming tools and reasonably priced computing.

One of the most exciting areas seeing a lot of action is medicine. AI-based vision systems can rival doctors in reading scans faster or taking a more detailed look at pathology slides, enabling better diagnosis and screening. The U.S. Food and Drug Administration is already working on a deep learning approach to help diagnose heart disease. At Stanford University, researchers are working on an AI system that can recognize skin cancer as accurately as dermatologists. Such a program installed on a smartphone could provide universal, low-cost diagnostic care to individuals anywhere in the world. Other systems are addressing the assessment of problematic conditions such as bone fractures, strokes and even Alzheimer’s disease.

A progressive partner for humanity’s future

All these deep learning technologies hinge their value on purposeful applications. Today’s vision technologies are performing better than human beings in some aspects, but general reasoning remains a human function. These developing IIoT applications are meant to do separate tasks — in this case, visual recognition and categorization — better than a person, but no AI has been able to do multiple functions at the same time. A deep learning system might identify individuals in photos, but it has yet to recognize emotions such as sadness.

With time, AI systems will develop such capabilities, but for now we should appreciate the numerous advantages they already provide. They’re not meant to replace human skills but to remove the burden of low-level tasks, freeing us to focus on the more important, reasoning-based work that requires human attention. Martin Smith, a professor of robotics at Middlesex University, uses spreadsheets as an example: the software has hastened computations, but the analysis still comes from human experts.

The possibilities are just beginning to emerge with AI and deep learning. It is ultimately up to researchers, innovators and practitioners to transform these technological advances into something that contributes to humanity’s progressive goals.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

August 3, 2017  2:19 PM

Can blockchain create secure worldwide digital IDs?

Tiana Laurence Profile: Tiana Laurence
Blockchain, Data Management, Data privacy, Data-security, Digital IDs, IoT data, personal information

The current state of identity verification produces two types of problems. On the one hand, people need accurate and accessible records so that the old-school problem of misplacing a simple card never creates an issue. At the same time, a giant challenge lies in crafting a system that can do this while minimizing fraud, theft and other abuse. Bridging this gap is where many security technologies are currently focused, with a digital future seen for records such as government IDs, medical records and other critical documents. This would expand the ways personal information can be protected and transmitted, creating a more efficient system.

Without proper security measures, though, data is left vulnerable to all types of nefarious exploitation. For example, minorities can benefit from proper representation if this type of data becomes secure and accurate; exposed to the wrong hands, however, it can also lead to targeted threats and discrimination. In developing countries, identification records exposed to security risks can be falsified, enabling crimes such as human trafficking.

Clearly the need is there to find a digital solution, one that is permanent, accessible and accurate — all while creating the necessary security. Such a system needs to create a safe space for accessing basic services, such as health care and education, while protecting identity and defending against discrimination. Blockchain technology has been identified as a possible way to meet these needs, and while there’s still work to be done on making the technology functional and scalable, it provides the traits necessary to be a foundation for such a revolution in identification technology.
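The core trait blockchain contributes here is tamper evidence. The toy sketch below shows the mechanism: each record commits to the hash of its predecessor, so altering any past entry invalidates every later link. The field names are illustrative, and a real identity system would add consensus, digital signatures and access control on top.

    import hashlib
    import json
    import time

    def block_hash(body):
        # Deterministic SHA-256 over the block body
        return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

    def append_record(chain, record):
        prev = chain[-1]["hash"] if chain else "0" * 64
        body = {"record": record, "prev_hash": prev, "timestamp": time.time()}
        chain.append({**body, "hash": block_hash(body)})

    def verify(chain):
        for i, block in enumerate(chain):
            body = {k: v for k, v in block.items() if k != "hash"}
            if block["hash"] != block_hash(body):
                return False  # this record was altered after the fact
            if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
                return False  # link to the previous record is broken
        return True

    chain = []
    append_record(chain, {"id": "EE-12345", "event": "ID card issued"})
    append_record(chain, {"id": "EE-12345", "event": "vote authenticated"})
    print(verify(chain))                    # True
    chain[0]["record"]["event"] = "forged"  # tampering attempt
    print(verify(chain))                    # False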

In emerging countries, governments are looking at new technology as a means to accomplish more with fewer resources. Estonia recently launched an initiative to use blockchain technology to authenticate e-voting in conjunction with its advanced electronic ID program. Estonia’s ID cards include electronic tokens that enable two-factor authentication in conjunction with a PIN. For a recent corporate shareholder election, blockchain tech was used to authenticate and record data as part of a pilot e-voting program. With the pilot successfully completed, the group behind the initiative (NASDAQ) is looking to push the boundaries of digital ID capabilities. Everything from voting to smart contracts could be accomplished faster, easier and, most importantly, more accurately with a permanent and transparent solution acting as the backbone.

Remaining issues

Using blockchain as the foundation for legal data transactions in the internet of things age makes sense in many ways. However, there are still many steps to go, the biggest being the relative immaturity of the technology. Blockchain has existed for less than a decade, and while pundits across industries hailed 2017 as the year of the blockchain, that doesn’t fix all of its issues overnight. Organizations have already been built around the idea of making the blockchain accessible to all types of industries — not just digital currency, but any industry requiring permanent and secure records.

In the grand scheme of things, these initiatives are still in relatively early stages. Security flaws are still being tracked and no unified standards exist, which limits the ability to integrate universally. Along the same lines, scalability is a concern. Blockchain developers are examining this issue and recognize that it is the key to widespread acceptance around the globe.

As with any emerging tech, questions about scalability and security provide hurdles to mass adoption and implementation. The good news, though, is that the core traits of the blockchain fulfill the requirements for any type of secure transaction, be it currency, identification or medical records. With that foundation, proponents know where the path is taking them.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


August 3, 2017  11:31 AM

Securing industrial IoT: A trillion-dollar dilemma

Neil Hamilton Profile: Neil Hamilton
IIoT, Industrial IoT, Internet of Things, iot, iot security, Security Framework, security in IOT

Businesses should be very concerned with industrial IoT security. Cybercrime is on the rise and could cost businesses upwards of $6 trillion annually by 2021, according to research firm Cybersecurity Ventures. This threat to IIoT is sizable, but it doesn’t have to be.

IIoT presents huge opportunities for makers and providers of industrial equipment and related systems. By connecting machines to the cloud, revolutionary new approaches to customer service and process automation can begin to thrive, predictive maintenance being one of the fastest-growing business lines.

Critical to the success of disciplines such as predictive maintenance or process automation is the ability to connect these machines to the cloud. The majority of machines are not designed with native internet connectivity built in, and certainly not wireless connectivity. They are typically designed to be securely connected to control systems (such as SCADA) which monitor and manage them via fixed cable connectivity.

For machines and devices that could benefit from being remotely connected via a wireless network, the issue of securely bridging the air gap between operational technology (the machine) and IT systems (the cloud) is a major challenge holding back progress.

There is a wide assumption, often true, that many firms overlook security when designing industrial internet of things products. Connectivity products are often sold with old software and glaring holes in their operating systems, which ultimately makes it easier for hackers to get hold of data and sometimes take control of devices. On top of this, customers often fail to implement the proper safeguards that come with the technology. As many as half of employees use the same two or three passwords to access confidential information. The result of these issues is, inevitably, breaches, which in turn make customers skeptical when they examine integrating IoT as part of efforts to automate key business applications. Research by Forrester argued that, for this reason among others, 2017 is likely to see a wide-scale IoT breach.

As a result, it is critical for organizations to find a new framework to deliver secure industrial IoT. The security sector has an important role to play. The high levels of coverage and the potentially damaging results of breaches have helped turn “cyber” into a negatively perceived term. The moment someone questions the cybersecurity credentials of a product, panic ensues. Equally, when someone else claims they can “fix” cyber-issues, those claims are heavily scrutinized by penetration testers from around the globe.

If progress is going to be made, we need to shift this stigma while introducing a better, more secure means of connectivity. Part of the challenge is complexity. For example, a core application of IIoT is predictive maintenance. In order to predict whether a mobile piece of machinery is going to break down, the IoT device must transfer data via the internet back to the customer, who can then resolve the issue. The problem is that the data has to pass through multiple layers and will ultimately require the aid of a network provider. A solution of this type includes multiple levels that need to be secured, making it both expensive and difficult to guarantee safety. As a result, any effort to reduce the cost of devices in this example could leave them more susceptible to compromise by distributed denial-of-service or botnet attacks.

Simpler connectivity could therefore reduce the threat and likelihood of breaches. The common view is that the cloud is the problem; in fact, it is during transmission to the cloud that the majority of breaches happen and information is stolen.

Many existing technologies have tried to prevent breaches by wrapping existing communication channels in security technology. In the home, for example, consumers can purchase network access products that restrict who and what can access devices. The problems these pose in industrial environments are, first, that they can be hacked and, second, that they add complexity. What is required is a means of connection that doesn’t depend on heavy security products. A connection that moves directly between device and server and does not allow for interception is the ideal happy medium.

A potential solution could be USSD (Unstructured Supplementary Service Data). This technology, present in all mobile GSM networks, can be used to provide unprecedented security as there is effectively no “internet” present when connecting a machine or IoT device to a cloud system. It is therefore impervious to internet-related security threats such as botnets, distributed denial-of-service attacks and, more recently, WannaCry.
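As a rough illustration, the device side of a USSD integration can be as simple as driving a GSM modem with standard AT commands (3GPP TS 27.007). The sketch below uses Python’s pyserial; the serial port, service code and payload format are assumptions that depend entirely on the modem and the operator’s USSD gateway.

    import time
    import serial  # pyserial

    PORT = "/dev/ttyUSB0"          # assumed modem device path
    USSD_CODE = '*123*{payload}#'  # hypothetical operator-defined service code

    def send_ussd(payload):
        with serial.Serial(PORT, 115200, timeout=5) as modem:
            code = USSD_CODE.format(payload=payload)
            # AT+CUSD=1 opens a USSD session; the trailing 15 selects the default GSM alphabet
            modem.write(f'AT+CUSD=1,"{code}",15\r'.encode())
            time.sleep(3)  # crude wait for the network's +CUSD: response
            return modem.read(modem.in_waiting or 1).decode(errors="replace")

    # Example: report a sensor reading with no IP stack involved
    print(send_ussd("TEMP42"))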

To ensure future growth and evolution of the sector, removing security as a barrier to applications of industrial IoT is crucial. Arguably, IoT has enormous potential to transform how industry operates, from improving monitoring to simplifying processes. It also presents a significant opportunity for the security sector to innovate and develop simple and secure processes rather than simply securing existing ones. In short, hacking is draining businesses of trillions of dollars, but adopting safe and secure technologies can ensure the future growth of the entire IoT sector.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


August 2, 2017  3:16 PM

Semantic interoperability key to realizing IoT value

Chris Drake Profile: Chris Drake
Data Analytics, Data Management, Internet of Things, Interoperability, iot, IoT analytics, IoT data, Semantic data, semantics, Smart cities, smart city

When it comes to interoperability, the tech industry is well-versed in the benefits it can bring. Despite this, BI Intelligence’s U.S. Smart Home Market report in 2016 found that smart home devices were stuck between the early adoption phase and the mass-market phase due to fragmentation. Fragmentation occurs when the numerous operators and service providers launching IoT services each use different equipment and technology; its well-known drawbacks include overly complex and time-consuming operations, vendor lock-in and reduced innovation, all of which hinder overall progress.

However, these are not the only barriers when dealing with a lack of interoperability in smart systems, especially those deployed on a large scale, for example smart cities.

Why semantic interoperability?

For IoT to deliver true value to consumers, businesses and city planners, the data delivered by smart technology needs to have meaning, so that numerous applications can interpret the data and use it to respond correctly.

This is semantic interoperability — a key factor in the future success of the IoT market. It uses metadata and ontologies to allow different applications to share information that is “meaningful.” Using meta-tagged data ensures all information can be understood and reused. This avoids the need for multiple standalone systems of sensor devices and their applications trying to gather the same data but for different purposes.

To give a simple example, roadside sensors generate various numbers, such as temperature values in Celsius, which might be used for local ice-warning electronic signs. But unless we know what these figures stand for, the information has little meaning. If meta-tagged data is used, though, the user can see what the information represents and what it can be used for. It can also be shared with other apps, for example ones monitoring and forecasting weather. Semantic interoperability is therefore significant and necessary across many smart technology industries. As an increasing number of applications are developed, integration costs will rise if data formats require as much integration work as communication technologies do.
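The contrast is easy to see in data. In the sketch below, the bare number is meaningless on its own, while the annotated reading can be interpreted by an ice-warning sign, a weather forecaster or a maintenance app alike. The vocabulary is illustrative rather than a formal ontology.

    import json

    raw_reading = -2.5  # meaningless on its own: -2.5 what, measured where, and when?

    annotated_reading = {
        "value": -2.5,
        "unit": "Cel",                      # degrees Celsius
        "quantity": "air-temperature",
        "sensor": "roadside-probe-17",
        "location": {"lat": 59.437, "lon": 24.754},
        "time": "2017-08-02T06:30:00Z",
    }

    # Any application -- ice-warning sign, weather forecaster, road maintenance --
    # can now interpret the same measurement without bespoke integration.
    print(json.dumps(annotated_reading, indent=2))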

On a wider scale, consider the thousands of potential data sources which could be found in a smart city. While many of these will generate data to be exploited by only one application, wouldn’t a smart city be even smarter if all of this data could be combined, cross-compiled and reused by many applications?

Semantic interoperability and oneM2M

Semantic interoperability was introduced in oneM2M’s latest set of specifications, Release 2, to allow meaningful, secure data distribution and reuse. Building semantic capabilities into the standard now will allow integration to be significantly easier in the future as the number of devices and applications in use increases.

The oneM2M standard enables the posting of meta-tagged data to a oneM2M resource on a gateway; the gateway notifies interested entities, or the data can be found through semantic discovery.
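In practice, posting such data can look like the following hedged sketch, which creates a contentInstance over oneM2M’s HTTP binding using Python’s requests library. The CSE address, originator credentials and container path are assumptions; consult the Release 2 specifications for the full resource model.

    import json
    import requests

    # Assumed CSE address and container path for the roadside sensor's data
    CSE_URL = "http://cse.example.com:8080/~/in-cse/in-name/roadside-probe-17/temperature"

    headers = {
        "X-M2M-Origin": "CAdmin",                # originator (assumed credentials)
        "X-M2M-RI": "req-0001",                  # request identifier
        "Content-Type": "application/json;ty=4", # ty=4 marks a contentInstance
    }

    body = {"m2m:cin": {"cnf": "application/json",
                        "con": json.dumps({"value": -2.5, "unit": "Cel"})}}

    resp = requests.post(CSE_URL, headers=headers, data=json.dumps(body))
    print(resp.status_code)  # expect 201 Created on success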

Making semantic interoperability a reality

In a small IoT setting, it might not be necessary to attach meaning to what the data represents as it is often implied by apps developed for a purpose. City planners seeking to fully exploit data assets, however, will be greatly restricted without semantic interoperability.

While there will be some initial costs in bringing apps up to speed with semantic interoperability, achieving similar levels of interaction via traditional data integration processes will see costs shoot up exponentially as apps and devices grow in numbers. Information available for multiple uses is also likely to be limited in such a scenario.

With the number of IoT devices increasing every year, cities serious about getting smart know they can no longer rely on traditional methods if their IoT projects are going to deliver true value. Semantic interoperability is just a small part of the standardization, but it will be integral to enabling this new way of working.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


August 2, 2017  12:40 PM

Choosing open source for your IoT platform is smart strategy

Jonathan Marsh Profile: Jonathan Marsh
Internet of Things, iot, IoT platform, Open source, Open source software

You have a great idea for an IoT initiative. Maybe improving your insight into your business operations. Maybe increasing the productivity and satisfaction of your workforce. Maybe building customer loyalty with exceptional experiences. Maybe getting a leg up on the competition with a new digital business model. In any case, selecting your IoT platform is an important choice with long-term ramifications.

The market is awash with IoT platform options today. Some are proprietary platforms, some are cloud-based, some are general-purpose PaaS with IoT features along the edges, and a few are open source. This article is intended to help you think about the options and consequences of your choice — and highlight the strategic advantages of choosing an open source option.

Think long term

The life of a software project gets shorter every year. It’s common for a software package to be obsolete and replaced by a new version every couple of years, and any software older than four to five years is considered a dinosaur. However, hardware devices typically operate over a much longer lifespan. Appliances, automobiles, home and office infrastructure all have expected lifespans measured in decades. How can you assure your customers that the digital components of these devices will enjoy similar lifespans?

With the long term in mind, the problem of lock-in to a specific system and vendor looms larger. Are you confident the platform you choose today will still be available in a decade or two? If not, what are the costs of moving from one platform to another — especially for a diminishing set of legacy customers? Can you opt out of changes in a platform that may have negative impacts on your customers, or require you to invest in costly re-architecture and implementation? What if a platform is discontinued or becomes commercially non-viable for you? What risks to your business could result?

Open source protects you in the long term. It is licensed perpetually and gives you the possibility of locking a system down in a solid working state for an extended legacy support period. Freedom to access the platform’s source code allows you long-term freedom to support the system yourself or to seek out alternative vendors for support.

Think retaining control

As IoT supports your digital transformation, your digital assets — software, systems and data — will increasingly represent the core competitive advantage of your business. Handing control of your core competencies to any third party will increase risks and limit your future options. Your IoT platform is likely to become one of the core business assets that you should own instead of outsource.

With open source (especially a permissive license such as Apache License 2.0), you have many of the same rights that an owner does — you can use, adapt, evolve, make derivatives, develop intellectual property around, redistribute, commercialize, relicense and support the platform. (Note: About the only thing the Apache 2.0 license requires of licensees is to maintain copyright and other notices during redistribution.) Building your core systems on open source retains strategic ownership-like advantages that proprietary licenses or cloud service terms cannot.

Today you may be a “user” of the IoT platform to build your connected product. But tomorrow you might open up your platform to a wider ecosystem, and even evolve into a “provider” of an IoT platform that others can use. Open source licensing terms preserve your ability to commercialize and productize your product as a platform.

“Owning” your platform can free you from:

  • Technical divergence from your platform provider. What if the provider discontinues the platform, or makes unilateral changes to the features, capabilities or qualities of service that impact your ability to serve your customers? What if you find a need to customize the platform in unique ways that the platform provider is unwilling to support?
  • Commercial divergence. What if the provider changes the commercial terms in a way that has a negative impact on your business model, or is incompatible with the long-term assurances you have made to your customers? Or what if the commercial terms don’t change, but your business reality does?
  • Strategic divergence. What if the provider becomes strategically problematic as a result of adverse acquisition, changing market position, reputation or regulation?

Open source is specifically designed to give you options and retain your independence in the face of changes large and small, technological or commercial, incidental or strategic.

Think cost

Cost is relative. Commercial open source is generally thought of as the most cost-effective option, but there are many circumstances that can affect ROI. Many of these factors can change over time. Here are some examples:

  • Cloud-hosted options can accelerate early prototyping and development, and offer great agility for projects in early and iterative stages — at low entry costs. However, a cloud product offering per-device pricing that is attractive when the number of devices is low can quickly scale to unreasonable levels as the number of devices soars.
  • For organizations lacking operations expertise needed to manage a scalable, highly available on-premises system, a cloud product can be attractive. However, for organizations that already have or find it cost-effective to build the capacity to self-manage open source deployments, on-premises software may offer lower long-term costs.
  • IoT platform features added (presumably at low cost) to an existing IaaS platform might be very attractive when positioned as an incremental cost. However, in the long term, mixing IaaS and PaaS layers limits your ability to migrate to other IaaS platforms or into your own data center and limits your negotiating power. Clean architecture layers allow more possibilities to adapt to take advantage of the lowest cost at each layer.

Migration between different platforms can be quite expensive and time-consuming. Migrating between different deployment forms of the same platform (cloud-hosted or self-hosted open source) is much easier than migrating to a completely different platform.

The best balance between these models is achieved with full fidelity migration between public cloud, managed cloud and on-premises or self-hosted systems. Under this model you retain your ability to adapt as needed to the best price appropriate for your current conditions.

Think security

Devices have historically followed a “security by obscurity” approach, and the revelations of many attacks through remote devices show that this approach is insufficient. Open source hardware and software puts many eyes on the design and implementation, making early detection and prompt remediation of security vulnerabilities possible.

Vendors offering commercial support for open source can be engaged at reasonable cost to provide support services and to maintain vigilance over security threats and mitigations.

Think contribution

Open source gives you the option to contribute back to the platform, which can have valuable benefits for your business. It ensures you can obtain specific features of importance to you, on your own timeline, under a sweat equity model. By contributing a particular feature of importance to you back into the code base, you are relieved of long-term maintenance of the feature, which gets picked up, improved and maintained by the open source community.

Network effects are at play in open source communities. Each contribution helps build a vibrant ecosystem that can benefit your business. Android is a great example: As a fully functional open source device operating system it allowed many device manufacturers to come up with creative hardware designs and fostered innovation worldwide. The more contributors, the more possibilities emerge in the platform; for instance, the number of devices it supports. You can do your part to protect the vibrancy of the platform with a strategic commitment to participation.

Open source offers benefits closer to home as well. It is well-known to attract and retain quality employees — many top engineers see use of and participation in open source communities as an exciting benefit and good for their careers. They often seek out employers who support participation in open source communities. These employees can help your organization adopt open source distributed development and governance practices that have proven effective at spurring collaboration and sharing that lead to outstanding efficiencies and innovation.

Contributing to open source is viewed as a form of corporate social responsibility, increasing stocks of goodwill among customers and the industry.

Build your future with open source

Open source will be an important force in the internet of things. Many devices already run open source operating systems such as Linux and Android, and pairing these device platforms with an open source IoT platform for device management, security and analytics offers natural synergies. Deep visibility into the code, the development activity and roadmap, and the security features can provide insights that improve your decision-making power.

Business and technical leaders would be well-served by defining a strategy with respect to open source, and by seeking out open source IoT platforms for proofs of concept, for evaluation matrices and for any IoT project of strategic importance.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


August 1, 2017  1:57 PM

Internet of energy: Extracting value from data silos in utilities

Stuart Gillen Profile: Stuart Gillen
Artificial intelligence, Data Management, energy, Energy Consumption, Energy efficiency, Internet of Things, iot, IoT analytics, IoT data, Machine learning, Service providers, utilities

In the industrial world, and specifically the energy sector, the number of connected devices, sensors and machines is continuously growing, resulting in the internet of energy, or IoE. IoE can be broadly defined as the upgrading and automating of electricity infrastructures, making energy production cleaner and more efficient, and putting more power in the hands of the consumer.

Given the vast amount of data the energy sector generates and the increasing number of sensors added, it is the perfect environment for machine learning applications. Artificial intelligence (AI) excels at finding subtle patterns in data sets of all shapes and sizes, particularly under complex or changing conditions.

Although data within IoE is growing at exponential rates, much of it is traditionally siloed across business units (generation, transmission and distribution, energy trading and risk management, and cybersecurity). Extracting the wealth of data from each silo and putting it to work is necessary to promote a better IoE experience and to realize the benefits of machine learning. Artificial intelligence capabilities can be incorporated to gain insight from all the data uniformly, allowing business units to transform into a collaborative system.

Generation: Prescriptive maintenance of turbines

Generation, the first major silo in the energy sector, is largely dependent on the work of turbines. Turbines consist of thousands of moving parts, and even the smallest disturbance can create major problems, causing unscheduled downtime, loss of power, safety concerns and other issues.

Applying AI and machine learning techniques to prevent unplanned downtime and catastrophic breakdowns is revolutionizing how utility companies operate. The standard approach, in which subject matter experts (SMEs) develop static, first-principle models, places a tremendous burden on organizations to maintain and update those models. Furthermore, their static nature means operators can only view the steady-state operation of turbines, whereas the meaningful data lies in transient events like startups and coastdowns.

Transient conditions are where critical issues first materialize, but they are challenging to monitor because they occur over indeterminate lengths of time. Where a static model-based system is unable to solve this issue, an AI-based technology can. An artificial intelligence approach can start analyzing data and providing insights on day one, and continue to improve upon its own accuracy and effectiveness by learning from SME input.
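As a minimal sketch of that idea, an unsupervised anomaly detector can learn what normal telemetry looks like from historical data and flag departures, including during transients, without a hand-built first-principle model. The feature set, scales and thresholds below are illustrative assumptions, not real turbine data.

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)

    # Historical telemetry: vibration (mm/s), bearing temperature (C), shaft speed (rpm)
    normal = rng.normal([2.0, 65.0, 3000.0], [0.3, 2.0, 50.0], size=(5000, 3))

    model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

    # A startup transient showing abnormal vibration for its shaft speed
    suspect = np.array([[5.5, 71.0, 1200.0]])
    print(model.predict(suspect))        # -1 means flagged as anomalous
    print(model.score_samples(suspect))  # lower score = more anomalous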

Transmission and distribution: More than just smart meters

For the second silo of transmission and distribution, AI is able to tackle much larger problems. While smart meters and end user control of home appliances have generated excitement, they are not the most challenging big data problems being solved by machine learning.

Three specific areas in transmission and distribution where AI is playing a key role are:

  1. Energy disaggregation
  2. Power voltage instability monitoring
  3. Grid maintenance

In these areas, the collection, ingestion and action upon the data have created efficiencies in expenses and operations for companies using machine learning and AI technologies, as well as for their customers.

Energy disaggregation requires machine learning because thousands of energy “signatures” must be analyzed to find patterns of usage. An analysis of energy signatures can flag suspicious consumption values caused, for example, by physically or digitally manipulated devices, sophisticated theft or meter malfunctions.

The second area, power voltage instability monitoring, faces an explosion of dynamic data surrounding minute instabilities in which human analysis falls short. Researchers can utilize machine learning techniques to identify voltage instabilities, thus preventing brownouts and blackouts on the grid.

The last area of transmission and distribution where AI is playing a key role is grid maintenance. While many companies are still struggling to use the data they are collecting, a machine learning algorithm can use the data or features to classify and ultimately predict failures well in advance. Because machine learning algorithms can automatically break features down into additional data and analyze them at machine speed, previously unseen correlations in the data are leading to new discoveries.
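To make the first of these areas concrete, here is a toy disaggregation sketch: it detects step changes in an aggregate meter signal and matches each step against known appliance signatures. Real systems learn thousands of signatures statistically; the wattages and thresholds here are invented for the example.

    import numpy as np

    SIGNATURES = {"fridge": 150.0, "kettle": 2000.0, "water heater": 4500.0}  # invented watts

    def disaggregate(aggregate_watts, tolerance=100.0):
        events = []
        for t, delta in enumerate(np.diff(aggregate_watts), start=1):
            if abs(delta) < 50:  # ignore noise below 50 W
                continue
            # nearest known signature to the magnitude of this step change
            name, watts = min(SIGNATURES.items(), key=lambda kv: abs(kv[1] - abs(delta)))
            if abs(watts - abs(delta)) <= tolerance:
                events.append((t, name, "on" if delta > 0 else "off"))
            else:
                events.append((t, "unknown", "?"))  # candidate for theft/malfunction review
        return events

    meter = np.array([200, 200, 2200, 2200, 200, 4700, 4700, 200], dtype=float)
    for event in disaggregate(meter):
        print(event)  # (2, 'kettle', 'on') ... (5, 'water heater', 'on') ...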

Cybersecurity: The modern battleground

The third major data silo in utilities is cybersecurity. The recent and continuous onslaught of attacks on critical infrastructure makes the need for new cybersecurity methods vital. An AI offering can identify, categorize and remediate a variety of threats, including loss of personally identifiable information, zero-day malware and advanced persistent threat attacks.

To a mathematical algorithm, there is little difference between the aforementioned data and cybersecurity data. All input, regardless of source (a vibration sensor or a firewall log, for example), is simply a piece of information with unique patterns to an algorithm.

To combat the cyber front of industrial threats, an artificial intelligence product can automate the threat research process, prioritize threats based on confidence and display corroborating evidence to the analyst, significantly reducing both time to threat remediation and overall risk.

Energy trading and risk management

Energy trading and risk management is the final data silo in the energy sector. In the highly competitive and regulated utility business, there is a clear link between the company’s bottom line and forecast accuracy and reliability. If new techniques can provide more accurate forecasting, utilities can begin to offer better pricing to their customers.

AI techniques are providing insight into this process. With thousands of features from hundreds of sources, there are infinite ways to combine and correlate information. Looking for subtle, transient movements of price data on an hourly or even a second-by-second basis with millions of combinations is where AI excels.

Because utility companies need to buy oil, gas, coal, nuclear fuel and electricity, they are constantly at the mercy of volatile commodity prices. For this reason, utilities are using AI techniques to develop methodologies for market and credit risk aggregation.

With improvements in the sharing of data from data silos, the utilities industry can reap the wealth of new knowledge. From prescriptive maintenance to energy trading to cybersecurity, analytics will play an important role in how energy is produced and provided to consumers long into the future. As adoption increases, AI technologies will continue to learn and adapt, providing more value in the internet of energy.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


August 1, 2017  11:15 AM

Mirai and Amnesia: Early lessons in attacks against IoT, part one

Christopher Budd Profile: Christopher Budd
Botnet, cybersecurity, DDOS, Internet of Things, iot, security in IOT, WannaCry

Since the early days of the internet of things, those of us who work in the world of vulnerabilities and threats have been warning about the risks associated with IoT.

When the Mirai botnet attacks came in late 2016, many felt that IoT attacks were finally here and started looking at the past for parallels. We didn’t have to look far: Over sixteen years after the distributed denial-of-service attacks that took down Yahoo, Fifa.com, Amazon.com, Dell, E-Trade, eBay and CNN in February 2000, here was another massive DDoS attack.

These early attacks came at the beginning of what turned out to be years of large-scale attacks against PCs. So the logical question is: Does Mirai represent the same thing? Are IoT attacks here and are we looking at the beginning of another era of large-scale attacks?

At first glance, this would look to be the case. After all, one thing that enabled the large-scale PC attacks was the lack of truly effective patching against vulnerabilities. It’s notable that major attacks like Code Red, Nimda, Blaster, Sasser, Zotob and Conficker all attacked vulnerabilities that patches were available for when the attacks hit. When we look at IoT, and the fact that in many cases vulnerable devices will never be patchable, let alone patched, it’s reasonable to think that this problem will be even worse. Add to this the sheer scale of IoT compared to PCs in the early 2000s, and not only does it seem reasonable to conclude that IoT attacks will be like those that we saw in the PC era, it already seems like a foregone conclusion.

And the specifics of the Mirai attacks seem to support this conclusion. One thing that made everyone take notice of Mirai was, again, the sheer scale. The Mirai attack against Brian Krebs’ site was clocked at up to 620 gigabits per second of network traffic, and a follow-on attack against French web host OVH peaked at 1.1 terabits per second. As the world was reeling from these attacks, the Mirai source code went public and everyone braced for the worst.

But then something funny, and important, happened.

Nothing.

In the months since Mirai there have been no additional follow-on attacks. The security press has moved off IoT altogether, focusing in the spring and summer of 2017 on WannaCry and then Petya. You’d be excused if you happened to miss Mirai last fall and thought that we were still waiting for IoT attacks to begin.

Much like the dog that didn’t bark in Conan Doyle’s “Silver Blaze” Sherlock Holmes story, the post-Mirai non-events tell us a lot about what the world of IoT attacks on the internet may look like. And it’s looking less dire than it did during the PC-era internet.

Check back to this column for part two in the series, which will take a closer look at the Amnesia botnet as another recent example of a large-scale IoT attack that can be mined for lessons in securing IoT.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


July 31, 2017  3:58 PM

Hybrid OT: A new role emerging within industrial IoT companies

Jason Andersen Profile: Jason Andersen
IIoT, Industrial IoT, information technology, Internet of Things, iot, IT convergence, IT professional, IT workforce, Operational technology

In my conversations with industrial companies looking to start or accelerate their journey toward the industrial internet of things, I’ve begun to see a phenomenon among the ranks of industrial technologists that’s not all that different from Darwin’s theory of evolution. Adaptation is the key to this theory, something industrial technologists need to do well as their environment is changing around them.

In the past, there has been a clear divide between IT teams, which control the data center, and operational technology teams, which are responsible for the care and feeding of operational automation systems. These two distinct teams had different skill sets, backgrounds and priorities. Today, in order to bridge the gap that has traditionally separated the two, a new breed of what I like to call “hybrid OT” professionals is emerging. IT and OT responsibilities and skill sets are converging, making the individual who can do both a valuable technologist.

What is causing this shift? There are two big things I see driving this change:

New responsibilities breed new roles — As more computing power and data collection have made their way to the edge of industrial networks, a new combination of skills is required to manage these assets (historically the domain of OT), giving “birth” to the IT/OT hybrid. We saw a similar shift with the rise of cloud computing: developers struggled to get IT to respond to their needs, so they turned instead to public cloud services for answers. As developers took on the responsibility of securing the IT infrastructure needed to run their applications in the cloud, the role of DevOps was born.

A generation ready for 21st century challenges — Many OT professionals who have been in the industry a long time are now approaching retirement, and a new generation is taking their place. This generation of younger digital natives is not intimidated by technology — they were in fact raised on it. They see the potential of IIoT and will look to realize it as they push intelligence out to the edge and leverage data and analytics in new ways.

The most forward-looking industrial enterprises are the ones that see the value in hiring professionals who are just as comfortable working with servers as they are with machine tools, packaging lines, pumps and valves. Enterprises actively recruiting these hybrid OT professionals are attracted to skills that will be valuable in managing both IT and OT technologies. Whatever their background — IT, data science, industrial engineering — these individuals share a passion for the intersection of technology and industrial operations.

New expectations for the technology they use will also come along with the role. System availability has become an absolute necessity for business continuity and is something hybrid OT professionals will expect out of their systems and technology vendors. Similarly, these hybrid OT professionals see the value in the data produced at the edge and will lean on technology vendors to help make data protection a top priority for the enterprise.

When will we see this new breed emerge? The answer is that the evolution will happen soon — much more quickly than it occurs in the natural world. Over the next two to three years, I believe the industry will see a major influx of this hybrid breed.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


July 31, 2017  12:09 PM

Drone delivery is buzzing — but where does it go from here?

Roei Ganzarski Profile: Roei Ganzarski
Amazon, delivery, drone, Drones, FedEx, Internet of Things, iot, IoT hardware, UPS

With the drone industry netting $8 billion last year and consumer spending for on-demand services topping $57 billion, the implementation and deployment of drone delivery fleets is becoming more of a reality each day. From Amazon’s futuristic drone delivery tower proposal to parachuting packages on the horizon, we are nearing a complete delivery transformation. While these innovations will need to be fine-tuned as the commercial drone industry flourishes, several factors, such as operational inefficiencies, continue to work against the Amazons and Ubers of the world, hindering supply chain innovation.

The mainstream drone delivery timeline

We have already seen the start of drone deliveries, but it likely won’t become truly widespread and accepted until 2025. Over the next seven years, we’ll begin to see an overall increase in this new vehicle for deliveries, but a large portion of these test runs, followed by operational deployments, will be heavily focused in rural areas where the safety risks are smaller and logistics are much simpler to manage. Contrary to popular belief, dense metro areas present numerous challenges and risks — think traffic, privacy issues, power lines, high-rise buildings, sudden wind gusts and crowded streets below.

Instead of thinking about smart urban cities, we should shift our focus, at least for drones initially, to smart rural areas. It’s much more likely we’ll see drones used to deliver common goods like food and medication to remote homes and offices than to a busy suburb. Instead of a consumer driving over an hour one-way into town to pick up a prescription, he could choose to have it delivered by drone. Autonomous vehicles and drones could, and should, even partner for optimal efficiency while delivering packages — companies like UPS and Mercedes are already testing a self-driving van and drone combination as a solution to the ultimate rural delivery challenge. Meanwhile, cities will shift their attention to small and less risky sidewalk bots to help enable the “life on-demand” luxury in metro areas.

Implementation challenges

Before these services can become a reality, we need to start addressing the inherent challenges to implementation. While there are several concerns that need to be addressed, the ones we should be thinking about sooner rather than later relate to the safe deployment of these technologies. The obvious safety element everyone is looking at, which I am sure will indeed be addressed, is that of the single drone: How will it fly safely, avoid obstacles, carry its packages and so forth? However, it is the deployment of a fleet of drones I am more concerned about. Before they can become commonplace, live operational testing is mandatory to ensure we can deploy several drone fleets without overcrowding the airways — a scenario that could completely erode the potential benefits these technologies present, and moreover create a potentially unforeseen safety risk.

To adopt this technology and do it right, we need to start thinking about a higher-level network of connected drones — no matter the parent company. This means taking a look at how they should be deployed, managed and used so that not only individual consumers but society as a whole can experience the benefits of these technologies. Whenever a new method is introduced into everyday use, a great deal of strategy needs to go into thinking about which services are implemented where.

Who will be the first adopters?

One benefit drones exhibit is the lack of overhead required compared with traditional delivery methods like trucks and airplanes. A large-scale delivery model requires an enormous amount of capital investment, including drivers, trucks, space to store equipment, regional depots and much more, in order to be successful. This presents a substantial barrier to small competitors trying to enter the market, even if their model is more efficient than the delivery giants’. Drones, on the other hand, are less expensive to buy, store, maintain and operate, and because of this we’ll likely see many small companies launch drone delivery services. Because of the technology’s flexibility, there is an opportunity for both corporate giants and mom-and-pop shops to operate in this industry successfully. This in itself poses an added risk of very large numbers of drones all flying around the same airspace in an uncoordinated fashion. Inefficiency and risk galore.

Companies like FedEx and UPS will also look to break into the drone business in order to stay competitive. As drone technology improves and new businesses emerge, it’s likely drones will slowly begin to replace some of the last-mile routes traditionally fulfilled by trucks and vans, resulting in increased efficiency across the board. Just think: if one delivery in a driver’s route is three miles out of the way of every other stop, that’s six miles round trip for a single package. The time spent driving to the delivery destination and back is not only an inefficient use of the employee’s time, but a costly waste of company resources (assuming, of course, the package is small enough to be delivered by a drone instead). In the future, delivery vans will likely be equipped with drones to handle one-off package deliveries, optimizing the driver’s daily route while simultaneously reducing operating costs. In combination with their great distribution networks and infrastructure already in place, FedEx and UPS will have a huge leg up on potential competitors looking to enter the drone arena.

Looking ahead: The future is buzzing

Look for Amazon to continue to push the envelope very publicly, but don’t be shocked to see Wal-Mart come out with a surprise from left field. It has the money; it just has to find the right focus and partner — I think it will. I also anticipate seeing brand-new delivery companies emerge and try to conquer the space, only to eventually be bought out by delivery giants such as UPS and FedEx trying to maintain pace with Amazon.

At the end of the day, there are two technologies that are going to be required to make this all happen, and happen successfully to the benefit of the consumer and service provider: a safe and effective drone at a price point that is affordable, and dynamic network-level, multimodal scheduling software that will enable the service provider to efficiently deploy a fleet of drones and integrate them with other resources.

Finally, don’t limit your thinking to the delivery of physical goods. Drones will also be used to deliver data. Insurance companies could rapidly deploy a drone to a car accident scene (especially that of a driverless car) or a home disaster to take photos from multiple directions and collect data in near real time, increasing accuracy. That same drone could deliver information rapidly to emergency services, such as the police and fire department, to improve treatment quality and survival rates.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


July 28, 2017  1:15 PM

The internet of things and cybersecurity vulnerabilities

Brian Berger Profile: Brian Berger
#eHealth #Healthcare IOT #Wearables #wireless medical devices, Internet of Things, iot, iot security, Ransomware, security in IOT, smart home

The internet of things is defined as the interconnectivity or networking of devices, sensors and electronics that communicate through software. IoT layers new data, access, connectivity and monitoring onto the computing infrastructure we already know, and that change requires a fresh cybersecurity approach. We tend to add technology to the existing fabric of our lives without considering how the bad guys will attack. We are seeing IoT in our homes, automobiles, food safety, medical devices, critical infrastructure and manufacturing — just to name a few.

Let’s talk about our homes and us as consumers of IoT first. We have access to some cool and innovative technologies at home. A favorite is Amazon’s Alexa digital assistant. Alexa can turn on lights, change the temperature of a thermostat, change watering days and times on your irrigation controller, manage webcams, and turn the television on and off. All this is amazing, but it raises the question: Have we opened ourselves up to more vulnerabilities at home? An illustration of webcam vulnerability was widely seen in the distributed denial-of-service attack on Dyn in late 2016.

Medical devices are just as vulnerable, now that blood pressure cuffs, glucometers, insulin pumps, pacemakers, ICU real-time monitors and many others are connected to the internet. Home healthcare uses wearables for status monitoring and medication reminders. All of these connected devices have been purpose-built for function, with limited security and data protection. We have seen hacks into insulin pumps that manipulate dosing. And as serious as such a personal attack is, the threat to the healthcare systems and records reachable through these devices, all of which hold personally identifiable information, is just as grave.

Lastly, manufacturing sensors and devices are a common target because they are unmanaged. As seen with Petya, NotPetya and WannaCry, unmanaged devices have been the vector for spreading ransomware across networks. Attackers look for the easiest entry point, and the sensors of unmanaged IoT devices have become active targets. Manufacturing under government contracts has been a key target, and supply chain SMBs are now subject to mandatory compliance guidance. Some of the most critical aerospace designs have been stolen through cyberattacks, with significant effects on our national security as well as on the economy through programs lost by these smaller manufacturers. In food safety, meanwhile, monitoring for and preventing agroterrorism is paramount to protecting our national food supply.

What should we do? The list of actions remains very similar. Make sure no device is left at its default settings (i.e., change passwords) — this is a typical flaw in SMB as well as consumer devices. Verify all devices and sensors are managed and monitored. Properly segment your network: create internal, guest and IoT networks at a minimum. Other helpful parts of a cybersecurity program include updating firewalls, securing remote access, reviewing security configurations, applying operating system updates and patches, training staff, improving security policies and tightening change control procedures.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


