IoT Agenda


September 7, 2017  4:12 PM

IoT: Another opportunity to build in security from the start

Gorav Arora Profile: Gorav Arora
CIA, identity, Identity management, Internet of Things, iot, iot security, IT security, security in IOT

According to Gartner, worldwide spending on IoT security will reach $547.2 million in 2018 and $840 million by 2020. While IoT may be a game changer in many respects, from a security perspective, the game actually changes very little. At its most basic level, security in an IoT system is about having high assurance that the data is protected at all times and originates from devices which are trusted.

The basic fundamentals of information security include confidentiality (keeping things secret), integrity (keeping things trustworthy), availability (keeping things available when they need to be accessed), accountability (someone is responsible for security) and auditability (keeping verifiable records of the interactions in the system). Because IoT is new, there is a tendency to overthink things and to look for novel security frameworks. However, these fundamentals hold true for IoT as well. It may just be that the tools used to execute them are different, owing to the ways IoT differs from the systems of the past.

The hardest problem in any data transaction is verifying the identity of the parties involved. But once the identities are trusted, everything else is just accounting. By accounting, I mean that we are able to follow a procedure to complete the transaction (which can be anything from updating a field in a database to connecting a rider with a driver in a ride-sharing app). The procedure itself may not be easy, but it's not nearly as hard as establishing the identity of the transacting parties. To establish identity reliably, you need a trust mechanism. Since trust cannot be established in isolation, a chain of trust across the IoT ecosystem is needed.
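In concrete terms, that chain of trust usually rests on device certificates signed by a manufacturer or platform authority. As a minimal sketch (assuming RSA-signed X.509 certificates and the Python cryptography library; a production check would also validate expiry, revocation and the full chain), verifying that a device certificate was issued by a trusted CA might look like this:

```python
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import padding

def device_is_trusted(device_cert_pem: bytes, ca_cert_pem: bytes) -> bool:
    """Check that the device certificate was signed by the manufacturer CA."""
    ca_cert = x509.load_pem_x509_certificate(ca_cert_pem)
    device_cert = x509.load_pem_x509_certificate(device_cert_pem)
    try:
        # Verify the CA's signature over the device certificate body.
        ca_cert.public_key().verify(
            device_cert.signature,
            device_cert.tbs_certificate_bytes,
            padding.PKCS1v15(),
            device_cert.signature_hash_algorithm,
        )
        return True
    except Exception:
        return False
```

Only after an identity check like this succeeds should the rest of the transaction, the accounting, proceed.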

When enterprises started moving to the cloud, many IT professionals noted that cloud offered an opportunity to build security in, as opposed to bolting it on. This gave rise to the “secure by design” philosophy, where security was part of the blueprint that built the systems. In some ways, this is true for IoT as well. By applying the right security technologies to the IoT ecosystem and using a security-first mindset, we can establish trust and security from the ground up. This will ensure the next generation of connected devices can be used securely and fulfill their potential.

Unfortunately, right now companies have little incentive to future-proof their IoT products. Time to market, cost and a lack of security expertise are the predominant forces shaping these technologies and services. It may seem like an overwhelming challenge, but it does not have to be this way. Stay tuned for my next blog, where I'll outline the building blocks of a secure IoT.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

September 7, 2017  1:36 PM

Bridge OT and IT worlds through IoT

James Kirkland Profile: James Kirkland
convergence, GATEWAY, Internet of Things, iot, IoT hardware, IT, IT convergence, Operational technology

One of the constant themes that any practitioner in IoT deals with is how to bridge the traditionally separate worlds of OT and IT. I recently drove to New York to visit a friend. During the drive, I encountered a lot of bridge construction, resulting in ample time to reflect on the subject. One of the most interesting projects was a new bridge to replace the Tappan Zee. Incorporating smart technologies, this bridge should mean less congestion and frustration for motorists. It will have more lanes for traffic and a state-of-the-art traffic monitoring system. Due to open this summer, it shows how far construction projects have come since the opening of the Tacoma Narrows Bridge, whose catastrophic demise earned her the nickname "Galloping Gertie."

Creating a combined OT and IT environment can be similar to engineering a bridge that's both safe and effective. In the case of the Tacoma Narrows Bridge, the desire to meet a particular goal blinded the architects to the real requirements of the situation. The narrow focus on flexibility, and the failure to take vertical wind movement into account, produced a design that was doomed to fail.

[Image: bridge architecture. Photo via VisualHunt.com, CC0]

OT and IT managers alike should take this as a lesson when combining their environments. They need to make sure there are no inherent flaws in the design of their combined IoT projects. To achieve the right architecture, one needs to understand the conditions and requirements of the environment and design to those requirements. Don't pick a technology and try to force it onto the situation. Let the requirements drive the selection of the technology. In a combined OT/IT IoT environment, those requirements must include security, reliability and communication availability.

Carefully navigating treacherous waters

As pointed out in my previous blog, "How OT managers can improve their batting average," the information derived through IT working in concert with OT systems offers businesses advantages never before realized. But, in spite of the benefits, companies hesitate to integrate because of the significant challenges involved. As OT and IT are often handled by separate organizations with different requirements, backgrounds and skill sets, an effective merge requires careful analysis and planning. Also, operational technology often runs vital aspects of an infrastructure. Therefore, security, reliability and availability are key issues that must be addressed.

Historically, OT and IT were handled by different organizations, each with distinct goals, budgets and strategies. Their approaches often differ based on prevailing mindsets:

OT organizations:
  • Implement and support highly specialized control systems for nonstop availability of critical applications. Many control systems are unable to be taken down even for maintenance.
  • Environments are often highly distributed or geographically dispersed.
  • Often need to deal with distinct regulatory or labor challenges.

IT organizations:
  • Support large-scale, complex systems, often relying on standards-based networking and computing to connect systems from multiple vendors.
  • Tend to be more centralized, focused on a specific data center.
  • May have virtualized or cloud implementations already.

The goals of supporting nonstop systems and highly dynamic, heterogeneous, multivendor environments may seem mutually exclusive. The sharing of resources brings with it security risks. Scalability, availability and security loom as major challenges to a converged OT/IT environment. But while it is challenging, it is not impossible.

The proper foundation for an OT/IT span

A hierarchical intelligent systems architecture, tiered to provide high modularity and autonomy for components, addresses the stringent needs for scalability, availability and security in IoT environments. Using this layered architecture, OT and IT can be successfully merged to deliver a unified IoT architecture that takes advantage of their combined proficiency and knowledge.

As depicted in the figure below, this type of layered architecture is composed of distinct device, gateway and data center or cloud tiers. The device tier includes intelligent endpoints, such as IP-enabled meters, sensors, displays and actuators. The data center or cloud tier includes smart applications and services that manage and automate industrial control processes and workflows. The gateway tier acts as an intermediary between the device and data center or cloud tiers.

[Figure: Tiered IoT architecture spanning device, gateway and data center/cloud tiers]

Bridging IT and OT with IoT gateways

IoT gateways are the cornerstone of the converged OT/IT architecture. Specifically designed to close the gap between devices in the field and centralized business and enterprise applications, IoT gateways optimize intelligent system performance by gathering and processing real-time operational control data at the network edge. In this model, data from the devices can be controlled and secured, and data center-level computing can occur closer to the edge. This opens up the possibility of implementing real-time analytics via machine learning, providing insights promoting innovation and business efficiencies. IT at the edge affords the compute and communications capabilities required to process, analyze and produce insights in real time. And it does so where the information is needed most, with delivery immediately back to the point of actuation.
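To make the gateway's role concrete, here is a minimal, illustrative sketch of edge processing (the sensor IDs, threshold and callbacks are assumptions for the example, not part of any particular product): raw readings are buffered and acted on locally, and only periodic summaries are forwarded to the data center or cloud tier.

```python
import json
import statistics
import time
from collections import defaultdict

WINDOW_SECONDS = 60        # how often summaries are pushed upstream
TEMP_ALARM_C = 85.0        # illustrative local-action threshold

buffer = defaultdict(list)  # sensor_id -> readings in the current window

def on_reading(sensor_id: str, temperature_c: float, actuate):
    """Handle one raw reading at the edge, acting locally when needed."""
    buffer[sensor_id].append(temperature_c)
    if temperature_c > TEMP_ALARM_C:
        # Real-time response without a round trip to the cloud.
        actuate(sensor_id, "shutdown")

def flush_window(publish_upstream):
    """Forward only a compact summary to the data center/cloud tier."""
    summary = {
        sid: {"count": len(v), "mean": statistics.mean(v), "max": max(v)}
        for sid, v in buffer.items() if v
    }
    publish_upstream(json.dumps({"ts": time.time(), "summary": summary}))
    buffer.clear()
```

The same pattern leaves room for richer edge analytics, such as the machine learning mentioned above, to run where the data is produced.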

Keep traffic moving over the bridge

The internet of things is transforming OT with new IP-based operational control systems that can help businesses improve costs and increase automation. By aligning and unifying OT and IT infrastructure, systems and practices, enterprises can improve efficiency and optimize business decision-making. The challenge to integrating these environments can be met when businesses address both the technological and organizational requirements. A tiered intelligent systems IoT architecture can address many of the technological requirements, particularly in the areas of scalability, availability and security.

But bridge construction doesn't happen alone, and neither does this work. At Red Hat, we work with an ecosystem of partners, such as Eurotech and others. With them, we build solutions that can help enterprises align their OT and IT, and begin to transform raw data into meaningful, actionable information that can increase productivity, simplify decision-making and improve business results.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


September 7, 2017  11:08 AM

Making the final leap from IoT insights to decisions

Mark Troester Profile: Mark Troester
augmented reality, Data Analytics, Internet of Things, iot, IoT analytics, IoT data, Machine learning, Predictive Analytics, Predictive maintenance

Businesses looking to take advantage of the internet of things are deploying a wide array of digital devices and sensors, which gives them access to an unprecedented amount of raw data. However, for many businesses (especially small to midsize companies), managing all this data and capitalizing on it to the fullest extent can be difficult.

In my last article, I talked extensively about how businesses can overcome this challenge through a next-generation approach that applies meta-learning to machine learning — a process in which a machine is taught to automatically perform the time- and labor-intensive steps that a data scientist has to perform to build highly accurate predictive models. This would enable businesses to turn their data into insight without an army of data scientists by automating the data science lifecycle.
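As a rough stand-in for that idea (this is ordinary automated model and parameter search with scikit-learn, not the meta-learning approach described here, and the candidate models are arbitrary examples), automating even part of the model-selection work looks like this:

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

# Candidate estimators and the parameter grids to search over.
CANDIDATES = [
    (RandomForestClassifier(), {"n_estimators": [50, 200], "max_depth": [5, None]}),
    (LogisticRegression(max_iter=1000), {"C": [0.1, 1.0, 10.0]}),
]

def best_model(X, y):
    """Fit every candidate with cross-validated search; keep the best one."""
    searches = []
    for estimator, grid in CANDIDATES:
        search = GridSearchCV(estimator, grid, cv=5)
        search.fit(X, y)
        searches.append(search)
    return max(searches, key=lambda s: s.best_score_).best_estimator_
```

A true meta-learning system would also automate feature preparation and learn which candidates to try from past datasets, but the goal is the same: fewer manual steps between raw data and a validated model.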

Acting on predictions

However, that’s only one step in the process. Enterprises can use their analytics to derive the most accurate predictions in the world, but if they can’t act on these insights in a timely fashion, then they aren’t getting the most out of them. In that case, all the time spent preparing data, creating and validating models and putting these models into production will be for naught.

It’s not just about acting on the prediction either, but creating flexible business processes to facilitate turning predictions into actionable decisions. Take one common use case for predictive analytics: predicting equipment downtime using IoT data. Simply nailing down the model to predict machine failure is a comprehensive task in itself. But what if a company deploys thousands of machines across the world, with some operating in remote locations in the field? The company in question also needs business processes dedicated to acting on predictions in every scenario to ensure the problem can be addressed in time, without any disruption.

In this instance, the enterprise not only needs a predictive result, it also needs the predictive result to trigger a specific action. This may be as simple as sending a notification to a field service agent. But increasingly, the process is growing more complex — in the near future, it may involve sending the notification to the service agent, along with repair instructions that can be displayed in augmented reality. The business application logic could also be integrated into the inventory management system to account for the parts used for repair and into customer-facing functions to reschedule service around the repair or shift service to another capable machine to keep up with production. As technology continues to improve, enterprise applications will be able to take on a bigger role in facilitating different actions — but only if businesses can develop the application with these capabilities in the first place.
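A hedged sketch of what that trigger might look like in application code (the integration objects, threshold and field names are illustrative assumptions, not a specific product's API):

```python
from dataclasses import dataclass

@dataclass
class FailurePrediction:
    machine_id: str
    probability: float
    hours_to_failure: float

def agent_for(machine_id: str) -> str:
    # Lookup stub; a real system would query a field-service roster.
    return "on-call-field-agent"

def handle_prediction(p: FailurePrediction, notify, inventory, scheduler):
    """Fan a prediction out into concrete business actions."""
    if p.probability < 0.8:
        return  # below the action threshold; keep monitoring
    notify.send(
        agent_for(p.machine_id),
        f"Machine {p.machine_id} likely to fail within {p.hours_to_failure:.0f} hours",
    )
    inventory.reserve_parts(p.machine_id)   # stage replacement parts
    scheduler.shift_load(p.machine_id)      # move work to a capable machine
```

Here notify, inventory and scheduler stand in for integrations with field-service, inventory-management and production-scheduling systems.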

From IoT insight to decisions

In the aforementioned scenario, predicting machine downtime and repairing the equipment is only a small step in a comprehensive process that impacts everyone involved, from field agents to end customers. Assuming businesses have the predictive capabilities handled, they need to tie this capability into other business processes as well. From an application development standpoint, the business would need to consider:

  1. Multi-channel user experiences: Depending on the location of the machine, service agents may need different tools to make repairs and adjustments. Everything from diagnostics utilities to repair manuals should be tailored to the device used to do the job, regardless of whether field agents are using a tablet, some proprietary mobile tool or even augmented reality.
  2. Complex business logic: The application needs to be able to receive the notification for the faulty equipment and then decide on the right course of action based on other conditions. Enterprises wouldn’t want to dedicate resources to repairing one machine when a more important one is about to fail as well. Everything from time of day to value of the machine needs to be considered.
  3. Integration with other relevant business applications: Making a prediction is only one step; the application needs to be tightly integrated into other relevant applications to ensure the prediction can be acted on promptly, without any other disruptions. From customer management systems to inventory management, integration must be seamless.
  4. Flexibility to support new application workloads: Business conditions are always changing, and the application needs to be adaptable to match that. In the scenario described above, the application pushed notifications forward to the user, who then needed to conduct the service. But perhaps over time, more steps can be handled by the application itself to reduce dependency on human operators. This only happens if the application has the flexibility to support new application workloads from the outset.
  5. Machine learning and predictive analytics: Once a prediction is made and a process is triggered, the application needs to continuously improve its models based on the various production scenarios encountered and the overall results. Changes in the machine environment, changes in operating characteristics and information about the accuracy of the model all need to be fed back into the models so they can be improved.

Using data from IoT and other sources to derive insights and predictions is a critical first step to improving the way your business operates. However, to truly make the most of those insights, businesses also need applications that can take predictions and turn them into implementable decisions. This requires comprehensive business applications with complex business logic, seamless integration, flexibility and a great user experience.

Building such a comprehensive business application is no easy task, but many enterprises across the globe are starting to take this approach — and they’re achieving some great results. As you look to the future and try to figure out how you can improve your analytics process and application development process, keep in mind this integrated approach. This will enable your business to take the next step in the process and go from simply making accurate predictions to actively implementing decisions.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


September 6, 2017  4:26 PM

Waste not, want not: Making supply chains simple with IoT

Neil Hamilton Profile: Neil Hamilton
Internet of Things, iot, IOT Network, Logistics, Supply chain, Supply Chain Management, tracking, ussd

Cisco and DHL recently reported that IoT will make supply chains vastly more efficient and transparent, leading to a $1.9 trillion impact on the supply chain and logistics sector. Nowhere will the benefits be greater than in the food industry, which loses about $160 billion a year in wasted food. The potential of connected technology to drive efficiencies by tracking the location of anything and monitoring its state provides significant value to both suppliers and receivers. Achieving that level of insight and access is a challenge, and it needs to be made simpler.

The difficulty for almost any industry tasked with improving supply chains through connected technology comes in two areas: cost and accuracy. Necessary investment tends to be high, because of both reliability and coverage issues. If a piece of technology goes wrong, or moves out of range, tracking will go down and be rendered useless. Installation is also expensive, as each monitoring device needs to be built, installed and connected to a network.

Particularly in food and fast-moving consumer goods, we've seen wide use of RFID and satellite-based tracking. Both of these are useful, but neither provides accurate, anywhere-in-the-world live monitoring. RFID relies on receivers at stocking and processing locations, and satellite tracking is expensive and can be unreliable. The alternative, tracking through mobile operator networks, makes more sense, but the challenge comes when the chosen network goes out of range or the supply chain stretches across multiple countries, at which point an even more expensive roaming SIM is required.

Technologies in this space are sold on the basis of being always-on and data-driven, but unless you're willing to pay expensive subscriptions to multiple network operators, this will never truly be the case. In some sectors, a lack of monitoring can be a serious issue. According to a study by the Food and Agriculture Organization of the United Nations, almost one-third of worldwide food production is lost — representing 1.3 billion tons, which would cover an area as big as Canada in farmed land. Imagine what could be achieved if all of this food could be relatively inexpensively monitored as it moves through global supply chains, alerting producers and receivers when food was close to spoiling.

By making IoT technology more affordable, mass adoption, and therefore significant benefits, could be achieved in supply chains across the globe. For larger enterprises, customer fulfillment rates are critical; technology could reduce risk and, consequently, cost. Ultimately, coverage and cost of implementation are going to be king for any technology to see wide-scale adoption. For IoT to be truly ubiquitous in the supply chain, we need to see technologies that are both accurate and simple to implement.

A lesser-known technology could provide a potential solution. USSD (Unstructured Supplementary Service Data) is a universal protocol available on 2G, 3G and 4G/LTE mobile networks, providing a globally ubiquitous tool for the movement of data. The technology is network agnostic, which also reduces cost by removing the need to negotiate contracts with multiple operators.
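For a sense of how lightweight the device side can be, here is a minimal sketch of pushing a tracking update over USSD from a cellular modem using the standard AT+CUSD command (the serial port, service code and payload format are illustrative assumptions, not a real service):

```python
import serial  # pyserial

def send_ussd_update(port: str, payload: str) -> str:
    """Send a short USSD message via a GSM/LTE modem and return its reply."""
    with serial.Serial(port, 115200, timeout=10) as modem:
        # 1 = present the USSD result; 15 = GSM default alphabet.
        command = f'AT+CUSD=1,"*199*{payload}#",15\r'
        modem.write(command.encode("ascii"))
        return modem.read(256).decode("ascii", errors="ignore")

# Example: report a container ID and a temperature reading.
# print(send_ussd_update("/dev/ttyUSB0", "CNT42*7"))
```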

To truly drive IoT adoption in the supply chain, we’re going to need an alternative approach to what we’ve seen in recent years. Costs need to be lowered and technology simplified. A company looking to develop an IoT service that involves communication with devices living on the edge of a network is simply faced with too many complex decisions at the moment, from how to connect through to the safety of that connection. With USSD, there is effectively no internet involved. Hacking is more difficult and costs can be saved in hardware with no need to install microprocessors, in turn reducing power demand in data transmission.

Better supply chain management is critical to future success of businesses across multiple vertical sectors. A constant connection should be the minimum requirement for any technology, not simply an ideal or aim. USSD presents a compelling technology for future implementations of IoT services in the space, enabling a simple-to-configure, constant and reliable data transmission at an affordable cost. In short, better IoT products in the supply chain will provide a strong return on investment and enable a better-connected future for the management of the movement of assets.

Ordering salmon that was caught that morning in Alaska for dinner at a San Francisco restaurant requires a lightning-fast and well-managed supply chain. By implementing simple, low-cost IoT services, all supply chains could be this efficient, helping realize the nearly $2 trillion impact IoT is expected to have on the sector.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


September 6, 2017  2:50 PM

IoT Cybersecurity Improvement Act: Kick starts privacy standards, but companies should do more

Rick Spickelmier Profile: Rick Spickelmier
cybersecurity, Internet of Things, iot, IoT devices, iot security, Passwords, standards

The internet of things is in need of standardized and better security measures, and the U.S. government recently took a step toward mandating this.

Dubbed the IoT Cybersecurity Improvement Act of 2017, the proposed bill would require that all devices purchased by the federal government meet certain minimum cybersecurity standards. The proposed security standards put forth in the bill are a good start, but they should be viewed as bare minimum for IoT vendors.

The proposed legislation comes in the wake of crippling cyberattacks over the past year that exploited everyday connected devices such as security cameras, digital video recorders and baby monitors. Notably, in May, a strain of ransomware called WannaCry spread around the world, walloping hundreds of thousands of targets, including public utilities, hospitals and large corporations. In August, Maersk announced that the effects of a cyberattack could wipe as much as $300 million off of its profits for the third quarter of this year.

To make improvements toward protecting against attacks like those, the IoT Cybersecurity Improvement Act establishes commonsense IoT security standards for government vendors, including:

  • Prohibiting hardcoded passwords in IoT devices. Eliminating fixed passwords requires vendors to provide a mechanism to change passwords and to ensure devices do not ship with common default passwords (a minimal sketch of this requirement follows the list below).
  • Requiring industry-standard encryption for communicating to and from IoT devices, storage of collected data in IoT devices and the servers that they connect to, and secure means of updating the IoT devices.
  • Certifying that there are no known vulnerabilities in the IoT devices. If vulnerabilities are discovered after purchase, then vendors must notify customers, patch the vulnerability and/or repair or replace the device.
  • Requiring government agencies that use IoT devices to maintain an accurate inventory of those devices.
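As a hedged illustration of the first requirement above (the file path and rejected defaults are assumptions; real firmware would use a salted key-derivation function rather than a bare hash), a device can simply refuse to start its services until the factory credential has been replaced:

```python
import hashlib
import os
import sys

# SHA-256 hashes of factory defaults the firmware must reject.
DEFAULT_PASSWORD_HASHES = {
    hashlib.sha256(b"admin").hexdigest(),
    hashlib.sha256(b"12345").hexdigest(),
}

def enforce_password_policy(hash_path: str = "/etc/device/passwd.hash") -> None:
    """Refuse to start until a unique, non-default credential is set."""
    if not os.path.exists(hash_path):
        sys.exit("No credential set: run first-boot provisioning before starting.")
    with open(hash_path) as f:
        current = f.read().strip()
    if current in DEFAULT_PASSWORD_HASHES:
        sys.exit("Factory default credential detected: set a unique password.")
```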

Additional benefits

Additionally, the IoT Cybersecurity Improvement Act incorporates changes to the Computer Fraud and Abuse Act and Digital Millennium Copyright Act that add a narrow exception for security research. This change notes that the criminal and civil penalties of these existing statutes will not apply to research on types of devices purchased by the government that is carried out in good faith and in a fashion that meets government-set standards. Historically, the government and manufacturers have been known to lean on these existing laws in order to muzzle security work that they find embarrassing.

While the security mandates put forth in this bill are practices any IoT device vendor should already be upholding, the bill will pave the way for higher security standards beyond IoT. As is the case with similar legislation, there will also be a ripple effect that extends beyond the government sector and reaches all industries and the consumer. Ultimately, these proposed changes are a step forward in laying the foundation for a more secure and connected world.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


September 6, 2017  11:48 AM

Breaking down walls: IoT in multi-location enterprises

Hugues Meyrath Profile: Hugues Meyrath
Customer service, Internet of Things, iot, IoT hardware, retail, Retail IT, User experience

We are at a critical point in the evolution of the internet of things. Earlier this year, Gartner forecast that 8.4 billion connected things would be in use worldwide in 2017, outnumbering the human population, and that the number would reach 20.4 billion by 2020. As if that's not impressive enough, the total IoT market will reach anywhere from $3.9 trillion to $11.1 trillion by 2025, according to McKinsey.

IoT is the driving force behind the digital transformation which, consequently, is propelling unprecedented levels of customer demand. In a society increasingly shifting towards automation, customer experience has become the ultimate differentiator. As the digital and physical worlds continue to merge together, it’s more important than ever that companies provide improved customer experience across their physical locations to remain competitive. IoT has the power to transform and transcend the customer experience across all industries, and organizations must apply a high level of strategy when implementing technology and understand exactly how to derive maximum benefit.

Multi-facility operations often think of their technology in three layers: front of the store, back of the store and headquarters. As a general rule of thumb, front of the store indicates anything that directly interacts with the user, and back of the store is associated with everything the user can’t see or the behind-the-scenes operations of a business. The third layer, headquarters, typically stores all data generated from individual locations in the cloud where it’s transformed from unstructured data into actionable insights. In this process, in-store data is combined with online data as well as third-party sources (social media, weather, trends and others relevant to the business). All layers play a role in contributing to customer experience, but the front and back of the store are where improvements are needed most in order to adapt to the experience economy.

IoT for the front of the store

Data is the new goldmine — if properly excavated and analyzed, you'll be rich with insights. In short, the critical component of IoT is the data it can generate, and more importantly what you can do with that data to improve efficiency and customer experience. Successful companies know that an informed, data-driven, differentiated experience trumps everything else. One example is retailer Restoration Hardware, which reinvented itself by pioneering the customer-centric revolution, transforming its stores into "galleries" with an emphasis on aesthetics rather than sales. The goal? To disrupt the furniture market by implementing a customer experience informed by data.

In order to create a quality customer experience, you have to understand your customer — and you can’t do that without data. “Smart” buildings provide infinite value, and using IoT tools, companies can better capture data and understand the metrics that are crucial to their business, like store traffic or even store temperature. For retailers, front-end IoT technology can easily monitor store traffic, customer demand and sentiment in real time — a game-changer when it comes to customizing the in-store shopping experience. IoT can also help retailers target connected consumers in-store, sending customized offers straight to their mobile devices — a much more targeted (and preferred) approach to just blasting customers with mass promotions.

IoT for the back of the store

The "back of the house" is equally important to the customer experience, even if it's invisible to the customer. Back-end IoT tools and technology also equip businesses with data-driven insights. Combined with proper analysis, reporting and action, this data can provide much greater visibility into equipment and the maintenance ecosystem, enabling businesses to make more informed decisions. It can even arm the largest retailers with information from headquarters, and give them the ability to transform stores nationwide overnight (or even in real time) to introduce new products, new displays and layouts to improve customer experience quickly. It's clear that IoT has the potential to increase efficiency, productivity and profits by collecting, processing, filtering, sorting and acting on large amounts of data. Of course, none of this would be possible without the sophisticated analytical tools that live in the back-end operations and are responsible for monitoring data and identifying key findings once data is aggregated from disparate locations.

As another example of back-end IoT at work, “smart” equipment can optimize maintenance, extend the life of the equipment and predict when things are going to fail. For example, in restaurants, IoT can be used to remotely monitor ovens and other cooking equipment so that cooking efficiency is optimized and consistent product quality is maintained. Additionally, IoT applications can remind operators when maintenance is required, let them know the correct procedures to take and even order replacement parts or supplies automatically. Better maintenance and increased efficiency ultimately impacts customers and improves their experience.

In the retail industry, IoT is often incorporated in back-end processes like inventory management (for example, RFID tracking chips). To meet today's on-demand expectations, retailers are turning to IoT in supply chain and logistics management. The process of moving large shipments of perishable, temperature-sensitive goods, such as food items or flowers, is complex. IoT presents an opportunity to make this process much more efficient by providing companies with reliable technology that enables them to measure and control the many variables, like tracking assets and preempting disastrous malfunctions — which is imperative to avoid failures in the supply chain.

We’re living in an age where customer loyalty is determined by the customer’s experience, making it more important than ever that companies across all industries incorporate IoT technologies into both their front-of-house and back-of-house strategy. In 2016, 82% of customers stopped doing business with a company after a bad experience — up from 76% in 2014, according to Mary Meeker’s “Internet Trends 2017 Report.” Likewise, as stated in Forrester’s 2017 predictions, “today’s customers reward or punish companies based on a single experience — a single moment in time,” and “one poor experience can trigger an immediate — and possibly prolonged — shift in spend to a competitor.” In addition, Forrester reported that interest in IoT “has hit fever pitch” and that “IoT holds the promise to enhance customer relationships and drive business growth.” Looking ahead, we can expect to see companies increasingly turning to innovative, connected technologies to unlock new revenue, reduce costs and enhance the customer experience.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


September 5, 2017  3:30 PM

AI-driven Robocars are here to burn some serious rubber

Scott Amyx Profile: Scott Amyx
ai, Artificial intelligence, car, Connected vehicles, iot, Vehicle

Few things can beat the adrenaline rush of being part of a Formula 1 race. The unbeatable concoction of speed, skill and technology makes F1 a spectacular sport. And, not surprisingly, AI is going to make things even more exciting on the racetrack by powering driverless cars.

With the future of passenger cars clearly belonging to driverless cars, it was just a matter of time before someone came up with the idea of driverless racing cars. Enter Roborace, which plans to be the first global championship for electric driverless racing cars. The racing championship will feature 20 cars, 10 teams of two cars each. At the wheel, however, won’t be the superstar F1 drivers, but highly competent AI software smartly controlling all aspects of the race.

What’s inside the Robocar

Based on teardrop aerodynamics and futuristic designs by Daniel Simon, whose credits include Hollywood films such as Oblivion and Tron: Legacy, the racing Robocar looks like something straight out of a science-fiction movie. Kinetik, the maker of Robocar, plans to launch four different prototypes. The car features a lightweight carbon fiber body, tires by Michelin and internal computing units by Nvidia.

The power of Robocar comes from a 540 kW battery that feeds four 300 kW motors, one at each wheel. The car is 4.8 meters long, two meters wide and weighs around 1,000 kg; Kinetik expects it to reach a top speed of 300 km/h (190 mph). The Robocar navigates with a sophisticated sensor suite that includes six AI cameras, GNSS positioning, radars, five lidars (light detection and ranging) and 18 ultrasonic sensors. Developed in a little under a year, the Robocar has an Nvidia Drive PX2 brain with an open architecture, capable of performing 24 trillion operations per second. This hardware takes advantage of deep learning for 360-degree situational awareness.

The Robocar's hardware is essentially standard across the field, so the dominant skill in the race is the programming behind the AI software running these cars.

The future

Robocars can help develop exceptionally safe and efficient technologies for the highways and places where vehicle speed matters. Automobile giants like Ford and Audi are investing large sums of money in driverless technologies. Denis Sverdlov, CEO of Roborace and Charge, wants to create a strong emotional bond between people and AI machines. He foresees that “an emotional connection to driverless cars can bring humans and robots closer together to define our future.”

The challenge

AI for Robocars, like many other platforms, is still in its developmental stages. DevBot, the Robocar's predecessor, is a development vehicle that can be driven by a human, run driverless or operate in both modes. DevBots were designed from the ground up for the AI development teams to gain firsthand experience in developing software for race vehicles. In a recent Formula E event, a DevBot running at high speed crashed after misjudging a corner. Such incidents clearly spell out that AI technology for high-speed vehicles still needs a lot of work. Specifically, the AI driving these vehicles must get far better at assessing the thousands of factors that come naturally to humans: racing conditions, surface type, how hard and how long to brake, when to take a calculated risk and so on. However, once the technology sufficiently demonstrates that driverless cars are much safer, the adoption of driverless cars will skyrocket.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


September 5, 2017  1:49 PM

The importance of flexibility when selecting IoT technology

Rob Martens Profile: Rob Martens
Consumer IoT, Enterprise IoT, Internet of Things, iot, IoT data, IoT sensors, Sensor, Sensors, Smart Building, smart home, User experience

In my work, I’m frequently surrounded by experts and people heavily immersed in the latest and greatest technology tools and sciences. That makes talking with people who are not as exposed to the internet of things a special opportunity. I always tell them: the fact is that the success or failure of IoT-oriented technologies is all about flexibility.

In my experience, these conversations and interactions quickly become a discussion about operating systems. When the term “operating system” is used, many of us think about our personal computing devices: phones, tablets, perhaps a laptop. But today’s operating systems aren’t necessarily tied to a single piece of hardware or a traditional “technology tool.” They can be ubiquitous, completely cloud-based and interact with devices based on circumstance.

That said, in the more traditional vein we have operating systems for our watches, cars and, I would certainly now argue, for our homes. All of these operating systems appear to be on very different paths to maturity, but in reality they are following a similar map, largely based on the core foundations of their construction.

To understand these paths, start with the basics. For IoT-oriented technologies, the three core contributors to growth/maturity are:

  • The cost and availability of sensors
  • The decreasing cost of data transport
  • The decreasing cost and increasing efficiency of data cleansing/optimization tools

Sensors are affordable and becoming more so daily. These sensors are also delivering more data than ever before. The cost of the pipes that transport this data continues to decrease, and access continues to improve. All of this unstructured or "dirty" data is now more easily cleansed, sorted and made available through optimization tools. These foundational trends feed the ability for IoT operating systems to exist, and they allow these maturing operating systems to form, evolve and expand at the rapid pace we are witnessing.

I work for a company that makes security and safety products focused on where people work, live and play. My job is challenging and immensely interesting to me, because we are in the process of bridging a divide. On one side are the typically hardened, tough metal objects meant to keep out the people you don’t want coming in, and on the other are newer digital components that are meant to protect, but can also enable a higher level of convenience, interoperability and function than ever before.

So let's focus on the evolving home or building operating system for a moment. Some call them smart homes or smart buildings. Here we are finding the fusion of energy management, security and, most recently, convenience. Historically, energy management and security are well understood. We manage the costs of lighting, HVAC and so on. We understand why it is important to have locks on our doors and to keep our buildings secure. The sea change of IoT in this operating system is now appearing. By installing connected, smart or sensor-driven devices, we are effectively bringing the element of convenience to more people. As we witness the fusion of these historically disparate elements of energy management, security and convenience, we can provide a more tailored and personalized experience to the user. In other words, we can provide flexibility.

Will my watch, car and home be able to support multiple operating systems? What if I want just one? Or will these operating systems themselves demonstrate the type of flexibility that consumers crave and demand as the technology matures? How will these operating systems evolve? If I choose one, am I stuck?

Great questions. Early on, operating systems have tended to be very proprietary. History shows us that less proprietary systems are typically faster to evolve and change, allowing them to adopt newer technology to positively impact function, availability and costs. Again, in a word, flexibility.

Alongside the influence of traditional market-driven design and crave-worthy marketing efforts will be the critical elements of security, privacy and the management of data. These elements are potential game changers if they are not suitably and continuously addressed. There is no flexibility here.

Operating systems will become broader and more transparent as IoT-oriented devices and tools enter our lives en masse. The growth curve is undeniable, and the opportunities are becoming much more obvious to even the least exposed observers. The measure of adoption and, perhaps most importantly, the satisfaction that they deliver will be largely dependent on how flexible they are for the user.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


September 1, 2017  11:54 AM

Computer vision holds the key to the smartest home, literally

Gideon Shmuel Profile: Gideon Shmuel
Consumer IoT, Internet of Things, iot, IoT analytics, IoT data, Machine learning, smart home

Computer vision-based user data will improve our lives at home. There are various layers within a house that serve a specific purpose, and when they are combined, they create what we call a “home.” For example:

  • Some are meant to keep us secure: the door we lock, the alarm system and security cameras in our home
  • Some are meant to keep us safe and provide peace of mind: smoke alarms and baby monitors
  • Some are meant to keep us comfortable: climate control (thermostats), window blinds and lighting
  • Some are meant to entertain us: TVs, media devices and sound systems
  • Some are meant to maintain us: food and appliances in our kitchen, the covers on our bed and roof above our head

To reach new levels of security, safety, comfort and entertainment we need to embrace the idea of computer vision in the home, rather than shying away from the idea of sharing personal data. Computer vision allows computers and systems to understand what they "see" via digital images or video. When systems can detect and recognize objects, they can deliver smart actions according to what they were programmed to do. One area that has effectively proven how computer vision can enhance our lives is the automotive space. Car systems that utilize computer vision to learn the driver behind the wheel and "see" the surrounding area can alert the driver when the car starts drifting out of its lane.
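For a flavor of the "detect, then act" loop, here is a minimal sketch using OpenCV's bundled Haar cascade for face detection (the action callback is a hypothetical placeholder; recognizing which face is present would require an additional recognition model):

```python
import cv2  # OpenCV

# Haar cascade face detector shipped with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def check_frame(frame, on_faces):
    """Detect faces in one camera frame and hand the count to a callback."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        on_faces(len(faces))  # e.g., trigger the lock, alarm or lighting logic

cap = cv2.VideoCapture(0)     # default camera
ok, frame = cap.read()
if ok:
    check_frame(frame, lambda n: print(f"{n} face(s) in view"))
cap.release()
```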

Many consumers are already using computer vision on their smartphones and don’t even know it. Snapchat and Instagram are both using computer vision-tracking to detect facial features and place overlays (filters) in the right places.

Security, safety and comfort

Welcoming computer vision into your home and adding it to your connected devices will enable a new level of convenience in your everyday routine. Your front door will be able to "see" when you arrive and unlock for you, or remain locked when an unfamiliar person (face) approaches. Alarm systems will be smarter, able to recognize specific members of the family (including their age and gender) and to tell who is not a family member. Taking it a step further, indoor security cameras will send an alert to your smartphone if an elderly family member or guest falls, or if a toddler is climbing up the stairs, onto the countertop or anywhere else that places the child in danger. Nest, Logitech and other smart home manufacturers have already started to offer these smart security features to consumers as a subscription-based premium service, or have embedded them in their newest products.

When your home can see and has the capability to learn, over time it will be able to adjust lighting and temperature to the number of people in any given room to ensure a comfortable environment and potentially save you money on your electric bill. Your fridge will be able to “see” when you are low on milk and eggs and send you an alert to make sure those items are added to your grocery list.

Entertainment

Imagine a cable box that greets you by name and turns on a tailored interface with the content you like. Imagine not having to watch ads that are not relevant to you in any way, and a TV that is smart enough to block inappropriate content and channels when it detects kids in the room. Imagine a system that notices your significant other is sitting next to you on the couch and suggests content that will be enjoyable to both of you. These are examples of how your home and media react and adjust to you. You become a true master of your home, and providing data gives you access to new capabilities and information previously unavailable to you.

Privacy

Embedded computer vision is the answer to the ongoing data privacy debate, but consumers must first understand that there are two categories of computer vision systems in the home:

  1. Activities and services that aim to tailor the home to you (for example, adjusting the lights and temperature, suggesting content on TV to the people in the living room, etc.)
  2. Activities and services that you may want to monitor in addition to security system alerts (for example, sensing who is at the front door to unlock it or sound an alarm, and monitoring your children or activities in the home).

For the first category, there is no need for the computer vision systems to save, show or share any video or visual content. All that is needed for the systems to work is real-time reporting to the relevant systems, and the output should simply be the number of people in the room, their ages and genders. To customize media content, an output stating which relevant family members are in front of the media device is necessary, but no visuals are needed. This can and should be embedded, which means no videos or images are sent to the cloud, the camera shouldn't save any imagery locally and the camera itself shouldn't be connected to the internet — all computer vision calculations should be done in real time on the device itself. If the above rules are met, the computer vision technology is embedded on the device and your concerns about privacy should be resolved, because the hackability of the system is extremely low.
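A small sketch of what such a metadata-only output might look like (the field names and topic are illustrative assumptions; the point is that only derived attributes, never pixels, leave the vision pipeline):

```python
import json
import time
from dataclasses import dataclass, asdict, field

@dataclass
class RoomPresenceEvent:
    """Only derived metadata leaves the vision pipeline; no imagery does."""
    room: str
    people: int
    estimated_ages: list = field(default_factory=list)
    genders: list = field(default_factory=list)
    timestamp: float = 0.0

def publish_local(event: RoomPresenceEvent, bus_publish):
    # bus_publish stands in for the home controller's local message bus;
    # raw frames are never serialized, stored or transmitted.
    bus_publish("home/living_room/presence", json.dumps(asdict(event)))

publish_local(
    RoomPresenceEvent("living_room", 2, [34, 7], ["F", "M"], time.time()),
    lambda topic, payload: print(topic, payload),
)
```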

Regarding the second category of computer vision systems, you have a need and desire to monitor and see what is happening in your house, so the same privacy protection measures used with home security cameras should be implemented here as well. Once again, the deeper understanding and seeing capabilities of these systems do not require processing in the cloud, internet connectivity of the sensor or saving of images. It is you who wants to view it, so the risks of these enhanced capabilities will be no different from the ones faced today in this sector.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


August 31, 2017  3:52 PM

Smart home biometric data: How domotics enable IoT data applications

Raoul Wijgergangs Profile: Raoul Wijgergangs
Biometrics, Data Analytics, Internet of Things, iot, IoT analytics, IoT data, Smart Building, smart home, Wearables

When they came to market, fitness trackers and other wearables were the darlings of the IoT future, but they have since lost much of their sheen.

While the wearables market is growing, according to IDC, even market leader Fitbit faces declining shipments, and there is little improvement in the abandonment rate — many users stop wearing their device after six months of use. The ability to wear a device is what sparked interest in the category, but it has also become a barrier to IoT growth and adoption.

While wearables need to be worn to be useful, another technology can provide comparable activity data without suffering from the high abandonment rates. Interaction with devices in the home can be connected to healthcare and other business and government verticals. This collected data can be analyzed by healthcare professionals, family members, insurers, neighborhoods, cities and states for societal and homeowner benefit.

The human body has four to six clinically accepted vital signs which indicate overall condition, four of which are quantified regularly by today’s wearables: blood pressure, pulse, respiratory rate and temperature. The fifth and sixth vital signs aren’t as frequently used because they are subjective or discipline-dependent, such as pain or glucose level for diabetics. These metrics and others about bodily functions, such as sleep and menstruation, may one day be clinically considered among the others, but have limited application outside of healthcare, offering little insight for population-sized, big data analytics for IoT.

By contrast, the average smart home has many vital signs which, when quantified, produce remarkably rich data depictions — not only of human activities, but also habits, preferences, risks and changes to living patterns. This domotic data — the biometrics of the home — can be gleaned from smart home devices such as environmental sensors, locks and lights, whose metrics apply to more than just healthcare. Verticals such as security, energy, insurance, senior living, consumer products and public safety can collect smart home data, or are beginning to explore its opportunities.

There are two primary obstacles to domotic data use in IoT analytics. For one, the global smart home market is still in its growth period, with an estimated 21.8 million U.S. households owning a smart device of some sort. As homes take on more devices, domotic data will become a de facto metric in assessing market-relevant activities and behaviors.

The second obstacle is a limited understanding of how to use domotic data and how it can translate into usable, scalable IoT data. The uses and benefits of this analysis are being developed by third parties, such as businesses and government agencies, rather than by consumers themselves.

The next step for deeper IoT domotic data adoption would be a standardized lexicon, categorized into intelligent inferences that scale across time and population sizes and can be customized for specific verticals. The smart home industry is still at the early stage of development, where the medium is the message. The door status reported by a smart lock has typically been understood as the data itself, but to be useful to IoT, analysis needs to move beyond isolated events and device status. It needs to begin to interpret how events over the course of time become behavioral information, and then how that information can be used for beneficial purposes.

Consider the ordinary smart lock, and the related devices that comprise a home security system — the most common smart home scenario. The device status of an open or closed door or window can signal departures and arrivals in the home, which is useful only to the homeowner. This simple, static information is currently valuable to residents only for home monitoring and safety; but when analyzed on a larger scale, this data can have a much broader impact.

When examining these same conditions through industry-specific lenses, the domotic data can be used by home insurers. For example, if a family has digital codes as electronic keys for a smart lock, the assignment of additional new keys can indicate when a relative visits or a new caregiver gets access to the house. Other conditions, like habitual visitors, and other renting scenarios, such as Airbnb, can affect home security and its insurance equation.

Additional inferences can be made using the same domotic data to identify a morale hazard — a habitual inattention that incurs risk, indicated by a garage door that’s often left open. Homeowners who frequently forget to lock doors or shut off the backyard water valve during the winter fit this profile, which naturally impacts an insurance premium.
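As a hedged sketch of how that inference might be drawn from raw domotic events (the event format and the rule are illustrative assumptions, not an insurer's actual model), counting nights a garage door stayed open past midnight is enough to flag the pattern:

```python
from datetime import datetime

# Hypothetical event log: (door_id, state, ISO timestamp) from smart sensors.
EVENTS = [
    ("garage", "open",  "2017-08-01T22:10:00"),
    ("garage", "close", "2017-08-02T07:45:00"),   # left open overnight
    ("garage", "open",  "2017-08-02T18:00:00"),
    ("garage", "close", "2017-08-02T18:20:00"),
]

def nights_left_open(events, door_id="garage"):
    """Count calendar days on which the door stayed open past midnight."""
    open_since = None
    nights = set()
    for door, state, ts in sorted(events, key=lambda e: e[2]):
        if door != door_id:
            continue
        t = datetime.fromisoformat(ts)
        if state == "open":
            open_since = t
        elif state == "close" and open_since is not None:
            if t.date() > open_since.date():      # closed on a later day
                nights.add(open_since.date())
            open_since = None
    return len(nights)

print(nights_left_open(EVENTS))  # a high count suggests a morale-hazard profile
```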

If the same data is scaled to a local region to recognize a pattern, inferences can be made about neighborhood safety — for example, areas where leaving windows and garage doors open isn't seen as risky behavior. This finding could affect insurance premiums and local policing; and at the state level, above the municipal one, this data could help allocate public safety budgets more effectively. In this way, simple status notifications seen at scale have the potential to impact larger populations.

Other domotic data could yield similar external insights and commensurate responses for local and regional energy usage patterns, water consumption, residential safety risks, structural vulnerabilities in streets and buildings, senior activity and services, and more. Since its market entry more than a decade ago, Z-Wave technology has been dedicated to making these possibilities mainstream, through an interoperable, brand-agnostic approach, which creates the world’s largest ecosystem of smart home devices, functions and data generation. By providing a platform for consumers to build up their smart home with devices of their choosing, the interoperability can push smart home adoption and bring the market closer to these IoT data applications.

As the smart home market matures and both residential and commercial dwellings are outfitted with sensors, cameras and other monitoring and automation technologies, the biometrics of the home can start to make an impact.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

