IoT Agenda


January 12, 2017  5:00 PM

IoT PCB considerations for startups

Zulki Khan
Hardware, Internet of Things, iot, IoT hardware, PCB, Printed circuit boards

Since IoT devices are so new, you would think that getting an IoT printed circuit board (PCB) project off the ground starts by reinventing the wheel and going through a lot of technical hassle. That is definitely not true.

But that doesn’t mean IoT startups have a clear path to stardom. They face a number of design and manufacturing considerations unique to these small products, and those considerations must be taken into account for a new IoT product to succeed.

On the plus side, it’s important for IoT startups to know that the basic foundation for a successful new product does exist. Experience and know-how in the design, fabrication and assembly of these advanced products are available. Prudent IoT entrepreneurs and innovators should heed the advice that experienced electronics manufacturing services (EMS) providers have to offer; these companies and their engineering staffs have already performed this work with pioneering IoT companies in Silicon Valley during the early stages of this emerging industry.

The PCB of an IoT device is a different beast from the traditional board, which is substantially larger and flat. IoT devices, on the other hand, consist mostly of rigid-flex or flex circuit assemblies, which come with their own sets of design layout, fabrication and assembly considerations and nuances.

Layout

A foremost consideration is to seek out designers who have done a lot of rigid-flex PCB designs. PCB space in an IoT device is at a premium, so you want a designer with firsthand layout experience to place key components effectively in that small space.

Also, most IoT devices aren’t stationary; they incur considerable movement and twisting. Here, the experienced designer plays a major role in calculating bend ratios and lifecycle iterations as a significant part of the design. Other key layout considerations include signal trace thickness, the number of rigid and flex circuit layers, copper weight and stiffener placement. Stiffeners are used on flex circuits to ensure that components mounted on them remain firmly in place.
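To make the bend ratio point concrete, here is a minimal sketch (not from the article) of the rule-of-thumb calculation a designer might start from: the minimum bend radius is typically expressed as a multiple of the total flex stack-up thickness, with the multiplier depending on layer count and on whether the flex is bent once at installation or flexed repeatedly in service. The multipliers below are generic placeholder values; always confirm the exact figures with your fabricator and the applicable IPC-2223 guidance for your materials.

    # Illustrative rule-of-thumb minimum bend radius for a flex circuit.
    # The multipliers are placeholder values, not figures from this article;
    # confirm with your fabricator and material datasheets.
    BEND_RADIUS_MULTIPLIER = {
        "single_layer_static": 6,    # bent once, during installation
        "double_layer_static": 12,
        "multilayer_static": 24,
        "dynamic": 100,              # flexed repeatedly in service
    }

    def min_bend_radius_mm(flex_thickness_mm, flex_type):
        """Return the minimum bend radius in mm for a given flex stack-up."""
        return flex_thickness_mm * BEND_RADIUS_MULTIPLIER[flex_type]

    # Example: a 0.2 mm thick, two-layer flex section bent once at assembly
    print(min_bend_radius_mm(0.2, "double_layer_static"))   # 2.4 mm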

Another consideration is through-hole component placement in rigid-flex circuits. Why is that important? A majority of IoT devices are based on surface mount device placement, but there can be through-hole components, which are normally placed on either the rigid portion or the flex portion of the board. Through-hole components are normally used to carry input/output (I/O) signals to the outside world so those signals can be displayed on an LCD or LED monitor. Through-hole placement is an important consideration in an IoT device because, when such components are used on the flex section of the board, the right stiffeners need to be designed in for proper assembly.

Lastly in the layout category, the heat that components generate must be considered. IoT devices are becoming more complex, with rigid-flex and flex circuits featuring upwards of 12 to 14 layers. Some devices are digital, but increasingly analog circuitry is being used in IoT devices, and analog circuitry generates considerably more heat than digital circuitry. This means the rate of thermal expansion and contraction must be considered and managed; in tech lingo, this is the coefficient of thermal expansion, or CTE.
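As a rough illustration of why CTE matters (a sketch with assumed numbers, not figures from the article), the basic relationship is simple: the change in length equals the CTE times the original length times the temperature change. Typical FR-4 laminates expand in-plane on the order of 14 to 17 ppm per degree Celsius; flex materials behave differently, which is part of what makes rigid-flex thermal management tricky.

    # Sketch of the thermal expansion relationship: delta_L = alpha * L * delta_T.
    # The CTE value below is a typical published figure for FR-4 in-plane
    # expansion, not data from this article; use your laminate's datasheet.
    def expansion_um(length_mm, cte_ppm_per_c, delta_t_c):
        """Return expansion in micrometers for a feature of the given length."""
        return length_mm * 1000 * cte_ppm_per_c * 1e-6 * delta_t_c

    # A 50 mm rigid section (~16 ppm/C) heated 40 C by nearby analog circuitry
    print(expansion_um(50, 16, 40))   # ~32 um of in-plane growth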

Fabrication

Choosing the right fabricator is critical and is linked to the EMS company you’ve selected. The fabricator must have IoT PCB fabrication experience. Key considerations here include ensuring strong adhesion between layers on both the rigid and flex circuit sides, knowing all the critical calculations and having a solid understanding of where current transfers from the rigid side to the flex side.

These fabricators must also possess an in-depth understanding of remarkably small components such as 0201 and 01005 device packages, package-on-package, and fine-pitch ball grid array (BGA) packaged devices.

They should also have experience fabricating boards with very tight footprint tolerances for those types of BGA devices, along with up-to-date capabilities such as laser direct imaging for applying the solder mask. They should have laser drills for vias of 5 mils or smaller, because IoT devices can be so small that a regular mechanical drill size of 5 to 8 mils might not suffice; they may need to go down to 3 mils, which requires advanced laser drilling capability in house.

Via-in-pad is a good way to use the small amount of real estate available on a rigid-flex board, yet it poses problems for assembly. If vias are not completely planar, or flat, they become a challenge during assembly of those tiny BGA-packaged devices, because non-planar surfaces can jeopardize the integrity of solder joints.

Sometimes vias in pads leave bumps if they are not scrubbed properly after the vias are placed and the gold finish is applied. If there are bumps, the solder joints for those tiny BGA balls will not be perfect, which can create intermittent connections, a much bigger issue to address and fix. It all boils down to which EMS company you are using, because they are the ones who will select the fabrication house that makes your IoT product a success.

Assembly

It’s important to go to experienced EMS companies that have successfully assembled IoT and wearable PCBs, as they already have the specialized tooling and fixtures needed to ensure components are placed properly and accurately and that printing is performed correctly.

Printing can be a challenge for IoT devices. On a rigid-flex board, the rigid and flex circuit portions differ in thickness, so a special fixture is required to keep the complete board planar, or completely flat, for effective printing.

Startups should be well prepared to select the right manufacturing partners and EMS companies, making sure ahead of time that those partners have enough experience to handle the multitude of design, fabrication and assembly details that are key to a successful and timely IoT product launch.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

January 12, 2017  10:56 AM

Industrial IoT needs a killer app

Dave McCarthy
Enterprise IoT, IIoT, Industrial IoT, Internet of Things, iot, IoT applications, Predictive Analytics, Predictive maintenance

The hottest tech trend at the moment is IoT. Both media and analysts have fueled the hype that anything and everything will be connected and able to participate in an ecosystem that enables companies to become more efficient in their operations and unlock new business opportunities. However, looking back at the initial predictions for IoT adoption rates made a few years ago, the industry has fallen short. Why is that?

Digging a little deeper, most of the focus on IoT, especially for enterprises and industrial companies, has been in the form of platforms. And, there is nothing more vague than the definition of an IoT platform. There are platforms that focus on hardware and connectivity, which represent the most basic aspects of IoT. Others are positioned as application enablement platforms, effectively a set of APIs and widgets. Then, there are data-centric platforms, including analytics and logic.

In all of these examples, vendors are providing basic tools and building blocks. And that is the problem. None of these platforms can improve business outcomes on their own. Instead, they require developers to stitch together the various components into … guess what? Applications.

This is not an unfamiliar story; every new technology starts this way. The killer app drives adoption and in many cases obfuscates the details of the underlying components. The majority of users are looking for the benefits they can receive from technology rather than immersing themselves in the mechanics.

Here are a few contenders for industrial IoT’s killer app:

  1. Predictive failure — For operations teams, unplanned downtime is the enemy. Yet that’s the situation they most often deal with when a critical asset unexpectedly fails. By identifying the leading indicators of a failure and then applying that knowledge to the real-time data stream, it is possible to be proactive (see the sketch after this list). This puts the company in a position to schedule downtime when it is less impactful to the business.
  2. Adaptive diagnostics — Too often, the repair process starts once a technician arrives at the equipment. This typically involves understanding what error codes are currently active and then manually determining the possible root causes using static repair steps. Instead, IoT can automate this process by factoring in sensor data and historical repair information. Not only does this reduce diagnostic time and mean time to repair, but the system can adapt and learn over time.
  3. Condition-based maintenance — Aside from break/fix scenarios, industrial companies have the opportunity to move beyond interval-based maintenance schedules to ones that are tailored based on actual equipment needs. Fixed intervals lead to over-servicing, which is wasteful of parts and labor — not to mention unnecessarily taking the asset offline. Conversely, under-servicing can reduce the overall lifespan of the asset.
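
To make the predictive failure idea concrete, here is a minimal sketch of the pattern described in item 1: watch a real-time sensor stream for a leading indicator (a sustained bearing-temperature rise in this toy example) and raise a flag early enough to schedule downtime. The thresholds and values are invented for illustration; a production system would learn its indicators from historical failure data rather than hard-code them.

    # Toy sketch of predictive failure: flag an asset for proactive maintenance
    # when a leading indicator (sustained high temperature) appears in the
    # real-time stream. Thresholds and values are invented for illustration.
    from collections import deque

    WINDOW = 60          # recent readings to consider
    THRESHOLD_C = 85.0   # leading-indicator temperature
    SUSTAINED = 45       # hot readings needed to raise a flag

    recent = deque(maxlen=WINDOW)

    def ingest(reading_c):
        """Return True when a proactive maintenance window should be scheduled."""
        recent.append(reading_c)
        hot = sum(1 for r in recent if r > THRESHOLD_C)
        return hot >= SUSTAINED

    # Simulated stream: bearing temperature creeping upward over time
    for t, temp in enumerate(78 + 0.2 * i for i in range(120)):
        if ingest(temp):
            print(f"Reading {t}: schedule downtime before the asset fails")
            break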

From the perspective of an operations manager, the notion of an app that can deliver on a needed business concern is immensely more valuable than a platform that has the potential to do something useful given enough time and effort. The killer app is what will take IoT from being a proof-of-concept project in a lab to a strategic part of the business. After all, industrial companies are not in the market for IoT; they are looking for improved business outcomes.



January 11, 2017  10:32 AM

The next Industrial Revolution: How IoT will change our world

Andrew Morawski
Agriculture, Connected car, Healthcare, IIoT, Industrial IoT, Internet of Things, iot, IoT applications

The Industrial Revolution marked a major turning point in history as the world transitioned from manual labor to machine production. Virtually every aspect of daily life was influenced in some way through the introduction of machines and the rise of the factory system. Today we are on the brink of the next Industrial Revolution with the integration of the internet of things.

According to a report from BI Intelligence, there will be 34 billion devices connected to the internet by 2020, and nearly $6 trillion will be spent on IoT solutions in the next five years. It’s this incredible growth that gives IoT the potential to revolutionize everyday life similar to the Industrial Revolution of the early 1800s. While IoT has the potential to impact a number of industries, there are a few notable places where we will see the strongest impact as adoption continues to grow: healthcare, transportation and agriculture.

IoT in healthcare

Today we’re seeing IoT integrated into the healthcare industry through the adoption of connected devices. At Vodafone, we’re working with ASD Healthcare, the largest distributor of oncology and supportive care drugs to health systems and specialty pharmacies. Through this partnership, we’re witnessing firsthand the ability to move patient care and monitoring into the home with ASD Healthcare’s Cubixx line of smart medical devices and we’ve seen the extent to which IoT can impact the patient care system. With access to instant data, physicians can reliably monitor patients in real time, ensuring proper treatment. This ultimately leads to better accountability among patients, and could reduce hospitalizations in some cases.

IoT in transportation

The connected car has been an IoT buzzword recently, as cutting edge connected technology is increasingly integrated into the auto industry. In a recent whitepaper on IoT and transportation, Vodafone stated that 82% of automotive leaders are confident most cars will be connected by 2020.

However, the integration of IoT throughout the transportation industry will have impacts far beyond the physical alteration of cars as they become connected. What will ultimately transform the auto industry is automakers’ ability to become an asset as many industries begin to seek insights from the instant data generated by vehicles. The biggest impact will come from collaborations with retailers, insurers and smart city solution providers. We’re going to see the emergence of user-centric services that will transform data into massive business value.

IoT in agriculture

The aforementioned BI Intelligence report states that with a growing global population, the world will need to produce 70% more food in 2050 than it did in 2006 in order to feed everyone. IoT is currently being leveraged for analytics and greater production capability to meet the needs of our growing world, and in doing so is completely revolutionizing the agriculture industry. Right now, we’re adopting practices that integrate IoT technology, such as smart agriculture and precision farming, to improve production output, minimize cost and preserve resources. The adoption of these technologies has been so impactful that the same report projects IoT device installations in agriculture will increase from 30 million in 2015 to 75 million in 2020. IoT has become a massive disruptor in one of the oldest industries, transforming the way we grow food.

By looking at how quickly IoT adoption has risen across all industries, it’s clear that this revolutionary technology will change the world we live in.



January 10, 2017  10:56 AM

Four key characteristics of meetings

Daniel Jackson
Internet of Things, iot, Meeting, meetings, productivity

In my last article, I touched on how the internet of things can power better meetings. In order to understand how IoT can help, it’s important to understand what makes a meeting bad or unproductive in the first place. Four characteristics are thought to determine a meeting’s success (or failure). In no particular order, they are: physical, procedural, temporal and attendees.

Physical

Physical characteristics relate to aspects of the meeting setting and environment, such as seating arrangement, provision of refreshments and the appropriateness of the space, temperature and lighting.

For example, if the room is designed for six people but eight people attend the meeting, lack of seating or overcrowded work space creates a stressful environment and affects the quality of collaboration, information retention and attentiveness. Similarly, meeting participants’ concentration and focus are inhibited if the room is too cold or too hot.

Poor lighting also impedes productivity. For instance, glare from sunlight on the room display can make it difficult to see content or remote participants. Dim lighting can cause people to become drowsy and disengage from the conversation.

Often the technology in the room is difficult to use and therefore may go unused, making it difficult — if not impossible — to share content or communicate fully. Another possibility is that the technology in the room doesn’t support or facilitate the type of meeting scheduled or the content presented. For example, if someone has a presentation on a tablet but the room doesn’t support wireless presentation, or if the laptop has an HDMI or DisplayPort connector but the room only has a VGA cable, the meeting is doomed to fail before it even begins.

It’s not only about people — it’s about how technology is able to replicate the experience for those not in the room. It’s important that these folks aren’t left out.

Procedural

Procedural characteristics include the use of agendas and ground rules. Studies have shown that most of the problems in meetings occur in the pre-meeting phase. Thus, preparation plays an important role in conducting meetings effectively. Participants perceive a higher meeting quality when they have prior access to a formal agenda, and practitioners suggest including the presenter’s name, the expected action and time estimate for each agenda item. Moreover, the identification of the goals behind each agenda item provides both the meeting leader and the attendees with an orientation during the meeting. Rogelberg, Shanock and Scott (2012) found that attendees enjoy meetings more when they have clear goals and relevant information is shared.

Temporal

Temporal characteristics describe how the meeting time is used, the meeting length and whether the meeting started and ended promptly. Unfortunately, 37% of all meetings start late and 68% of workers find late meetings unacceptable. Beyond the frustration people feel, which inhibits productivity, late start times alone cost organizations about $5.3 billion every year.

To relate this extraordinary number to individual businesses, 15% or more of labor costs are associated with wasted time finding a place to meet, locating the meeting room or getting the meeting started. For example, once everyone is gathered and ready to meet, it takes an average of 10 minutes to connect personal devices for a presentation or to start a video conference. Those minutes, propagated across every meeting throughout an organization every day, cost millions annually, not to mention the anxiety and frustration employees experience, which leads to another 20 or 30 minutes of lost focus (research indicates that it takes that long for the brain to recover from stress and regain full intellectual capacity).
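
As a back-of-the-envelope illustration of how those lost minutes add up (every number below is a hypothetical assumption, not a figure from the research cited above):

    # Hypothetical back-of-the-envelope cost of slow meeting starts.
    # Every input below is an assumption for illustration only.
    meetings_per_day = 200          # across the whole organization
    attendees_per_meeting = 6
    minutes_lost_per_meeting = 10   # connecting devices, finding the room
    loaded_cost_per_hour = 75.0     # fully loaded labor cost per attendee
    workdays_per_year = 250

    annual_cost = (meetings_per_day * attendees_per_meeting
                   * (minutes_lost_per_meeting / 60) * loaded_cost_per_hour
                   * workdays_per_year)
    print(f"${annual_cost:,.0f} per year")   # $3,750,000 with these assumptions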

Attendees

Attendee characteristics include the presence of a facilitator or a leader and the number of attendees. Niederman and Volkema (1999) emphasized the importance of the leader in controlling the flow of information, assisting in the decision-making process and helping attendees reach the meeting goal. As the number of attendees increases, the participation per attendee decreases and it becomes even more important that somebody is performing the role of a meeting facilitator.

Meetings are integral to workflow in businesses and educational institutions. They cannot, and probably should not, be eliminated or avoided. In fact, 92% of workers value meetings as providing an opportunity to contribute to organizational success. Therefore, meetings must be made more efficient and productive.



January 9, 2017  1:37 PM

5G security: A proportionate choice

Herve Pierre
5G, 5G technology, Authentication, Communications security, Internet of Things, iot, iot security, Network security

By 2021, 5G could deliver a spectacular range of potential benefits and use cases to network operators and service providers. These use cases will create functional and technical needs for higher speeds, lower latency and greater power efficiency, and will herald more actors, more device types and greater use of the cloud and virtualization. However, these opportunities also bring more security threats and a greatly increased attack surface.

During 2016, SIMalliance examined the five major market segments for 5G — network operations, massive IoT, critical communications, enhanced mobile broadband and vehicle to X — to identify a wide range of threats, which we used to draw up security requirements and potential mitigations. The result is incisive recommendations that will help 5G meet its potential, securely.

Slices and virtualization — a network of networks

In a 5G future, mobile operators will need to cut costs because of increased data volume levels combined with a decreased average revenue per connection. To achieve this, they will turn to network function virtualization and network slicing, both for cost and technical reasons.

Hence, 5G technology could be built around a “network of networks” involving network slicing and mobile edge computing, with mission-critical elements that must not be shared between network slices to avoid a compromise on one slice affecting others.

This means 5G brings security requirements that greatly add to those of earlier generations, as well as new threats and a greatly expanded attack surface.

Protecting the network

5G network subscriptions will be protected by a network authentication application (NAA) within the device for network identification, authentication and encryption. The device identity and the identity stored in the NAA should be separate and independent from each other, as in earlier generations where the IMEI and IMSI/keys were stored in separate logical entities.

Any secure tamper-resistant entity storing the NAAs must be capable of being (and should be) audited and certified by a third party and functionally tested against a suitable industry-agreed functional compliance suite.

Massive IoT and critical communications in particular pose specific functional requirements that affect security.

Low power, long life and remote provisioning

Some areas of massive IoT may require very low power consumption. New, efficient algorithms, authentication policies and protocols that account for lower power consumption should be evaluated, and a hibernate state may need to be supported.

Secure access to remote provisioning should always be available and must meet requirements for secure, out-of-the-box connectivity with zero configuration.

In IoT, devices may remain in use for up to 15 years with only periodic connection to the network, oversight and upgrade. Their security must be built to last. Equally, many devices will be simple and low cost, but security must match the value of the data rather than the initial bill of materials cost.

Securing critical communications

In critical communications, solutions must meet requirements for ultra-low latency, high throughput and high reliability.

With standardization at an early stage, robust and, crucially, proportionate security must be built in from the outset and must protect subscribers, devices and their communications as well as the integrity of the network itself, whatever the use case.

Investing in security now is an insurance policy for the future of 5G to avoid hidden costs arising later from countering attacks on insufficiently protected high-value data. It is clear that the wrong decision about security today will prove a false economy in the future.

Learn more in our technical paper “5G Security — Making the Right Choice to Meet your Needs,” downloadable from the SIMalliance website.



January 9, 2017  11:15 AM

Best job for analytics professionals: Data scientist still topping lists

Don DeLoach
Data Management, Data Science, Data scientist, Internet of Things, iot

Data scientist has been a trending title for analytics professionals the last couple of years — and it’s no surprise since a recent Glassdoor report found that the data science field offers not only a good work-life balance, but also hefty pay and lots of opportunities. This has not gone unnoticed within the industry. The annual Open Data Science Conference took place recently and featured a number of engaging panel discussions on where the data science industry is heading.

Data science is all the hype

The distinction between data science, artificial intelligence (AI), machine learning and deep learning has become blurred. One panel recognized that AI has quickly become today’s favored buzzword, meaning organizations are restructuring their infrastructures to account for the increase in data volume AI devices and machines will generate.

Yet all the hype sometimes gives way to interesting changes in the market. For instance, companies like Uptake in Chicago tout their ability to provide “disruptive transformation” to help more established businesses better deal with massive amounts of IoT data. This coupled with the help of high-end data science capabilities can yield insights that make a very real difference for those companies.

But companies still need to consider whether they should outsource the ability to generate such insights or if they should instead develop these capabilities on their own since no one knows their business and its needs better than themselves.

Coping with change

The technology market’s history is filled with examples of how markets cope with change brought on by innovation. As relational databases became the norm and the need for greater speed and flexibility placed higher demands on programmers, the advent of PowerBuilder and other “fourth-generation languages” in effect made more people “programmers” by abstracting away some of the difficulty.

This is the progression we are seeing with data scientists, but it in no way lessens the importance of the data scientist. It does, however, point to a number of technology innovations that both lighten the load and lower the barrier to entry for people to engage in this work.

Tools like Logi Analytics’ “data scientist in a box” are aimed at making the process easier. Moreover, certain accommodations with profound implications, such as using statistical models to generate high-value approximate answers, can help quickly gain insight into increasingly large and complex data. The time and resource requirements of data at scale make exact answers more and more difficult to obtain, and the key insight is that many questions do not require an exact answer.

For this reason, innovative companies are looking to leverage statistical models over data to augment data lakes and provide for high value approximation — an approach that achieves fast, high-value answers resulting in actionable insight from data while relieving demands on network capacity.

Balancing volume is key

We have come to think that more data is better, and in some cases that makes sense. But, for example, simple edge processing that filters out the inconsequential messages from the consequential ones would dramatically reduce the daily data volume in a 1,000-room building with eight sensors per room, each taking a 24 KB reading once a second: from roughly 16.6 TB per day down to 100 GB (less than 1%).
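
The arithmetic behind that example is worth spelling out (a quick sketch; the assumption that only about 100 GB of the readings are consequential comes from the example above):

    # Back-of-the-envelope check on the building example (decimal units).
    rooms = 1000
    sensors_per_room = 8
    reading_kb = 24                    # kilobytes per reading
    readings_per_day = 24 * 60 * 60    # one reading per second

    raw_kb = rooms * sensors_per_room * reading_kb * readings_per_day
    print(raw_kb / 1e9)          # ~16.59 TB of raw readings per day
    print(100e6 / raw_kb * 100)  # keeping 100 GB is ~0.6% of the raw volume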

But not all data can be collapsed so easily. There will be instances where there are mountains of data from a variety of sources where insight can be gained though proper levels of exploration and correlation.

For instance, a fast-food restaurant owner might want to combine data coming from all different sources — e.g., the IoT-enabled fryer, cooler, lighting system, HVAC system and inventory, as well as city-supplied vehicle traffic data, etc. — to gain insights as to how to optimize his operations based on certain conditions.

Utilizing statistical models for high-value approximation can turn these multi-hour queries running on 100 or more nodes into a 10-second, single-node approximate query that yields equivalent insight. This approach is certainly not for all use cases, but the data scientist in particular will benefit more and more over time as it is tied to both machine learning and basic exploratory analytics.
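
A minimal sketch of the idea (an illustration, not any particular vendor’s implementation): instead of scanning the full dataset, estimate the aggregate from a small random sample and report an error bound alongside it.

    # Sketch of sampling-based approximation: estimate an aggregate from a
    # small random sample with a rough error bound, instead of a full scan.
    import random
    import statistics

    population = [random.gauss(40.0, 5.0) for _ in range(1_000_000)]  # stand-in for a huge table

    sample = random.sample(population, 10_000)                        # 1% sample
    mean_est = statistics.fmean(sample)
    stderr = statistics.stdev(sample) / len(sample) ** 0.5

    print(f"estimated mean: {mean_est:.2f} +/- {1.96 * stderr:.2f} (95% confidence)")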

The tidal wave of internet of things, AI and machine-generated data streaming in from smart devices, sensors, monitors and meters (to name a few) is testing the capabilities of traditional database technologies, as big data analytics continues to grow at a tremendous rate thanks to the IoT explosion. It is no wonder that the data scientist continues to be one of the most in-demand jobs as companies look for highly skilled individuals with an open mind, creative zest and the ability to use different techniques to mine through data.

Emerging technologies will not make everyone a data scientist, but they will help make data scientists much more productive and create opportunities for some of this work to be shared by those now on the sidelines.



January 5, 2017  3:01 PM

Connecting the connected car: What needs to happen now

Nigel Upton
Cars, Connected car, Connected vehicles, Internet of Things, iot, IoT analytics, Sensor data, Sensors, Vehicle

Every major auto manufacturer is committing to the connected car in some way. What “connected” means is likely to vary according to which manufacturer you talk to, but there’s little doubt we’ll see cars get more sensors, more apps, more in-dash control systems and more automation. According to Gartner, one in five cars on the road will be self-aware enough to discern and share information on their mechanical health, global position and status of their surroundings. Industry-wide growth is expected to stay at 30% annually between now and 2020 as a result.

Less clear is how we’ll take advantage of the technology that’s working its way into millions of new vehicles. Making cars smarter is one thing; we also need to improve the IQ of roads, cities, bridges, garages, traffic systems, and more so that connected cars have things to connect to. At Hewlett Packard Enterprise, we’re helping to usher in a smarter driving experience with a unique combination of technologies, expertise and partnerships.

Car as a mobile device

Beyond the basic concept of a connected vehicle equipped with Internet access, new markets have emerged, such as vehicle-to-infrastructure (V2I), vehicle-to-vehicle (V2V), vehicle-to-cloud (V2C), vehicle-to-pedestrian (V2P) and vehicle-to-everything (V2X).

A recent study by the Centre for Automotive Research highlighted that “the average car now contains 60 microprocessors, and more than 10 million lines of software code — more than half the lines of code found in a Boeing Dreamliner airplane.” Cars are becoming increasingly intelligent, and by 2018 one in five cars on the road will be self-aware and able to discern and share information on their mechanical health, global position and status of their surroundings. This self-awareness, together with the need to be constantly on, requires reliable connectivity and internet of things solutions.

The rollout of 4G LTE, and subsequently 5G networks, will further increase the capabilities of the connected vehicle, and facilitate faster transmission rates and higher volumes of data. Tier-1 communication service providers and telcos are ideally suited to provide such connectivity while needing an IoT solutions partner to address the automotive needs.

[Image: Connected car, Nigel Upton]

The car of the future will be safer for passengers and other road users. V2V and V2X communications coupled with high-speed analytics will make this a reality, and introduce heretofore unimaginable conveniences. Just imagine being able to pay for gas and parking charges from your car. And with analytics built in, the connected car will be able to offer not only pay-as-you-drive insurance, but also pay-how-you-drive insurance — rewarding good drivers and penalizing bad driver behavior. When cars are connected, the entire ownership and driving experience is more integrated and engaging.

Collecting, crunching and communicating data

To fulfill this vision, onboard systems need to do more than just compute. They also need to collect, collate, translate and share data instantly to enact what we call microservices.

Think of a driver heading for a bridge two miles away. The wind sensor on the bridge is recording the wind speed and direction. If this data is transmitted to the cloud, the connected car platform can recognize whether that wind will affect handling and give the driver advance warning of the hazard in plenty of time to take action.

Or think of how telematics information shared between vehicles, using a concept called swarm intelligence, can give drivers accurate, real-time information on road and weather conditions. You may not know a patch of black ice is stretched across the roadway ahead, but tire sensors connected to the onboard systems in the cars ahead will detect the loss of traction. With shared telematics and swarm intelligence, those vehicles could broadcast sensor data for others to consume and analyze, allowing your connected car to automatically adjust its systems, for example by changing gears or increasing traction, and reduce the risk of losing control. In each case, connected car platforms gather and translate data — from devices with connectivity as diverse as 4G LTE or 3G/2G cellular, and low-power wide-area network technologies such as LoRa and Sigfox — to form insight that can then trigger hundreds of unseen but vital actions.
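
As a rough sketch of that swarm-intelligence pattern (all message fields, thresholds and names are hypothetical, not part of any vendor’s platform): vehicles ahead broadcast traction readings tagged with a road segment, and a following car reacts once enough recent reports indicate low grip.

    # Hypothetical sketch of V2V swarm intelligence: react when several cars
    # ahead report low traction on the same road segment.
    from dataclasses import dataclass

    @dataclass
    class TractionReport:
        road_segment: str
        traction: float      # 1.0 = full grip, 0.0 = no grip

    LOW_TRACTION = 0.4
    MIN_REPORTS = 3

    def hazard_ahead(reports, segment):
        """Return True if enough cars ahead reported low traction on this segment."""
        low = [r for r in reports if r.road_segment == segment and r.traction < LOW_TRACTION]
        return len(low) >= MIN_REPORTS

    reports = [TractionReport("I-90 mile 42", 0.30),
               TractionReport("I-90 mile 42", 0.35),
               TractionReport("I-90 mile 42", 0.90),
               TractionReport("I-90 mile 42", 0.25)]

    if hazard_ahead(reports, "I-90 mile 42"):
        print("Low traction reported ahead: reduce speed, adjust traction control")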

Coming together

Connected car services are still an optional extra for many cars, but they are fast becoming a standard item, even in family sedans and hatchbacks.

Here at Hewlett Packard Enterprise, we recently worked with IAV, a leading automotive engineering consultancy, to create a fully functional proof of concept for testing how connected sensors in vehicles and infrastructure can optimize the driving experience — from real-time monitoring of weather and road conditions to warn against approaching hazards, to checking whether your garage is already occupied, to collecting data about the state of the road surface for the relevant authorities. Connected car services can change how we drive, and keep us safer as a result.

But they’re also like any other networked technology in that the more nodes that connect and share data, the more useful the entire system becomes. That’s why analysts expect such big growth in this market and why companies are trying to simplify connected services development and deployment. A fully intelligent driving machine may not be here yet, but it’s also closer than most of us think.



January 5, 2017  12:37 PM

Blockchain: An answer to governmental hacking concerns

Tiana Laurence
Blockchain, Data governance, Data Management, Data privacy, Internet of Things, iot

Data integrity has often been associated with industries such as medical and finance, areas where dips in accuracy can create ripple effects that impact the everyday lives of people. In theory, government records should also fall in line with that. The old way of thinking is that government agencies are slow moving and have records in dusty old boxes stuffed in a warehouse similar to the one at the end of Raiders of the Lost Ark. However, as government paperwork has gone digital, the truth is that these types of records are as modernized as possible — and because of that, they require the same level of security as bank statements or medical records.

In a post-Snowden, post-WikiLeaks world, this is becoming increasingly vital. In some cases, data integrity has less to do with the actual content of records and more to do with the public trust. This scrutiny has heightened tenfold with the purported hacks into the Democratic National Committee’s email servers prior to the 2016 election. Removing the political aspect of that example and focusing solely on the cybersecurity aspect, such an attack shows the depth, range and ease of attacks that can hold powerful organizations hostage and disrupt business. Beyond the risk of foreign bodies accessing proprietary or classified data off servers, the other concern is the possible manipulation of databases, every CIO’s worst nightmare.

In a world of evolving cyberthreats, how can governments protect their data? Enter blockchain, a growing presence as we move further into the age of the internet of things. It offers novel solutions to these tough problems.

What is a blockchain?

A blockchain is a distributed database platform built from chronologically linked segments known as blocks. Each block is a list of things that have occurred over a given amount of time, and the term blockchain literally means chaining these blocks together. The technology first rose to prominence in conjunction with the Bitcoin currency, and it represents a new type of record-keeping for the digital age, one that is transparent, permanent and publicly vetted for accuracy. The general theory behind it is simple: a block is established when a vetted record is made. The block is secured through one-way cryptography (hashing), and every computer on the network has access to these records. Each block is logged in chronological order and linked to the preceding and subsequent blocks by hashes, locking it into place; should any attempt to hack the network occur, the intrusion is quickly detected, and discrepancies are recognized by other nodes on the network and restored to the proper values.
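
A toy sketch of that chaining idea (an illustration only, nothing like a production blockchain): each block stores the hash of its predecessor, so any change to an earlier record is caught as soon as the hashes are recomputed.

    # Toy hash chain: each block stores its predecessor's hash, so tampering
    # with any past record is detected when the hashes are recomputed.
    import hashlib
    import json
    import time

    def make_block(records, previous_hash):
        block = {"timestamp": time.time(), "records": records, "previous_hash": previous_hash}
        block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
        return block

    def chain_is_valid(chain):
        for i, block in enumerate(chain):
            body = {k: v for k, v in block.items() if k != "hash"}
            recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if block["hash"] != recomputed:
                return False                        # block contents were altered
            if i > 0 and block["previous_hash"] != chain[i - 1]["hash"]:
                return False                        # chain linkage is broken
        return True

    genesis = make_block(["record A"], previous_hash="0" * 64)
    chain = [genesis, make_block(["record B"], genesis["hash"])]

    chain[0]["records"] = ["tampered record"]   # any change is detected
    print(chain_is_valid(chain))                # False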

This level of public vetting ensures that no stakeholder can influence records and changes will always be identified and restored. One-way hashing also enables this public vetting without putting any data at risk. It’s the crowdsourcing model — done with technology to power security — and it’s the next big thing in cybersecurity.

How the blockchain can protect our government

In response to public outcry for transparency, the United States Congress has pushed agencies such as the Department of Homeland Security to prove that they’ve upheld data integrity. Traditional auditing is a labor-intensive process, one that could expose national security secrets and sensitive data. Administrators with the highest levels of security clearance would be needed, but there are only so many available resources. DHS began looking to Silicon Valley for a solution, and the blockchain has demonstrated significant potential by hitting these key goals:

  • Data integrity: With a permanent record constantly being vetted by network nodes, any attempted data changes/removals would be highlighted almost instantly.
  • Classification: Using one-way cryptography, it is possible to vet data without exposing sensitive information.
  • Resources: Because data remains classified and the blockchain vetting process is handled on a public network, auditing can take place without occupying high-level administrators.

Even in a case like the 2016 presidential election, where Russian hackers reportedly only lifted data rather than attempting to change it, the blockchain could have provided detection and prevention. The most compelling progress in creating blockchain systems comes from Hyperledger’s chaincode and Ethereum’s smart contracts. These groups are in the early stages of testing Turing-complete programming locked into a blockchain. Hackers would have to break the blockchain network or execute the code in unexpected ways to gain access. IT administrators of high-stakes material can establish alerts for any such attempts and take appropriate action, forming multiple layers of protection over sensitive materials.

To 2017 and beyond

In a post-Snowden, post-election world, the need for cybersecurity has only intensified. The next paradigm shift will come with the implementation of smart cities, as local governments attempt to integrate the internet of things into infrastructure, commerce and logistics. Because of that, cybersecurity is at the forefront of everyone’s mind, and blockchain represents an efficient, effective and secure way of handling that. The next step is getting Silicon Valley and the Beltway to truly collaborate — and the resulting partnership could finally bring proactive, not reactive, cybersecurity to the United States.



January 4, 2017  3:34 PM

IoT predictions: IoT security in 2017

Sharon Shea
Authentication, Internet of Things, iot, iot security, malware, Ransomware, Security

Nobody doubted that IoT security was a disaster when, well, disaster struck — the Mirai botnet took down swaths of the internet through a fairly simple, preventable attack.

But experts believe there are going to be more susceptible devices in 2017 than ever — and hackers will be on the lookout.

“Sometime during 2017 we should anticipate the release of an automatically propagating IoT worm that installs a small, persistent malicious payload that not only continues to infect and propagate amongst other vulnerable IoT devices, but automatically changes all the passwords necessary to remotely manage the device itself,” said Gunter Ollman, CSO at Vectra Networks. “The owners of the now locked-out devices will be forced to pay a ransom to the mastermind behind the worm in order to learn the new password, thereby taking the ransomware threat to the next level. To prevent this worm — and future versions – device owners will not only have to preemptively change default passwords of the devices, but also manage the patch level of the kernel software on the device to prevent exploitation of new vulnerabilities.”

Rick Howard, CSO at Palo Alto Networks, noted that while security researchers have been sounding the alarm for years, we need to make sure we’re not missing the bigger picture.

“The thing is, the network defender community as a whole already knows how to prevent about 99% of playbooks that exist on the internet — including the 2016 DDoS attacks,” Howard said. “We have not been diligent as a community to deploy those prevention controls across the entire community. Therefore, in 2017, in an effort to stop future large scale attacks leveraging IoT devices, we’ll see the network defender community begin deploying these controls for better prevention.”

However, beyond the tried-and-true security basics such as encryption and strong authentication, Ryan Lester, director of IoT strategy at LogMeIn, says the IoT security problem needs a new solution.

“IoT brings with it a whole new set of security challenges that can’t be solved by retrofitting current security solutions and following the same old rules,” Lester said. “Companies must think thoroughly about how to manage one-to-many relationships, which is an outlier in today’s more frequent 1:1 device relationship.”

Matt Rodgers, head of security strategy at E8 Security, agreed, adding that traditional tools simply won’t be effective in the connected world.

“In 2017, monitoring an IoT environment with traditional tools will no longer be an option, both cost-wise and technically for the IoT owner,” Rodgers said. “With so many devices doing so many things, an attacker will have a very large surface area to find and exfiltrate personally identifiable information, which will increase the quantity of attacks and further reduce the potential cost of each attack for the attacker.”

Geoff Webb, vice president of strategy at Micro Focus, said that Mirai was potentially just the beginning — things could be a whole lot worse next time around, and it’s time for regulations to come up to snuff.

“With the number of IoT devices expected to reach into the billions, the potential scale of a well-coordinated IoT attack could be used to present a very real threat to the critical infrastructure of this country, online banking, emergency services, and commerce in general,” Webb said. “We should expect IoT security to quickly become part of the national security agenda, and to see governments starting to evaluate the role of legislation and safety standards for internet connected devices.”

Jeannie Warner, security strategist at WhiteHat Security, agreed, “I’m expecting/hoping to see a shift from the term ‘security’ to ‘safety’ as well as an increase in legislation mandating increased rigor of IoT security testing. I think that NIST’s SP 800 or a similar body will form guidelines for a comprehensive security assurance through the integration of dynamic application scanning technology and rigorous device controls testing. New guidelines will ideally force more application security vendors to partner with device control testing labs to support manufacturing earlier in the development process, helping the innovative organization manage risk by identifying vulnerabilities early in development, continue to monitor challenges during testing, and help release more secure product.”


January 4, 2017  3:33 PM

IoT predictions 2017: Revenue, data, latency issues top the list

Sharon Shea
Data lake, Data Management, Data monetization, Internet of Things, iot, Latency, Predictions, Technology Predictions

The internet of things’ growth spurt over the past year leaves many wondering what the next 12 months will bring. Industry experts looked in their crystal balls and offered IoT predictions for the days and months ahead.

IoT prediction #1: Disembodied voices seeking recurring revenue

“Customer experience and engagement will drive business,” said Ryan Lester, director of IoT strategy at LogMeIn. “IoT product companies will rely less on the initial device purchase and more on recurring revenue opportunities, subscriptions and up-sell opportunities.”

One way Lior Blanka, CTO at DSP Group, said IoT will continue to impress users is through the voice interface.

“Amazon’s Alexa and Google’s Home Assistant are only the beginning,” Blanka said. “All of the major players will be making efforts to integrate this technology into their products. With a heavy focus on natural language processing and clarity, you’ll see algorithms and chipsets designed to enable intelligibility for two-way voice communications between users and their devices.”

“Next year will see the proliferation of IoT solutions in any number of product categories that classically have not included technology or connectivity features,” said Mitch Maiman, president at Intelligent Product Solutions. “[But] not all of the ideas will offer a significant enough value proposition to succeed in the marketplace. Expect to see new players, but also expect to see many startups fall by the wayside. Competing in this space is expensive, and there are a lot of ideas out there that lack sufficient value to keep consumers engaged.”

As the commonly cited statistic goes, nine out of 10 companies will fail within their first four months of operations. As such, Michael Beamer, president at goTransverse, knows that companies must start with a clear view of monetization, especially in IoT.

“Companies who haven’t yet figured out how to bill for these new products and services will be left behind by those who have,” Beamer said. “Without an aligned go-to market strategy, the groundwork that has been laid merely becomes a blueprint without infrastructure. Figuring out what model works, articulating ROI and understanding how an IoT initiative impacts the entire business will become mission-critical for companies looking to emerge victorious in the competitive world of IoT.”

“Increased volumes in device shipments mean costs will continue to come down,” said Dermot O’Shea, joint CEO at Taoglas. “As a result, more business plans will make financial sense. There is a great buzz around the industry now and investors are scrambling to get in to the latest and greatest IoT opportunities.”

IoT prediction #2: The data lakes will be drained

McKinsey and Company made headlines earlier in 2016 when it estimated that only 1% of data collected from IoT is ever used. While later estimates show this number increasing, it isn’t nearly where it should be. The promise of IoT hinges on its data — so what will make it more useful and consumable?

First, Adam Wray, CEO and president at Basho Technologies, recommended that organizations stop letting data lakes be holding ponds for dank runoff. “Rather than a data lake-focused approach, organizations will begin to shift the bulk of their investments to implementing solutions that enable data to be utilized where it’s generated and where business processes occur: at the edge. In years to come, this shift will be understood as especially prescient now that edge analytics and distributed strategies are becoming increasingly important parts of deriving value from data.”

Rich Catizone, CTO of Morey Corp., further sees data at the endpoint as an opportunity for increased intelligence.

“If you can take action at the endpoint, rather than shuffle data around from the cloud to the gateway, you can save time and money in data collection and storage. This also sets the stage for the proliferation of machine or ‘active’ learning,” Catizone said. “Once our devices become more peer-to-peer based rather than client-to-server, they can begin to collectively track instances and become smarter by auto-correcting their own behavior, bringing forth emerging insights that are new and novel.”

Mark Bregman, CTO at NetApp, predicted that this edge intelligence will really take off provided open platforms are used. “An open platform provides integrated and simplified access to data protection and management services and enables new approaches to data modelling and analytics that will eclipse the advances we’ve seen to date,” Bregman said.

IoT prediction #3: Latency is the enemy

“Interconnections will become very important for instantaneous access to networks, clouds and working in a multi-application environment that enables the success of IoT,” said Tony Bishop, vice president of global vertical strategy and marketing at Equinix. “The increasing number of real-time IoT apps will create performance and latency issues. It is important to reduce the end-to-end latency among machine-to-machine interactions to single-digit milliseconds.”

To address these concerns, Christian Reilly, CTO of Workspace Services at Citrix, said networks must evolve with the times.

“New devices and workflows will augment existing systems,” Reilly said. “2017 will be a pivotal year in which networks become smarter to adapt to the combinations of devices and data.”

Roei Ganzarski, president and CEO of BoldIQ, found room for skepticism in his 2017 projections: “Not enough will be done on the integration of [smart devices] in the next few years since it is less sexy and creates less news and media coverage. Thus adoption of these will be slower than people anticipate.”

If Ganzarski is right, 2017 won’t be the first year when predictions ran well ahead of actual timelines.

And what about the security of all this? Check out the experts’ IoT predictions for security in 2017.

