By 2025, 41.6 billion IoT devices are expected to generate 79.4 zettabytes of data and the collective sum of the world’s data will be 175 ZB, according to IDC. Essentially, close to 50% of world data will be generated by IoT. By 2025, 70% of data generated by IoT applications will be processed outside the conventional data center, according to Gartner.
Edge computing in IoT
Considering the amount of data generated by IoT, it is a no-brainer that data needs to be processed closer to the point where it is generated. This new model of computing is known as edge computing, and it provides significant advantages over the conventional cloud computing model.
Edge computing is well-positioned to take the challenges of IoT head-on. The latency issues found in cloud computing are mitigated by edge computing's local data processing. The case for edge computing becomes even more pronounced when the communication channel to the cloud is unreliable. In the long run, edge computing brings lasting efficiency to data processing in IoT applications.
Elements of edge computing
A few elements of edge computing include:
- Computing devices. Machine learning algorithms running on computing devices process data generated by IoT devices. A computing device can be a small form-factor server or an embedded system-on-chip board.
- Data storage devices. Data can be stored locally for later analysis or to understand real-time data behavior. Data can also be forwarded to a central data center.
- Communication infrastructure. IoT devices exchange data with computing and storage devices over a reliable communication infrastructure.
Edge computing also requires other technology, such as regulated power supplies, optional battery backup and optional cooling systems.
Some edge computing sites are remotely located, and each site might not have qualified IT staff on hand. In that case, it becomes essential to be able to connect to the devices at these sites easily and reliably. Such connectivity gives IT staff the ability to manage and control devices remotely.
Devices might malfunction, and as a result edge computing applications running on those devices will likely malfunction as well. IT staff might want to look at logs, statistics, alerts and resource consumption patterns on the devices. On many occasions, IT staff might want to upgrade the device system software and applications, apply security patches to the devices or update the learning model of machine learning applications. IT staff might also need to change the configurations, restart devices, restart applications, or delete and modify logs and statistics to bring failed systems back to a normal operating state.
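As a rough sketch of what such remote operations look like in software, here is a minimal, hypothetical device agent that dispatches management commands. The command names and log contents are illustrative, not from any particular product:

```python
# Hypothetical device agent: dispatches remote management commands
# (names are illustrative) to local actions on an edge device.

class DeviceAgent:
    def __init__(self):
        self.logs = ["boot ok", "sensor timeout"]
        self.restarted = False

    def handle(self, command):
        """Dispatch one management command received over the remote channel."""
        if command == "get_logs":
            return list(self.logs)          # let IT staff inspect logs
        if command == "clear_logs":
            self.logs.clear()               # reset state after diagnosis
            return "cleared"
        if command == "restart":
            self.restarted = True           # stand-in for a real reboot
            return "restarting"
        raise ValueError(f"unknown command: {command}")

agent = DeviceAgent()
print(agent.handle("get_logs"))    # ['boot ok', 'sensor timeout']
print(agent.handle("clear_logs"))  # cleared
```

A production agent would authenticate the caller and audit every action; this sketch only shows the dispatch pattern.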
There are four important considerations IT pros should take into account while creating a reliable remote connectivity solution for an edge computing deployment.
Security is one of the most critical aspects of any design. Security must never be an afterthought; it must be part of the solution. The remote connection channel must be secured using strong encryption and authentication algorithms. Public key infrastructure is one of the most widely adopted technologies for this, and SSH- or SSL-based tunnels are also popular solutions for remote connectivity.
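As an illustration of the encrypted-channel requirement, here is how a TLS client context might be configured with Python's standard ssl module. The certificate file names in the comment are placeholders:

```python
import ssl

# Client-side TLS context for a remote management channel.
# create_default_context() enables certificate verification and
# hostname checking by default; we also pin a modern minimum version.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

# For mutual (client-certificate) authentication, a real deployment
# would load the device credentials here; the paths are placeholders:
# context.load_cert_chain(certfile="device.crt", keyfile="device.key")

print(context.check_hostname)                    # True
print(context.verify_mode == ssl.CERT_REQUIRED)  # True
```

The key point is that the secure defaults come from the library rather than from hand-rolled crypto.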
Edge computing devices by themselves must not open up additional endpoints — or ports — that are exposed to the public internet. This can cause serious security vulnerabilities and can also increase the attack surface significantly.
Organizations must follow strong security policies for usernames and passwords on each edge computing device. Remember that these devices are often located outside the enterprise IT network or data center. Even when edge computing devices are physically secured, their interfaces, wired or wireless, may be exposed to attackers. A system is only as secure as the weakest device in the edge network.
Identifying the devices in edge computing sites can be tricky, especially if the devices are connected over a Global System for Mobile Communications network, or are behind a network address translator or firewall. The edge computing devices will not have a globally addressable IP address, and the remote connectivity solution must address this. The solution must also provide an easy way of mapping a device’s ID to the endpoint for connectivity.
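One minimal way to handle the ID-to-endpoint mapping is a registry that devices, or their relays, update when they come online. This sketch uses invented names and is not a specific product's API:

```python
# Hypothetical registry mapping stable device IDs to the relay endpoint
# where each device can currently be reached, since devices behind NAT
# or cellular networks have no globally routable IP address.

class DeviceRegistry:
    def __init__(self):
        self._endpoints = {}

    def register(self, device_id, relay_host, relay_port):
        # Called by a device (or its relay) when it comes online.
        self._endpoints[device_id] = (relay_host, relay_port)

    def resolve(self, device_id):
        # Called by IT staff to find where to connect for a device ID.
        if device_id not in self._endpoints:
            raise KeyError(f"device {device_id} is not currently reachable")
        return self._endpoints[device_id]

registry = DeviceRegistry()
registry.register("sensor-ab12", "relay1.example.com", 8443)
print(registry.resolve("sensor-ab12"))   # ('relay1.example.com', 8443)
```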
Connect at scale
Most edge computing solutions involve large-scale deployment of devices across multiple edge computing sites. These sites might be geographically dispersed, and the remote connectivity solution must account for this. IT staff must be able to connect to a large set of devices and perform operations. Remote connectivity solutions that maintain a persistent connection to each device may not scale: a persistent connection requires connection state to be maintained and refreshed at regular intervals, which becomes inefficient at scale. A solution based on on-demand connectivity has a better prospect of scaling.
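The on-demand model can be sketched as a channel that exists only for the duration of one operation, so there is no per-device keepalive state to refresh. The transport itself is stubbed out here:

```python
from contextlib import contextmanager

# Tracks which channels are currently open. In the persistent model this
# set would hold every device all the time; in the on-demand model it
# holds only the devices being operated on right now.
OPEN_CHANNELS = set()

@contextmanager
def on_demand_channel(device_id):
    OPEN_CHANNELS.add(device_id)           # connect just-in-time
    try:
        yield f"channel:{device_id}"       # stand-in for a real transport
    finally:
        OPEN_CHANNELS.discard(device_id)   # tear down immediately

with on_demand_channel("gateway-7") as ch:
    pass                                   # perform one operation here
print(len(OPEN_CHANNELS))                  # 0 -- no lingering state
```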
Managing remote edge computing sites with no IT staff locally available can impose significant overhead. Automation is the key enabler for efficient operation. It is desirable to create rules that, for example, clear logs or bring a system back to a normal operating state automatically. Remote connectivity solutions must support programmable interfaces, such as APIs, which IT staff can use to create if-this-then-that rules. Programmable APIs can also be used to pull statuses and statistics at regular intervals and feed the data to operational management systems.
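A rule of the if-this-then-that kind can be as simple as a condition callable paired with an action callable. This hypothetical sketch wires a disk-usage condition to a log-clearing action; the field and function names are invented for illustration:

```python
# Minimal if-this-then-that rule: fire an action when a condition on a
# device status holds. Field and action names are illustrative.

def make_rule(condition, action):
    def rule(status):
        return action(status) if condition(status) else None
    return rule

def high_disk(status):
    return status["disk_pct"] > 90

def clear_logs(status):
    # Stand-in for an API call that clears logs on the device.
    return f"cleared logs on {status['device']}"

rule = make_rule(high_disk, clear_logs)
print(rule({"device": "edge-3", "disk_pct": 95}))  # fires: cleared logs on edge-3
print(rule({"device": "edge-4", "disk_pct": 40}))  # None, no action taken
```

The same pattern extends naturally to scheduled polling: pull a status snapshot over the API, run every rule against it, and record what fired.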
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.
With the recent publicized ransomware and cyberattacks, medical device security has become a hot topic in the boardroom. Senior management is not only concerned about sensitive patient data being leaked. Patient safety is now also at risk.
The organizational challenges of securing medical devices
Common cyberattacks that aren't designed to harm patients are still a major threat to patient safety because, in many cases, connected medical devices are unprotected. Even in an everyday cyberattack, such as ransomware, where medical devices aren't targeted directly, patient treatments can be interrupted and devices might crash, causing service disruption.
New vulnerabilities are discovered all the time, including the Urgent/11 flaws in the VxWorks operating system, Wi-Fi vulnerabilities in Medtronic's smart insulin pumps, NotPetya, which is based on the same EternalBlue exploit as WannaCry, Sodinokibi malware targeting Microsoft Windows 7 through 10, and the Selective TCP Acknowledgment vulnerability known as SACK Panic, which resides in the TCP stack of the Linux kernel.
This is in addition to the infamous WannaCry ransomware attack, which is still active and has been blamed for shutting down more than 60 hospitals in the UK and causing more than $100 million in damages. But even though the danger is clear, and there are directives from the FDA and the Office for Civil Rights to take action, not enough is being done to protect patient safety.
Who is responsible for medical device security?
Typically, IT is primarily responsible for information security in larger hospitals, but it needs to rely on the specialized expertise of biomedical engineers to secure medical devices effectively. Sharing information and collaborating can be difficult when the relevant experts work in different departments, and communications are even more complicated when biomedical engineering is outsourced. Recently, a new trend has biomedical engineering reporting to IT, which makes collaboration easier. A new position is also emerging: the medical device security engineer, which makes one individual ultimately responsible for the security of medical devices.
However, even if one person is charged with security, hospitals typically have specialized departments such as radiology, oncology, cardiology and pediatrics that each have their own medical devices with unique connectivity requirements, behaviors and workflows. This makes it difficult for one individual to define and enforce a unified security policy throughout the hospital.
Patient safety interfering with patient care
Doctors and nurses are already at their limit caring for patients. When devices do have authentication, punching in passwords to protect patient data and safety can appear counterproductive because it slows down patient treatments. Since remembering passwords is tedious, many caregivers share logins, which can make devices even less secure.
In addition, if a medical device is malfunctioning, caregivers are likely to yank the device and replace it with another without being aware that the product failure is due to a security incident. After a manufacturer announces a security vulnerability and a patch is available, the installation needs to be coordinated with the manufacturer and all the departments to help minimize the impact on patient treatments.
If a patch isn't available, all the relevant departments need to collaborate to apply a mitigation, such as limiting device communications with access lists or implementing network segmentation. All of these measures can impact business processes related to patient care.
Collaboration with verification
Because of all the complexity and the high level of collaboration required, voluntary compliance with medical device security procedures isn't strong enough. To protect patient safety, medical device security should be fully regulated with specific, measurable requirements, and then enforced. Doctors and other caregivers should also be educated, as part of their formal training, about the potential risk to patient health of not securing medical devices.
However, there are steps that hospitals can take today without waiting for regulations and cybersecurity training to take effect. Hospitals should make sure that all the responsible people in the relevant departments share all information related to medical device operations and clinical workflows. IT security needs to be part of the procurement process so that security requirements are taken into consideration.
Hospitals need to have full visibility when it comes to medical devices, including those that were added by vendors on a trial basis. Hospitals must also have the ability to assess all vulnerabilities and prioritize them based on their impact on patient safety, service availability and data confidentiality. Following the prioritization, hospitals should implement the proper compensating controls, such as network segmentation and access control lists, to limit the attack surface. Devices should also be continually monitored for anomalous behavior to detect and prevent potential threats.
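The prioritization step could be sketched as a weighted score over the three impact dimensions. The weights and vulnerability data below are invented for illustration, not taken from any standard:

```python
# Rank vulnerabilities by weighted impact on patient safety, service
# availability and data confidentiality (each scored 0-10).
# Weights are illustrative only, not from any scoring standard.

WEIGHTS = {"patient_safety": 0.5, "availability": 0.3, "confidentiality": 0.2}

def priority(vuln):
    return sum(WEIGHTS[k] * vuln[k] for k in WEIGHTS)

vulns = [
    {"id": "VULN-A", "patient_safety": 9, "availability": 4, "confidentiality": 2},
    {"id": "VULN-B", "patient_safety": 2, "availability": 8, "confidentiality": 9},
]
ranked = sorted(vulns, key=priority, reverse=True)
print([v["id"] for v in ranked])   # ['VULN-A', 'VULN-B']
```

Note how the safety-heavy weighting puts VULN-A first even though VULN-B scores higher on availability and confidentiality, which matches the patient-safety-first prioritization described above.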
Medical device cybersecurity is a must, but it requires cooperation from everyone. A combination of training, sensible policies, enforcement and automation can help keep patients safe. Because in the end, patient health and safety are equally important.
There is a huge buzz these days around the impending rollout of 5G technology into the broad consumer and commercial marketplace. This heavily hyped new technology will surely bring value to some applications. However, there are places where the advent of a ubiquitous 5G infrastructure simply does not matter. In the realm of IoT, the potential benefits are far less clear and far from universal.
As one considers the implications of 5G in developing and executing an IoT strategy, here are some of the key considerations:
What is 5G?
Broadly speaking, 5G is a next generation cellular system solution for enhanced communications. Current cellular technology in widespread use today is often referred to as 4G or 4G LTE. 5G represents major improvements to 4G infrastructure particularly focused on two key drivers:
- Increased communication speed with lower latency
- Increased bandwidth
Certain applications will take advantage of this enhanced capability. However, 5G is not a panacea for all and comes with a few challenges largely related to the higher frequency of the signal:
- The range for a tower or cell site will be significantly shorter for 5G than for lower frequency 4G
- Because the range is shorter, there will be a need for a vastly more elaborate and extensive network of cell sites in order to provide coverage
- 5G transmission has more of a problem transmitting through walls and foliage than lower frequency networks
- For battery powered end devices, the useful battery life will be lower than with existing infrastructure because the chipsets draw more power
Because of this, the cost and logistical challenges of deploying a broadly accessible 5G infrastructure will be enormous.
When will 5G be ubiquitous?
Despite the hype, it will be many years before a ubiquitous 5G network is deployed and fully operational. Yes, there are 5G-enabled phones coming out and, yes, the cell carriers are all hyping the start of the 5G rollout. Hype aside, the fact is that even where 5G infrastructure is deployed, coverage is often concentrated in limited regions. We are still a long way from having widespread 5G infrastructure available in most regions.
Considering the challenges and potential benefits of 5G infrastructure, the effect on IoT can now be considered.
Where 5G matters
5G helps in situations that need high-speed communications and increased bandwidth, including applications that require extremely low latency, real-time communications or large data transfers. For example, a deployment of autonomous vehicles would need low latency. Real-time communications with access to shared processing infrastructure can help with highly complex analytics. An IoT application with large data transfers could involve augmented reality, where high bandwidth and speed are necessary for moving real-time video data.
IoT applications are doing quite well today without the use of 5G, but there are situations where having this could be an advantage. It is important to realize what the drivers behind 5G are. It is the large cellular carriers — such as Verizon, AT&T and Sprint — that view this as a means to compete with the large cable carriers that roll out wireless infrastructure in the Wi-Fi family. As the saying goes, follow the money.
Where 5G doesn’t matter
There are many situations today that simply do not require a 5G infrastructure. For example, applications that involve very small datasets where increased speed or bandwidth are irrelevant, edge computing applications where the processing of sensor data is performed locally, or applications that do not require real-time updates. 5G is not necessary in situations where sensor data needs to be communicated infrequently rather than continuously.
Many applications today simply do not require the benefits that 5G can bring. This is obvious in the range of products in the consumer, commercial, medical and manufacturing industries that work without a 5G infrastructure.
In summary, 5G will bring benefits in the next decade to a range of IoT applications where its fundamental capabilities are useful. The majority of current IoT solutions have little or no need for the unique capabilities of 5G, especially considering the disadvantages of implementing 5G hardware. While 5G is coming, it will be a long time before it is widely available, and it will not benefit all IoT applications.
A data scientist or IT specialist will look at a glass and wonder whether it’s half full or half empty. An operational technology engineer will wonder how the water got there in the first place.
The divide between IT and OT is one of the major challenges in digital transformation. It sounds odd, but most industrial companies don’t have just one technology department; they have two. The first department is OT, which is responsible for managing the machines and critical systems that produce products. The second is IT, which manages things such as sales tracking, customer management and analytics.
The IT/OT department conflict
IT and OT departments often don’t speak, and when they do they often don’t get along. IT generally complains that OT tends to push away their suggestions. OT generally complains that IT wants to impose technology choices that could ultimately hurt productivity. OT says that they are the department that keeps companies running. IT says they are the department that keeps companies profitable.
Why the conflict? These two departments have very different ways of looking at the world. To IT, security means firewalls and phishing defenses. To OT, security measures include fire extinguishers and razor wire. Move fast and break things was — for a while — an ideal for IT. For OT, it’s their worst nightmare. IT generally replaces equipment every three to four years. OT engineers expect to get three to four decades out of new product purchases. Education and background also play a role. While IT executives often come out of computer science departments, OT is staffed with chemical and mechanical engineers.
This difference in perspective extends to their contrasting relationship to data. To an OT engineer, data from an IoT sensor — or set of sensors — is a very fluid and very live manifestation of a process taking place somewhere. When OT engineers look at trend lines on a screen, they are really thinking about the physics behind the data. Why did a temperature reading in a fermentation chamber drop from 90.5 degrees Fahrenheit to 88.3 degrees Fahrenheit over a 15 second interval? Was it caused by an unanticipated change in the input materials, a mechanical failure or an unexpected change in ambient temperature? And how do OT engineers get the temperature back to normal?
Dealing with the IT/OT divide
With that perspective, you can start to better understand why these two departments frequently don’t see eye to eye. OT engineers might obsess over minute process steps in an effort to eke out gains in efficiency and productivity. It’s their job, and industrial finessing like this can save companies millions of dollars a year. However, IT generally has to play the bad cop role and ask whether these improvements might have the unanticipated aftereffect of increasing maintenance costs and capital replacement cycles.
In some cases, IT might unveil a brilliant plan to reduce costs by having everyone save their data to a centralized data lake in the cloud. This time, OT gets to be the critic; they might note that the time, money and extra coding required to get that data back out of the lake will far outweigh the benefits. Worse, the lack of immediate access to information will mean that the company will — for the most part — be flying blind.
IT will tell you all data is significant, so save everything. OT will tell you data is important only when it changes: just save the exceptions.
Can the two get along? Of course. In fact, you can argue that digital transformation ultimately revolves around bringing these complementary skills and perspectives together for the greater good. After all, both IT and OT want to save money, utilize resources more efficiently and experience the thrill of achieving a meaningful breakthrough on a problem that's been eluding the team for a while.
Just don’t expect them to start off on the same wavelength.
The workplace is not what it was five years ago. Today’s modern office is filled with new gadgets and technologies that are all connected to the internet, where an abundance of data is collected and shared non-stop. From computers and desk phones to room panels and lobby displays, businesses are making the move to IoT devices in the workplace not only to enhance productivity and operations, but to enable efficient communications as well.
According to Gartner’s research paper “Leading the IoT,” enterprises should expect to see 20 billion internet-connected devices by 2020. With the increasing popularity of IoT in the workplace, enterprises deploy more devices than ever before to improve efficiencies and create new business opportunities. In fact, internet-connected devices are anticipated to outnumber humans four-to-one by just next year.
This increasing number of IoT devices in the office can be attributed to the migration of the workforce to cloud services. With facilities for servers becoming costly and inefficient to operate, organizations are moving to the cloud to better manage massive amounts of data. In fact, Gartner estimates that 80% of large enterprises in North America will have shut down all of their own data centers by 2022, accelerating the move to IoT. Here are a few examples of devices that are embracing IoT:
Unified communications devices
The rise of instant messaging, voice over IP, and the introduction of the cloud into the workplace over the past five years has drastically evolved unified communications (UC) in the workplace. As the world of UC continues to transform, end-users are beginning to reap the benefits of connecting these devices to the internet.
As an example, video conferencing tools are switching from on-premises software to cloud-based servers, such as Zoom, Microsoft Teams or WebEx Teams. Today’s devices — whether they are video conferencing tools or desk phones — are connected directly to the internet rather than a local server, which enables increased functionality, the transmission of data in real-time and an overall better user experience. As IoT enhances these smart devices, organizations are not only using UC devices to their fullest, but also creating a collaborative workspace for employees.
Cloud-based phone systems
With mountains of cables, phone bills and inefficient, complex configurations, legacy phone systems are becoming extinct as more organizations move to cloud services to manage their data. With a cloud-based phone system, data is stored on a secure server that can be reached over the internet. This eliminates the need for expensive hardware and cables and replaces traditional landlines. Usually hosted by a third-party provider, a cloud-based phone system presents a myriad of benefits, including cost savings, a simpler and faster setup, and remote capabilities built for a mobile workforce. Workers save the time they previously spent maintaining complicated legacy systems and relying on specific IT resources for complex configurations, freeing them for more pressing business matters.
Digital signage tools
No matter where you are in the office, it is imperative to keep employees in the loop. Creating a steady flow of information throughout all levels of your business can lead to enhanced communications. Organizations are turning to digital signage to instill corporate culture and convey information effectively and efficiently.
Prior to IoT and the cloud, a lot of effort and time was needed to manage an organization’s digital signage system. Companies found that an IoT-connected digital signage device could be scaled more effectively, making it possible to deploy them around the world. Additionally, IoT devices can be connected to several services and data sources so that the information being displayed and transferred is timely and relevant. With help from cloud services, organizations can have the ability to efficiently manage this data without wasting time on server management, complicated software installation or IT issues.
The move to the cloud and the introduction of IoT have simplified the internetworking of smart devices, thereby enhancing and enabling communications. With more devices introduced to the workplace and more data entering the cloud, organizations need to understand how to successfully deploy, monitor and manage these devices. Establishing a policy around the use and security of these devices — such as where they can be used and how they are connected to the office network — is critical to the success of IoT devices. With a policy and successful deployment strategy in place, organizations can begin to reap the benefits of the cloud and IoT devices in the workplace.
Whether referenced as the fourth Industrial Revolution or the connected society, the billions of connections forming IoT will change the game for industries, enterprises and consumers. However, revolution is always triggered by the rise and demand of certain driving factors. Similarly, IoT and its development are also backed by certain key components. Ultimately, if organizations want to prosper with IoT, they need to consider who and what is driving this innovation.
Market drivers are the underlying forces and trends that make markets develop and grow. What are the drivers for IoT? What trends affect the mobile industry but also change and expand IoT? With emphasis on the cellular IoT, explore the key proponents that will enable mobile IoT solutions.
After careful analysis of public market information, opinion articles, research papers and tech company websites, 5G Americas identified 14 cellular IoT market drivers, in no particular order:
- Third Generation Partnership Project (3GPP) standards. 5G will enable massive IoT and is being developed through the 3GPP technology standards to ensure a successfully connected world. The 3GPP work initiated in release 13 continues now in the 5G standards of releases 15 and 16 allowing a future-proof technology.
- Expanded internet connectivity. Machine-to-machine (M2M) connections will comprise 51% of all connected devices, numbering 14.6 billion by 2022, according to Cisco Visual Networking Index “Global IP Traffic Forecast, 2017-2022.”
- Worldwide high mobile adoption. The number of devices connected to IP networks will be more than three times the global population by 2022, according to “Ericsson Mobility Report November 2018.”
- Ubiquitous sensors. Sensors are becoming prolific; the larger the scale, the lower the cost. Lower-cost sensors decrease the cost of deploying IoT — including decreases in the cost of CPU, memory and storage — leaving more investment dollars for large processing systems.
- Large scale IoT investments. Large scale IoT investments are occurring in the market today, predominantly in industrial markets. The worldwide spending on IoT will continue an annual growth rate in the double-digits from 2017 to 2022 and exceed $1 trillion in 2022, according to IDC.
- Global application trends. For example, the use of video has grown tremendously in healthcare, public safety, entertainment and surveillance. Globally, IP video traffic will be 82% of all IP traffic for businesses and consumers by 2022, according to Cisco Visual Networking Index.
- New mobile tech. The emergence of new mobile technologies, including Narrowband IoT and LTE-M (Category M1), adds 3GPP standards that will coexist with 5G. The number of connections and the traffic per connection over cellular networks will drive traffic volumes as organizations deploy IoT technology and network speeds increase. In 2017, the average mobile connection speed was 8.7 Mbps, and it will more than triple to 28.5 Mbps by 2022, according to Cisco Visual Networking Index.
- Big data. There is a growing importance for automation, big data and other actionable knowledge that drives IoT by providing interconnection between various devices, machines or appliances that generate data. The goal is not merely to collect data, but also to extract valuable insights and information from the data generated by these devices.
- AI and machine learning. AI and machine learning have been around for some time, but certainly not at the level of implementation seen today. According to McKinsey, AI and machine learning were being used in 60% of IoT activities by 2018. This is spurred by the convergence of algorithmic advances, data proliferation and tremendous increases in power and storage capabilities at a lower cost. AI and machine learning are expected to outpace other technologies in 2019, according to McKinsey.
- Edge infrastructure. Edge computing and the cloud are slightly unpredictable due to network transformation. The low latency and reliability provided by edge computing are requisites for most IoT use cases. According to IDC, IoT deployments are fueling aggressive investments in infrastructure for new compute, storage and networking technologies at the edge. Deploying edge infrastructure, in turn, will drive many greenfield implementations resulting in the growth of the IoT market.
- Use cases. More use cases across vertical domains are providing the business case for greater market expansion. M2M applications that serve these use cases across many industries accelerate IoT growth, especially in industrial IoT, smart cities and autonomous vehicles. Enhanced network capabilities, such as optimized voice quality, more accurate device positioning and support for device mobility at high speed, are emerging and further stimulating the IoT application market.
- Security. IoT's biggest challenge and concern is security. Through new 5G technology development, security assurance has evolved. Operational technology (OT) leaders are citing their concerns and moving forward with a focus on well-encrypted and secured networks. 5G takes mobile security to another level with a wide variety of new, advanced safeguards.
- IPv6 adoption. The transition from IPv4 to IPv6 is crucial. These developments are important because Asia, Europe, North America and Latin America have already exhausted their IPv4 allotments and Africa is expected to exhaust its allotment this year. Looking to 2022, Cisco expects 60% of IPv6-capable devices to be connected to an IPv6 network and IPv6 traffic will amount to 132 exabytes per month or 38% of total internet traffic.
- Open source for 5G. 5G will enable IoT, with the ability to connect tens of billions of sensors in the next decade. This level of scale may be supported by open source frameworks and platforms within the 5G infrastructure. Open source technologies support rapid innovation, easing concerns over intellectual property rights. They also permit innovation by integration, meaning developers create new systems by combining freely available open source components.
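To put the IPv6 driver above in concrete terms, Python's standard ipaddress module shows the jump in address space that lets every sensor have a globally unique address without NAT:

```python
import ipaddress

# IPv4 has 2**32 addresses in total; IPv6 has 2**128.
ipv4_total = ipaddress.ip_network("0.0.0.0/0").num_addresses
ipv6_total = ipaddress.ip_network("::/0").num_addresses

print(ipv4_total)              # 4294967296
print(ipv6_total == 2 ** 128)  # True

# Parsing an address from the IPv6 documentation range:
addr = ipaddress.ip_address("2001:db8::1")
print(addr.version)            # 6
```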
5G and IoT will revolutionize society, and this is only the tip of the iceberg. The IoT market will continue to develop with these 14 drivers and trends to energize and innovate for the future connected society and Industry 4.0.
With close to 400k followers, the Twitter account named Internet of Shit is the epitome of the dire situation emerging as we stumble into a world of connected devices.
Governments around the world are waking up to a reality where the severity of vulnerable technologies grows with our increasing dependence on technology. While hackers enjoy everything from ransomware to stealing compute power for cryptocurrency mining, most of the world stands watching the experience like a horror movie in slow motion.
With some very basic steps, we can greatly reduce the vulnerabilities of the world's connected devices. Unfortunately, cybersecurity has become such a lucrative business that, instead of focusing on the basics, the industry pushes out a plethora of prevention technologies so advanced that only the upper echelon of IT experts can understand them.
Make IoT more secure
Like many other places in life, it seems that the 80/20 rule also applies to cybersecurity. With 20% of the effort, one can achieve 80% of the protection needed. Here are four simple steps that will greatly improve the security of connected devices:
- All communication must be encrypted. Ensuring that all communication to and from the device, independent of the protocol used, is encrypted prevents man-in-the-middle attacks and the ability to sniff out information. It also avoids the potentially devastating consequences of doing something unfortunate, like accidentally sending a password to the device in clear text.
- Never use hard-coded usernames and passwords. If a connected device is stolen or physically tampered with, it should not be possible for an attacker to log in to it. Resetting a device should not restore a default username and password or allow the device to rejoin a larger network without proper re-authentication.
- Scan all software for known vulnerabilities. It is estimated that more than 90% of all compromises are due to exploits of known vulnerabilities. As a producer of connected devices, it is critical that all software is frequently scanned against known vulnerabilities as defined in the Common Vulnerabilities and Exposures (CVE) database. All connected devices use third-party libraries and software, including Linux and popular protocols, that hackers probe for holes. If even one in every 1,000 lines of code contains a bug, hackers are and will remain successful at finding new vulnerabilities. Doing a scan once before shipping a device does not suffice; scanning must be an ongoing exercise. Preferably the scan happens on the device, but it is also possible to scan against an inventory database if the vendor has full control of all the software running on the device.
- Patch CVEs. The last basic cybersecurity measure is to ensure that when any connected device contains software with a CVE, these devices are patched as soon as possible.
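As a minimal sketch of the scanning step, the loop below checks a device's software inventory against a list of known-vulnerable versions. The inventory, CVE identifiers and versions here are hypothetical placeholders; a real scanner would pull advisories from the CVE/NVD feeds and use proper version-range matching rather than exact comparison.

```python
# Hypothetical device software inventory: package name -> installed version
inventory = {
    "openssl": "1.1.1a",
    "busybox": "1.31.0",
    "mqtt-client": "2.4.1",
}

# Hypothetical known-vulnerable entries: (CVE ID, package, affected version)
known_cves = [
    ("CVE-XXXX-0001", "openssl", "1.1.1a"),
    ("CVE-XXXX-0002", "busybox", "1.29.0"),
]

def scan(inventory, known_cves):
    """Return the CVE IDs whose affected package/version is installed."""
    return [
        cve_id
        for cve_id, package, affected_version in known_cves
        if inventory.get(package) == affected_version
    ]

findings = scan(inventory, known_cves)
for cve_id in findings:
    print(f"Vulnerable component found: {cve_id}")
```

Because the scan is just a comparison between an inventory and an advisory feed, it can run on the device itself or against a central inventory database, as described above.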
Most CVEs are made public only after the vendor of the affected software has developed a patch for the vulnerability. Frequent scanning and patching of CVEs reduces the chance of a compromise tenfold.
Basic cybersecurity hygiene will improve the security of connected devices by an order of magnitude in just a few steps. Avoid the temptation of buying into advanced cybersecurity tools that in reality only cover edge cases. Spend the time and effort to implement basic hygiene procedures. If you do that, you are much less likely to end up on the Internet of Shit feed for the world to laugh at.
The extreme storage demands of data-intensive applications are fueling tremendous growth in edge networks, and it’s opening a Pandora’s box of potential problems. While it’s a given that applications such as IoT, connected vehicles, augmented reality, gaming and 5G networks are storage hogs, they also have demanding data processing requirements.
International Data Corporation’s “Data Age 2025” report predicts IoT devices alone will create 90 zettabytes of data. That’s 90 sextillion bytes. This data deluge will send storage demands at the edge soaring. According to industry analyst firm Gartner, 75% of the data will be processed at the edge by 2020. That’s a huge shift from today, when researchers say 91% of the data is generated and processed at the central datacenter or cloud.
As network edge processing grows, new technologies are emerging to help enterprises keep pace with rapidly rising storage needs and meet application requirements such as lower latency, reduced cloud egress bandwidth, compliance with government regulations and the security of sensitive data kept local. For example, IoT devices may need to make processing decisions locally and quickly, and cannot tolerate the latency of reading and writing data in the cloud.
The small scale of each edge cloud makes it increasingly difficult to gain economies of scale for the infrastructure. There is less flexibility in resource allocation since there are fewer resources to allocate. All of this makes it even more critical than in core clouds to use the infrastructure efficiently and consolidate resources to fit many diverse workloads on the same infrastructure. The criticality comes not only from a total cost of ownership (TCO) perspective, but also from the need to support applications that must run at the edge. After all, it may not be possible to easily add more infrastructure in the same location.
The physical location of an edge cloud is based on its proximity to the edge devices it serves. This factor implies special hardware requirements for the infrastructure equipment and might require flexibility in form factors, power, cooling and mechanical serviceability. For example, how do technicians access the system to replace a drive? The constraints imposed by the range of physical locations of edge clouds require a flexible infrastructure that can work with different hardware form factors. Hardware suitable for one edge cloud is not always ideal for, or available to, another.
Finding the best TCO storage solution for edge clouds
There are several critical factors when considering the right storage solution for edge clouds, but none is more important than TCO. Delivering the optimum TCO typically involves the following:
- Using standard hardware for compute, networking and storage with a low number of server models.
- Deploying lower-cost flash drives, such as quad-level cell (QLC), datacenter-grade or even consumer-grade drives, rather than hard disk drives (HDDs).
- Disaggregating storage to allow independent scaling of compute and storage, with more optimized storage use.
- Reducing system service requirements and increasing operational efficiency.
Other factors to consider for edge cloud storage
Edge locations are difficult to service efficiently: each site is relatively small, likely remote and geographically distributed. This reality makes on-site service costly, so any edge cloud storage solution deployed should help reduce service time. Moreover, when adding capacity to edge locations, avoiding drive failure is critical for reducing in-person maintenance requirements. This can be achieved by moving from HDDs to solid-state drives (SSDs) and adding a smart layer that manages the SSDs and reduces their failure rate.
The flash management layer is important for serviceability because many drive failures are, from an application perspective, transient failures that are not included in the mean time between failures (MTBF) calculation. In reality, with a direct-attached storage (DAS) architecture, such transient failures might cause application failure and might even trigger a reconstruction of the data, imposing an unnecessary load on compute and network at the edge. Serviceability also comes into play when organizations need to add storage or compute, which must be possible in an easy, plug-and-play fashion.
With limited edge location space, achieving high density and flexibility in form factor choice is critical. Each edge instance can have different physical requirements while providing the same user experience. One way to achieve high density with limited floor space is to deploy high-capacity flash drives in new form factors and implement an architecture that maximizes resource use and allows the best compute-to-storage ratios.
Another important consideration is that edge locations require special care for security and privacy. In some cases, data privacy is a significant driver for keeping the data at the edge rather than uploading it to a public cloud. For example, as IoT devices are spread throughout the globe, sending the data from these devices to a secure edge cloud is essential. Some security concerns can be addressed with data-at-rest encryption if the drives support it or with a software implementation of data encryption. Other concerns arise when different applications run on the same hardware. Any edge storage solution must provide data and performance isolation guarantees between colocated applications. Each application must only access the data it has permission to access.
NVMe/TCP for edge storage
One of the most practical methods for separating storage from compute is non-volatile memory express (NVMe) over TCP, or NVMe/TCP. With the NVMe/TCP standard ratified in late 2018, production-grade storage solutions are now being built that provide NVMe performance over a choice of networks without any constraints. NVMe/TCP solutions are especially important for edge deployments, where imposing network constraints can be impossible. Thanks to this new protocol, storage solutions can disaggregate storage over any IP network. They can cluster several proximate edge locations into a high availability storage pool accessible within the cluster, or let stateless edge instances seamlessly use storage at the aggregation layer.
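As a rough illustration of what attaching a disaggregated volume involves, the sketch below assembles the argument list for the standard nvme-cli connect command that an edge node would run. The address, port and subsystem NQN are placeholder values, and the sketch assumes the nvme-cli tool and the nvme-tcp kernel module are available on the host.

```python
def nvme_tcp_connect_cmd(traddr, trsvcid, nqn):
    """Return the nvme-cli argument list to attach an NVMe/TCP subsystem."""
    return [
        "nvme", "connect",
        "--transport", "tcp",       # use the NVMe/TCP transport
        "--traddr", traddr,         # IP address of the storage target
        "--trsvcid", str(trsvcid),  # target port (4420 is the default)
        "--nqn", nqn,               # NVMe Qualified Name of the subsystem
    ]

# Placeholder target: a storage pool at the aggregation layer
cmd = nvme_tcp_connect_cmd("192.0.2.10", 4420, "nqn.2018-01.example:edge-pool")
print(" ".join(cmd))
```

Once connected, the remote volume appears to the edge node as a local NVMe block device, which is what allows the application servers to remain stateless.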
These solutions use off-the-shelf, standard servers and allow organizations to choose the servers that best fit the physical edge location, including non-standard form factors. While continuing to leverage the existing supply chain to reduce the overall TCO, the disaggregation of storage from compute enables stateless application servers and allows efficient, quick and independent scale of compute and storage according to application needs. The disaggregation also improves TCO by eliminating the stranded capacity problem present in DAS deployments.
The disaggregation of storage also helps alleviate on-site technician service costs, which are often a pain point of edge deployments. Some software-defined storage and hyper-converged infrastructure solutions base their handling of drive failures on large-scale deployments, with massive numbers of servers and drives participating in the rebuilding process. At the edge, this is not enough since there are fewer infrastructure racks. New NVMe/TCP solutions are designed to extend the usable life and improve the performance of flash memory by managing the drives directly and generating a precise, flash-friendly workload.
When building infrastructure for edge cloud computing to process the data from billions of IoT devices, storage is an essential building block. Choosing the best storage solution for the edge cloud means meeting edge-specific requirements. As organizations move more applications to the edge, storage must keep up, delivering low latency, low TCO, higher density and smaller form factors. NVMe/TCP storage solutions are designed for edge cloud infrastructure; they optimally serve the needs of the edge and enable organizations to fully benefit from its potential.
As interest in IoT soars, organizations must avoid sliding into shiny new object syndrome and validate their IoT investments by showing that they will create revenue or save costs. A year ago, I wrote about this in an article for IndustryWeek headlined “IoT and ROI: Two acronyms that need to go together.”
I still believe this to be true, but with an asterisk: Don’t get too hung up on the initial ROI.
In the last year, IoT’s momentum has continued to advance. Two-thirds of respondents reported working on IoT projects or planning to within the next 18 months, according to the IoT Developer Survey 2019 by the Eclipse Foundation. “IoT development is expanding at a rapid pace, fueled by the growth of investments in predominantly industrial markets,” the study said.
As more companies embark on their IoT journeys, I’ve noticed a tendency by some to obsess on IoT ROI. They become overly granular and nit-picky in their attempts to quantify its value to the C-suite.
This can be as debilitating as failing to prove IoT’s worth in the first place. Excessive analysis blocks the journey and slows development of new applications that could have increased efficiency, enhanced customer service and created entirely new business models. This approach puts the company at risk of falling behind competitors that emphasize experimentation and agility over an abundance of caution.
As I wrote last summer and which still holds true today, IoT is “still a relatively nascent technology for most companies” and “its benefits aren’t as well understood as more established technologies such as cloud or data analytics yet.”
That means IoT advocates inside companies must work harder to demonstrate how the technology helps drive and achieve important corporate goals so they can garner the executive support and funding to move projects forward.
However, I’ve seen a few organizations slide into analysis paralysis, spending months filling spreadsheet after spreadsheet with calculations of an IoT project’s costs and returns when, in fact, as with any new technology, these details are seldom as easy to predict as our data-centric minds would like.
I’ve seen some organizations try to evaluate down to the nickel what items like software, maintenance, support and updates might cost, even though such items are very difficult, if not impossible, to plan for with any precision because there are so many unknowns in the emerging IoT market they are trying to capture.
Think of it this way: Does anyone do an ROI study before joining a gym? There is no guarantee they’ll lose weight or feel better, but it sure seems like a reasonably certain outcome. Why not just start, participate and see where you are in a year based on a thoughtful plan?
This is how companies need to look at IoT: Its game-changing potential is so evident that there is danger in letting the perfect — an unreasonable pursuit of totally buttoned-up ROI forecasting — be the enemy of the good — simply proceeding, with reasonable expectations of success.
So to refine what I wrote last year, it remains essential to show the C-suite that IoT is key to the future of the business and justify why it’s worth strategic investment.
Get into the game with proof-of-concept pilot projects that demonstrate IoT’s value. Build up IoT-adjacent skills in data science, artificial intelligence and security across the organization.
But all the while, don’t waste too much time spinning up exhaustive ROI models. It’s a fool’s errand given the inherent unpredictability of any new technology, and it could mean being late to the game and having to play catch up, which can be a loser’s game.
Over the last century, the global population has more than quadrupled. In 1915, there were 1.8 billion people in the world, according to Harvard Business Review. Today, there are an estimated 7.7 billion, and we might reach 9.7 billion by 2050, according to the United Nations.
As a result, food demand will escalate with an estimated increase of 59% to 98% by 2050, according to a Harvard Business Review article titled “Global Demand for Food Is Rising. Can We Meet It?” This will place added pressure on the agriculture industry worldwide to increase crop production, boost yield-per-acre and improve sustainability. Advanced technology such as IoT is instrumental in transforming farms — no matter the size — into highly efficient enterprises. By utilizing smart practices such as precision farming, modern farms and food processors will step up to the challenge and help feed the world population.
Today’s state of the farm
Farmers today are tech-savvy professionals who must master the economics of supply and demand, coax life from the soil, partner with nature for food safety and sustainability, and overcome the brutal extremes of weather. Just as other industries have experienced wide-reaching changes from digitalization, farming has been modernized to leverage data, predictive science, machine-to-machine connectivity and cloud computing.
In some regions, progress is in the early stages as small to mid-sized family-owned farms still outnumber the corporate-like agribusinesses, which are gaining traction and bringing agriculture technology (agri-tech) into the spotlight. But even small farms leverage technology. It’s estimated that 15% of small farms will leverage precision agri-tech in the next year.
This starts with the machinery, which is a critical element to a farm’s productivity level. Modern tractors are complex pieces of equipment using GPS technology, embedded sensors and dashboards that resemble a commercial jet’s cockpit. Sensors not only help guide straight rows in the field, but they also provide valuable information about the performance of the equipment, such as when it is due for preventive maintenance. Unexpected downtime during planting or harvest season can be disastrous.
The issues are global. Farmers worldwide will need to harvest more crops, either by increasing the amount of land dedicated to farming or enhancing productivity on existing agricultural lands. In highly populated areas, more land is not available. Getting smarter is the only choice.
The U.S. has long been a superpower in food markets and a major exporter. But China typically out-produces the U.S. by taking advantage of a large labor force, estimated to be as high as 315 million laborers. India is another mega producer. With China and India having the world’s largest populations by a wide margin, they both consume most of their own products. These top three producers far outpace the other contenders: the U.S., China and India each produce more food than the entire European Union combined.
Growing trends and results
IoT device installations in the agriculture world will increase from 30 million in 2015 to 75 million in 2020 for a compound annual growth rate of 20%, according to Business Insider Intelligence.
Closely related, the global precision farming market is also anticipated to escalate. A new report by Grand View Research, Inc., says it will reach $10.23 billion by 2025. The U.S. currently leads the world in IoT smart agriculture, producing 7,340 kg of cereal per hectare (2.5 acres) of farmland compared to the global average of 3,851 kg per hectare.
This efficiency could improve in the coming decades as farms leverage IoT applications and data generated from sensors. As more sensors are used and embedded in machinery, storage units, barns and facilities — even on livestock — the average farm could generate as much as 4.1 million data points per day in 2050. Software to analyze the data and form conclusions will be the key to turning data into insights.
This is not just a challenge for farms and the agriculture community; it impacts all aspects of the food industry supply chain. Food and beverage manufacturers depend on a network of suppliers to provide the ingredients for their businesses. They are experienced at dealing with the fluctuations in quality and quantity that occur in nature, but the tightening of supplies in some areas has raised the emphasis on sustainability initiatives and the value technology can bring to ensuring a more predictable supply.
IoT applications in food and beverage manufacturing
Soil sensors: Sensors and IoT technology can monitor soil conditions, analyze data points and use AI-driven analytics to automatically determine proactive treatments such as irrigation or use of fertilizers. Farmers can identify proper resource allocation, track results and leverage machine learning to refine optimal best practices. Over time, the system will learn the best combinations and how to adjust applications.
Herd health: For farms with livestock, sensors can monitor herd weight and other signs of herd health, such as milk production in dairy cows. Sensors and timers can also automate feeding cycles, which help control the diet of the animals as needed. Breeding can also benefit from controlled environments, including brooding barns and hatcheries, which require strict temperature control.
Equipment location and maintenance: IoT technology also helps maintenance by tracking the physical location of assets. Modern farms can be massive. Many are spread over miles of land with multiple pieces of machinery in operation at once. Being able to find the machinery and its operator offers a layer of safety precaution. Sensors embedded on equipment can monitor for early warning signs of failure so parts can be ordered. Technology helps make service a science, not an afterthought.
Agricultural robots: Robotics will also use IoT technology and AI-driven analytics to aid in decision-making and automation of tasks. With escalating labor costs, robotics can be valuable in helping to execute routine tasks, from unloading trucks to stacking supplies, inventorying resources and delivering feed or pharmaceuticals to livestock.
Sensors and temperatures: Temperature control is one of the most important ways IoT technology can assist farmers and manufacturers. Sensors embedded in crates, storage containers and refrigeration units can monitor ambient conditions, creating alerts when conditions begin to slide toward non-compliance. Early warning signs of non-compliance give personnel time to intervene before shipments spoil.
Manufacturing machinery: At processing and manufacturing plants, complex equipment such as ovens, freezers, conveyors and forklifts is used in preparing and packaging products. IoT technology can use embedded sensors to monitor the operating performance of the machinery, looking for early warning signs of equipment failure. Intervention and preemptive maintenance will keep the machinery operating without shutdowns.
Advanced logistics: Monitoring smart transportation, storage and processing is also crucial for making sure food goes from the farm to the manufacturer safely and quickly. Tracking where shipments are in their journey helps ensure deliveries stay on time and on the right route. Shipments can also be re-routed in case of last-minute changes.
Traceability: Knowing where food and food ingredients come from is increasingly important to today’s consumer. IoT technology helps track origins of food so manufacturers and retailers can make promises about organic growing conditions and non-GMO foods.
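The temperature-compliance pattern described above reduces to a simple classification rule: readings inside the acceptable range are fine, readings near its edges raise an early warning, and readings outside it are flagged as non-compliant. The thresholds, warning margin and sample readings below are hypothetical values for a refrigerated container.

```python
# Acceptable cold-chain range for a refrigerated container, in Celsius
LOW_C, HIGH_C = 0.0, 4.0
WARN_MARGIN_C = 0.5  # warn before a reading actually leaves the range

def classify(reading_c):
    """Classify a temperature reading as ok, warning or non-compliant."""
    if reading_c < LOW_C or reading_c > HIGH_C:
        return "non-compliant"
    if reading_c < LOW_C + WARN_MARGIN_C or reading_c > HIGH_C - WARN_MARGIN_C:
        return "warning"  # early sign that conditions are sliding
    return "ok"

# Example telemetry from a container sensor over a few sampling intervals
for reading in [2.1, 3.3, 3.7, 4.2]:
    print(reading, classify(reading))
```

The warning band is what gives personnel time to intervene before a shipment spoils; alerting only on the hard limits would leave no margin for action.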
The value of adopting new sensor-driven practices lies in the ability to analyze the data and draw meaningful conclusions. With the right analytics and reporting tools, companies can make well-informed decisions based on data insight, not hunches. Data obtained through IoT technology will help organizations make better use of resources, extend the lifecycle of equipment and boost the yield of crops and livestock. These insights can inform strategic decisions about future activities and investments, leading to improvements in performance, yield and profitability. Perhaps most importantly, the entire food and beverage ecosystem will be able to feed the growing world population.