The emergence of the internet of things in recent years has fired the imagination of innovators. IoT is increasingly finding use in new areas, opening up new possibilities. One of those areas is care for farm animals.
With IoT at their aid, farm owners are maintaining more accurate data on farm animals and increasing their productivity. In addition, using IoT lays the foundation for precision farming.
Consumer preferences and demands are evolving, in turn forcing industries to evolve for the better. Regulations are set in place and keep evolving to ensure customer satisfaction. The agricultural industry has also experienced this evolution. Consumers today want to know the origin of their products — the state of the livestock that produce the milk, the condition of the livestock at the time of slaughter, the medication given to the animals, its effects and side effects, and so on.
Farmers, therefore, need to keep a meticulous record of the diet and medication of livestock. For example, farmers in Germany must maintain documentary details of antibiotics used (according to DART 2008; USDA 2011). Conventional means of acquisition and maintenance of data are not efficient enough to do the job. To fulfill industry requirements and consumer demands, several variables need to be rigorously monitored. IoT is emerging as a credible enabler towards these efforts.
According to a 2013 United Nations projection, the world’s population will grow to approximately 9.6 billion by 2050. This implies that the demand for food, including food that originates from the agricultural industry and specifically from farm animals, will grow substantially. It is therefore imperative that the industry grow proportionally and that farm animals become more productive.
By 2050, the world’s demand for meat is expected to reach around the following figures: 106 million tons of beef, 25 million tons of mutton, 143 million tons of pork, 181 million tons of poultry and 102 million tons of eggs.
How IoT addresses the needs
Sensors are now added to wearables to gather information from livestock to improve productivity and livestock health. Information is acquired about an animal’s behavior, health, injuries, medical regimen and other similar statistics, such as lactation and fertility. With the inclusion of technology, data acquisition has become simpler and more effective. Consequently, extensive data processing is possible. Scientists can collect data and improve medicine and diet regimens based on the collected information. Farmers know what to do and when to do it. For example, sensors deployed on farms inform farmers of system failures, such as a ventilation failure. Timely notification of possible failures saves lives and prevents health issues in livestock.
IoT is addressing the demands of the agricultural industry in various ways, some of which are as follows:
- Improving offspring care: Monitoring the health of offspring and the variables around them ensures healthy and productive livestock. Doing so reduces the number of lives lost to preventable factors.
- Mitigating health hazards: Sensors can detect potential health hazards such as toxic gases that might result from incompetent or failing ventilation systems. This also helps maintain livestock health. The result is an overall increase in productivity.
- Round-the-clock animal tracking: This makes it easier to locate animals on larger pastures and farms.
- Leveraging fertility windows: Each animal has its own season and a specific window in which maximum results are achievable. Sensors monitor livestock health and conditions so that the window is not missed.
- Enhancing lactation: Sensors detect which cow needs a specific diet to improve its milk production. In addition, the system notifies the farmer of the best time to milk a specific cow.
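The kind of monitoring described in the list above boils down to simple rules evaluated over incoming sensor readings. Here is a minimal sketch in Python; the field names and thresholds are illustrative assumptions, not actual industry values:

```python
# Illustrative sketch: flag possible ventilation failure and heat stress from
# hypothetical barn sensor readings. Thresholds are assumptions for this demo.

AMMONIA_ALERT_PPM = 25.0   # assumed safety threshold for ammonia buildup
TEMP_ALERT_C = 30.0        # assumed upper comfort limit for livestock

def check_barn(reading):
    """Return a list of alert strings for one sensor reading."""
    alerts = []
    if reading["ammonia_ppm"] > AMMONIA_ALERT_PPM:
        alerts.append("possible ventilation failure: ammonia high")
    if reading["temp_c"] > TEMP_ALERT_C:
        alerts.append("heat stress risk: temperature high")
    return alerts

readings = [
    {"barn": "A", "ammonia_ppm": 12.0, "temp_c": 22.5},
    {"barn": "B", "ammonia_ppm": 31.4, "temp_c": 27.0},
]
for r in readings:
    for alert in check_barn(r):
        print(f"barn {r['barn']}: {alert}")
```

A real deployment would push these alerts to a farmer’s phone rather than print them, but the principle — small rules applied continuously to sensor data — is the same.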
The inclusion of IoT in agriculture is adding an exciting new dimension to agriculture and helping it evolve to cater to growing demand for food. Monitoring various small variables enables researchers and farmers to efficiently tend to the needs of their livestock. This lays the foundation of precision farming. By addressing minute needs before they develop into big challenges, the system becomes efficient and productivity steadily increases in a way that enables it to meet the ever-increasing demand for food.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.
Historically, much of the value that vendors associated with software was in the algorithms and the code. Well, that and the lock-in created by the dominant market share of proprietary software products and their proprietary formats. Sure, many companies build proprietary technologies on top of open source code. This sort of arrangement is especially prevalent in the public cloud world. Because cloud providers don’t distribute their products in the traditional sense, licenses impose fewer restrictions on how they can use open source software as part of their technology.
However, some public cloud providers also contribute to open source projects. Google’s contributions to Kubernetes and TensorFlow are cases in point. Open source projects with strong communities have demonstrated their effectiveness as a development model that’s impossible to ignore.
TensorFlow is a particularly interesting case in the context of IoT and machine learning. Google describes it as “an open source software library for numerical computation using data flow graphs. Nodes in the graph represent mathematical operations, while the graph edges represent the multidimensional data arrays (tensors) communicated between them.” Open sourced by the Google Brain team, it reached version 1.0 earlier this year.
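To make that quoted description concrete, here is a toy dataflow graph in plain Python: the nodes are operations, and the edges carry tensors (here, nested lists) between them. This is a conceptual sketch of the programming model, not TensorFlow’s actual API:

```python
# A toy dataflow graph: nodes are operations, edges carry tensors between them.

def matmul(a, b):
    """Multiply two matrices given as lists of rows."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def relu(m):
    """Element-wise max(0, x) over a matrix."""
    return [[max(0.0, x) for x in row] for row in m]

# The graph: each node names an operation and the nodes feeding into it.
graph = {
    "x":   ("const", [[1.0, 2.0], [3.0, 4.0]]),
    "w":   ("const", [[0.5], [0.25]]),
    "mul": ("matmul", "x", "w"),
    "out": ("relu", "mul"),
}

def evaluate(node, g, cache=None):
    """Lazily evaluate a node by walking its input edges."""
    if cache is None:
        cache = {}
    if node not in cache:
        op, *args = g[node]
        if op == "const":
            cache[node] = args[0]
        elif op == "matmul":
            cache[node] = matmul(evaluate(args[0], g, cache),
                                 evaluate(args[1], g, cache))
        elif op == "relu":
            cache[node] = relu(evaluate(args[0], g, cache))
    return cache[node]

print(evaluate("out", graph))  # [[1.0], [2.5]]
```

Representing computation as a graph like this is what lets a framework such as TensorFlow schedule the operations across CPUs, GPUs and distributed machines.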
Why companies choose to open source
Consider that you might reasonably think that a lot of Google’s (and, more broadly, Alphabet’s) IP is wrapped up in algorithms, software and general know-how around artificial intelligence. Or, really, ways to extract insights, deliver results or take actions based on data — whether that means displaying a personalized ad or enabling a car to drive autonomously. TensorFlow would seem to be squarely in the domain of these “crown jewels.” And yet here it is open sourced.
In part, this reflects a wider pattern around open source generally. You may write (and tightly control) a lot of internal code relevant to your business. But you can also benefit from opening up related software to a wider pool of developers and users. It’s not either/or.
Thus, you have financial institutions cooperating on blockchain, messaging and other technologies while also holding back plenty of proprietary trading algorithms. TensorFlow likewise represents a small slice of Google’s machine learning research.
It’s also about the data
However, there’s something else going on too. With the almost countless petabytes of data that will be generated by IoT systems and connected devices more broadly, value is shifting from code to data. Of course, it’s far from trivial to figure out how to do useful things with the data you collect. But to the degree that an organization controls and owns data, it may choose to focus on effectively monetizing that data while maximizing the community and development velocity of the software.
We see this happening elsewhere as well. Baidu has open sourced autonomous driving technology. The storyline is that this is a way to level the playing field against other vendors taking a more traditional proprietary approach. But it also appears as if the company views the data it is collecting as a competitive differentiator that it doesn’t plan to release.
The new data-driven services
We see the value of data in many of the new services organizations are starting to deliver. For example, as noted in an MIT Sloan Management Review article, “GE wants to go beyond helping its customers manage the performance of individual GE machines to managing the data on all of the machines in a customer’s entire operation.” Other services can include optimizing preventative maintenance schedules on jet turbines and other industrial machinery.
Huge data lakes that aren’t curated or properly analyzed aren’t a benefit, of course. In fact, data that’s not used intelligently can have a negative ROI because of its collection, storage and management costs. But when data can be effectively used to guide actions and gain useful insights, it’s an increasingly large part of a technology company’s value. (And almost every organization is a technology company today.)
Artificial intelligence is rapidly finding application in a myriad of fields, enhancing both the pace and quality of work. The tasks performed by AI are evolving so rapidly that some scientists already fear the rise of machines. That might be far-fetched, but AI does bring some genuine areas of concern, primarily because it has become a powerful tool that simplifies high-skill tasks.
AI is at the disposal of anyone who wants to perform a task that requires extensive training, without any prior experience. Analytics, big data and machine learning help us analyze vast amounts of information and use it to predict future outcomes. They can, however, also be used to mislead, forge and deceive.
Audio and video forgery capabilities are making astounding progress, thanks to a boost from AI, which is enabling some effective but downright scary new tools for manipulating media. This technology therefore holds the power to alter forever how we perceive and consume information. A time will come when people will struggle to know whom and what to trust. Even today, we live in a world of Photoshop, CGI and AI-powered selfie filter apps. The internet democratized knowledge by enabling free but unregulated and unmonitored access to information. As a result, the floodgates opened to all types of information, ushering in a staggering amount of rumors and lies.
Criminals are already utilizing this technology to their benefit. Readily available tools can create high-quality fake videos that can easily fool the general population. Essentially, using AI to forge videos is transforming the meaning of evidence and truth in journalism, government communications, criminal justice testimony and national security.
Creative software giant Adobe has been working on one such technology, “VoCo,” which it has labeled “Photoshop for audio.” The software requires a 20-minute audio recording of someone talking. The AI analyzes it, figures out how that person talks and then learns to mimic the speaking style. Type anything, and the software will read your words in that person’s voice.
Google’s WaveNet provides similar functionality. It requires a much bigger data set than the startup Lyrebird’s tool or VoCo, but the results sound creepily real.
MIT researchers are also working on a model that can generate sound effects for an object being struck in a silent video. The generated sound is realistically similar to the sound the object makes when struck in real life. The researchers envision a future version automatically producing sound effects realistic enough for use in movies and television.
With such software, it will become easy to record something controversial in anyone’s voice, rendering voice-based security systems helpless. Telephonic calls could be spoofed. No one will be exactly sure if it is you on the other end of the phone. At the current pace of progress, it may be within two to three years that realistic audio forgeries are good enough to fool the untrained ear, and only five to 10 years before forgeries can fool forensic analysis.
Tom White at Victoria University School of Design created a Twitter bot called “SmileVector” that can make any celebrity smile. It browses the web for pictures of faces and then it morphs their expressions using a deep-learning-powered neural network.
Researchers at Stanford and various other universities are also developing astonishing video capabilities. Using an ordinary webcam, their AI-based software can realistically change the facial expression and speech-related mouth movements of an individual.
Pair this up with any audio-generation software and it’s easy to potentially deceive anyone, even over a video call. Someone can also make a fake video of you doing or saying something controversial.
Jeff Clune, an assistant professor at the University of Wyoming, and his team at the Evolving AI Lab are working on image recognition in reverse, adapting neural networks trained in object recognition to generate synthetic images from a text description alone. The neural network is trained on a database of similar pictures; once it has gone through enough of them, it can create pictures on command.
A startup called Deep Art uses a technique known as style transfer — which uses neural networks to apply the characteristics of one image to another — to produce realistic paintings. A Russian startup perfected its code developing a mobile app named Prisma, which allows anyone to apply various art styles to pictures on their phones. Facebook also unveiled its version of this technique, adding a couple of new features.
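The core trick behind style transfer, as introduced in the research it builds on, is to summarize an image’s style as the correlations between its feature channels (a Gram matrix) and penalize the difference between two images’ Gram matrices. The tiny sketch below uses made-up feature values; a real system would take them from a convolutional network:

```python
# Sketch of the style-loss idea behind style transfer: style is captured by
# Gram matrices (channel-to-channel correlations) of feature maps, independent
# of where features appear spatially. Feature values here are invented.

def gram(features):
    """features: list of channels, each a flat list of activations."""
    return [[sum(a * b for a, b in zip(ci, cj)) for cj in features]
            for ci in features]

def style_loss(f1, f2):
    """Squared difference between the two images' Gram matrices."""
    g1, g2 = gram(f1), gram(f2)
    return sum((g1[i][j] - g2[i][j]) ** 2
               for i in range(len(g1)) for j in range(len(g1)))

# Two tiny 2-channel "feature maps" (4 spatial positions each):
painting = [[1.0, 0.0, 1.0, 0.0], [0.0, 1.0, 0.0, 1.0]]
photo    = [[1.0, 1.0, 0.0, 0.0], [0.0, 0.0, 1.0, 1.0]]
print(style_loss(painting, photo))  # 0.0
```

Note that the loss is zero even though the two maps differ spatially: the Gram matrix deliberately ignores arrangement, which is why a photo can take on a painting’s “style” without copying its composition.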
Other work being done on multimedia manipulation using artificial intelligence includes the creation of 3D face models from a single 2D image, changing the facial expressions of someone on video using the Face2Face app in real time and changing the light source and shadows in any picture.
A team of researchers at University College London developed an AI algorithm titled “My Text in Your Handwriting,” which can imitate any handwriting. This algorithm only needs a paragraph’s worth of handwriting to learn and closely replicate a person’s writing.
Luka, an AI-powered memorial chatbot, also aspires to create bots that mimic real-life people. It can learn everything about a person from his or her chat logs, and then allow the person’s friends to chat with that person’s digital identity long after he or she dies. The chatbot, however, could also be used while the person is still alive, effectively stealing that person’s identity.
A study by the Computational Propaganda Research Project at the Oxford Internet Institute found that half of all Twitter accounts regularly commenting about politics in Russia were bots. Secret agents control millions of botnet social media accounts that tweet about politics in order to shape national discourse. Social media bots could even drive coverage of fake news by mainstream media and even influence stock prices.
Imagine those agents and botnets also armed with artificial intelligence. Fake tweets and news backed by real-looking HD video, audio, written specimens and government documents are eerily scary. Not only is there the risk of falsehood being used to malign the honest, but the dishonest could also misuse it in their defense.
Before the invention of the camera, recreating a scene in a court of law required witnesses and various testimonies. Soon, photographs started assisting the witnesses. But later, with the advent of digital photography and the rise of Photoshop, photographs were no longer admissible as reliable evidence. Right now, audio and video recordings are admissible evidence, provided they are of a certain quality and not edited. It’s only a matter of time before courts refuse audio and video evidence too, however genuine it might seem. An AI-powered tool that imitates handwriting could allow someone to manipulate legal and historical documents or create false evidence for use in court.
Countering the rise of forgery
With the potential misuse of artificial intelligence, the times ahead do indeed seem challenging. But the beautiful thing about technology is that you can always expect solutions to any problem.
Blockchain, the technology securing cryptocurrencies, is promising to provide a cybersecurity solution to the internet of things, offering one possibility to counter forgery. With the widespread use of IoT and advancements in embedded systems, it may be possible to design interconnected cameras and microphones that use blockchain technology to create an unimpeachable record of the date of creation of video recordings. Photos can be traced back to the origin by their geotags.
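The provenance idea above rests on a simple primitive: hash each recording and chain each entry to the previous one, so back-dating or altering any record breaks every later link. Here is a minimal sketch; a real system would add digital signatures and distributed consensus, and the field names are invented for illustration:

```python
# Minimal hash-chain sketch of blockchain-style media provenance.
import hashlib
import json

def add_record(chain, media_bytes, timestamp):
    """Append a tamper-evident entry for one recording."""
    prev_hash = chain[-1]["entry_hash"] if chain else "0" * 64
    entry = {
        "media_hash": hashlib.sha256(media_bytes).hexdigest(),
        "timestamp": timestamp,
        "prev_hash": prev_hash,
    }
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    chain.append(entry)
    return entry

def verify(chain):
    """Recompute every link; any edit anywhere makes this return False."""
    prev = "0" * 64
    for e in chain:
        body = {k: e[k] for k in ("media_hash", "timestamp", "prev_hash")}
        if e["prev_hash"] != prev:
            return False
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if digest != e["entry_hash"]:
            return False
        prev = e["entry_hash"]
    return True

chain = []
add_record(chain, b"frame data 1", "2017-07-01T10:00:00Z")
add_record(chain, b"frame data 2", "2017-07-01T10:05:00Z")
print(verify(chain))  # True
chain[0]["timestamp"] = "2016-01-01T00:00:00Z"  # tamper with the date
print(verify(chain))  # False
```

The point is not the code itself but the property it demonstrates: once a recording is committed to such a chain, its creation date can no longer be quietly rewritten.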
The Art and Artificial Intelligence Laboratory at Rutgers University is currently developing a neural network that can appreciate and understand art and the subtle differences within a drawing. It uses machine learning algorithms to analyze images, counting and quantifying different aspects of what it observes. This information is processed by its artificial neural network to recognize visual patterns in the artwork.
Similarly, neural networks can be trained to detect forged documents, historical evidence and currency notes. They can also identify fake identities and other bots on the internet by observing their behavior patterns and clashing IP addresses. Forged video and audio could be compared across various dates and platforms to identify their origin.
In addition, regulatory and procedural reforms are required to control this menace.
Even though the audio and video manipulation tools aren’t entirely revolutionary, they no longer require professionals and powerful computers. We can’t stop criminals from getting their hands on such tools. If anything, making these tools available to everyone just to show people what’s happening with AI will make the public aware of the power of artificial intelligence — and hence, aware of the easily forgeable nature of our media.
Industry 4.0 is here, marking the start of a new era of industrialization in which machines are moving toward autonomy. For asset-intensive enterprises, it is imperative to keep watch on various processes, such as material production, overall equipment efficiency and asset management. The rising adoption of sensors, edge devices and artificial intelligence is a product of the growing cost of managing assets. Experts have predicted that businesses will invest nearly half a trillion dollars in asset management by the end of the next decade.
Here is where machine learning comes in handy. Though its path is marred with numerous challenges, it still commands a better success rate than any of its contemporary solutions. It might sound strange, but researchers are spending billions on the research and development of machine learning technology capable of aiding asset management for industries. Some top scientists have spent more than three years simply developing a machine learning algorithm, and to date, the success rate of machine learning technologies clocks in below 10%.
Dead investment or exemplary foresight?
In all the commotion, the main question that arises is, “Why is the industry still pursuing such a venture?” The answer lies in machine learning’s capabilities. Companies face the challenge of ensuring that their machines run at a heightened level of efficiency, which requires the ability to monitor assets remotely. IIoT offerings have made a certain success rate possible, but machine learning holds the key.
Data is the future
Big data has taken off at a never-before-expected rate, and this might be the catalyst for the success of machine learning. Assets in an industry are tough to manage, but automation and the use of data through IIoT have provided a new pathway toward better asset management. Not to mention the additional challenge of human error introduced by increased human intervention. Machine learning promises to do away with all of this, using machine-generated data for the benefit of those very machines to ensure optimum asset management at all times.
Having a feedback cycle with human intervention is not only time-consuming, but it also increases the chance of errors through miscommunication. If, instead, a machine itself can analyze data and provide alerts at the right moment, the issue of asset management is sorted. The machine learns, analyzes and adapts itself to ensure maximum output every single time. Using algorithms, machine learning allows enterprises to unlock hidden insights from their asset data. For instance, a forecast of an asset failure can help in scheduling preventive maintenance before the asset fails. Such machine learning-driven predictive analytics software can enable enterprises to make fully vetted and well-timed decisions toward improved asset management.
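A bare-bones version of that alert-before-failure idea can be sketched without any trained model at all: score each asset’s latest reading against its own history and flag statistical outliers for preventive maintenance. Asset names, readings and the z-score threshold below are all invented for illustration; real deployments would use the trained models the article goes on to list:

```python
# Illustrative predictive-maintenance sketch: flag assets whose latest sensor
# reading is a statistical outlier relative to their own history.
import statistics

def maintenance_flags(history, latest, z_threshold=3.0):
    """history: {asset: [past vibration readings]}, latest: {asset: reading}.
    Returns the assets whose latest reading exceeds the z-score threshold."""
    flags = []
    for asset, past in history.items():
        mean = statistics.mean(past)
        stdev = statistics.stdev(past)
        z = (latest[asset] - mean) / stdev
        if z > z_threshold:
            flags.append(asset)
    return flags

history = {
    "pump-1":  [0.9, 1.1, 1.0, 1.05, 0.95],
    "press-7": [2.0, 2.1, 1.9, 2.05, 1.95],
}
latest = {"pump-1": 1.02, "press-7": 3.4}
print(maintenance_flags(history, latest))  # ['press-7']
```

A real machine learning approach would replace the z-score with a model trained on labeled failure data, but the workflow — continuous data in, maintenance alert out before the asset fails — is exactly what the paragraph above describes.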
Though many machine learning algorithms have been around for a long time, the ability to automatically apply complex calculations to big data, faster and faster, over and over, is the latest development. These technologies can enhance an enterprise’s growth and yield substantial profits through optimum utilization of the resources at hand. A few of the machine learning algorithms being applied widely include linear regression, decision trees, logistic regression, random forests, the naive Bayes classifier, neural networks and gradient boosting.
The following are a few of the advantages of machine learning in asset management:
- Highest uptime/runtime and improved machine performance
- Constant machine health monitoring
- Advanced analytics of all assets drilled down to various levels such as machine, plant, facility, and so on
- Reduced consumption of raw materials and resources such as air, water, heat and electricity
- Facilities and operator performance monitoring
- Live alerts, reports and detailed data logs for each instance
In conclusion, some industry experts have dismissed machine learning as an unrealistic absurdity, and worse, but the promise of improving our future and simplifying lives through improved asset management is what keeps the spark going.
Read how a utility service provider and meter manufacturer leverages Azure Machine Learning to remotely monitor its IoT-based smart water meters. Notably, the company reduced water consumption by more than 30% by effectively managing meter failures and water leakages.
The demand for and technology of IoT devices are rapidly moving beyond the consumer market and landing squarely in the medical electronics, IT/enterprise, industrial and military markets. And those IoT markets are experiencing dramatic growth.
However, those markets are highly demanding when it comes to reliability. IoT devices in these instances must be reliable to the point where there can be no flaws or failures, and definitely no latent failures — meaning the device shouldn’t fail once it’s in use in the field.
How do you maintain high IoT device reliability, especially at the printed circuit board (PCB) level?
Remember, as I said in an earlier blog, most IoT PCBs are not conventional printed circuits. Rather, in most cases, they are small combinations of rigid and flexible circuits or flexible circuits alone. It can be argued that some IoT products may be larger and, in those cases, more of a conventional rigid PCB is used. But in a majority of cases, IoT products are small and call for the rigid-flex or flex circuitry as the basic PCB.
Vias take on special significance for IoT rigid-flex and flex circuit reliability. Vias are tiny drilled holes that create electrical or power connections between a circuit’s multiple layers. Via placement is of paramount importance in maintaining high reliability; location is critical because flex circuitry has a bending curvature and radius, which can weaken vias over long periods of time.
The general rule of thumb for IoT circuits is for these drill holes to be as small as possible due to limited space. Sticking to five to six mil finished drill hole size is a good compromise. Going below four mils, like drilling at three mils, calls for laser drilling, which is more time-consuming and costly. Going higher, for example at six to seven mil via hole drilling, means too much valuable real estate is consumed, and that’s not a good move for a small IoT PCB design.
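The real-estate tradeoff behind those drill-size numbers is easy to put in figures. The quick calculation below converts mils to millimeters (1 mil = 0.001 inch = 0.0254 mm) and estimates the pad area each via consumes; the 10-mil annular allowance is a hypothetical value for illustration, as actual design rules vary by fabricator:

```python
# Back-of-the-envelope estimate of board area consumed per via at different
# drill sizes. The annular allowance is an assumed value, not a design rule.
import math

MIL_TO_MM = 0.0254

def pad_area_mm2(drill_mil, annular_allowance_mil=10):
    """Area of a round via pad: drill size plus an assumed annular allowance."""
    pad_d_mm = (drill_mil + annular_allowance_mil) * MIL_TO_MM
    return math.pi * (pad_d_mm / 2) ** 2

for drill in (3, 5, 7):
    print(f"{drill}-mil drill -> {pad_area_mm2(drill):.4f} mm^2 per via pad")
```

Under these assumptions, stepping up from a 5-mil to a 7-mil drill costs roughly 28% more pad area per via, which adds up quickly on a small IoT board with hundreds of vias.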
Keeping smallness in mind, those IoT rigid-flex and flex circuits have limited space, and the components placed on them are minuscule as well.
For instance, capacitors are part of an IoT circuit’s electronics. The trend has been to shrink capacitors into increasingly smaller packaging, like the 01005 package, which is a challenge in terms of placement and inspection. This means the IoT rigid-flex and flex circuit assembly house must have sufficient experience with its various processes to effectively produce highly reliable products. In particular, savvy assembly houses must have extensive know-how about solder joints for IoT circuits, since devices and surface mount pads are so minuscule.
A misstep during the pick-and-place and reflow process can create latent failures as mentioned earlier. So, solder printing, placement of components and reflow temperature have to be accurately dialed-in and maintained.
Printing, in particular, takes on special meaning. If overprinting is inadvertently performed, shorts may result between the leads or balls of a micro BGA, QFN or flip chip packaged device. A short or solder bridge is a solder extension from one lead or ball to the next and forms a short circuit.
However, if enough paste isn’t dispensed on the extremely small surface mount pads, there will be a shortage of solder paste, thereby creating an open (a broken interconnect of a component’s package) in the worst case scenario or big voids in the best case scenario.
Also, one has to be wary about using through-hole components in an IoT design. For one, they often lack the functionality of comparable surface-mount parts, and they have a high probability of adversely affecting reliability. Through-hole technology is older and was routinely used for conventional PCBs before surface mount technology came into prominence. Still, there can be cases where through-hole components are used in IoT products.
Through-hole components are larger compared to similar ones in surface mount packaging such as micro BGAs, CSPs and QFNs. Thus, through-hole components placed on small rigid-flex and flex circuits during the pick-and-place assembly operation will pose difficulties. In effect, they create the probability of reliability issues, again due to the bending and curve radius that are associated with flex circuits.
In summary, distancing an IoT design from using through-hole components is a good idea. Also, printing consistency is highly critical, as well as proper solder joint consistency. These key steps are the foundation of high reliability, which is absolutely required in specific IoT medical, industrial and military/aerospace applications.
The internet of things market continues to expand rapidly, with the installed base of IoT devices now forecast to grow to a staggering 31 billion by 2020. Traditional testing approaches are simply not going to be able to test this new massively interconnected world of different devices, created by different companies, using different technologies, all communicating to deliver digital user experiences. A fundamentally new approach to testing is needed.
Today, most software teams only test what they build, i.e., they only test the software components that their developers have written. All other software components are “out of scope” of testing, even software components that are critical to the product and the user experience (such as the network the product uses to communicate). In the IoT world, this approach simply isn’t good enough. All products in the IoT world involve multiple products from different vendors interacting to deliver a digital user experience and a specific value proposition, and vendors need to start taking responsibility for — and testing! — the whole value proposition, not only the small piece of the pie that they developed. This requires the established rules of software testing to be ripped up and replaced in a hyperconnected world.
IoT is all about collaboration — and that includes how to approach testing. In a hyperconnected world, you need to be able to test other people’s technology to assure the experience — and this requires a huge mind shift. Take the connected fridge for example. The consumer’s only concern is that the fridge is constantly stocked with milk, nothing else matters to them. This requires the entire ecosystem that is part of fulfilling this need to work flawlessly together: From the RFID chip on the milk bottle to the supermarket to the payment platform to the fridge manufacturer. To achieve this end goal, it requires you to share test assets and environments between all involved parties in meeting the user expectation.
The ubiquity of virtualization, and now containerization, makes this possible. For example, the fridge manufacturer can create a test environment in a container and pass it to the milk producer, who has his own milk-testing container. He can then plug the containers together and test the system as a whole. Each container requires a clean interface defining how it interacts with the other test containers.
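The “plug the containers together” idea amounts to contract testing against a clean interface: each party exercises the other’s piece through its API, never its source code. The sketch below stubs a stand-in “milk vendor” service in-process; in practice each side would run in its own container, and the endpoint name and JSON fields here are made up for illustration:

```python
# Contract-testing sketch: the "fridge side" tests the milk vendor's piece
# purely through its HTTP interface. Endpoint and fields are hypothetical.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class MilkVendorStub(BaseHTTPRequestHandler):
    """Stand-in for the milk producer's test container."""
    def do_GET(self):
        if self.path == "/stock/milk":
            body = json.dumps({"sku": "milk-1l", "in_stock": True}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep test output quiet

def fridge_can_reorder(base_url):
    """The fridge side of the contract: only the interface is touched."""
    with urllib.request.urlopen(f"{base_url}/stock/milk") as resp:
        data = json.loads(resp.read())
    return data["in_stock"]

server = HTTPServer(("127.0.0.1", 0), MilkVendorStub)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
print(fridge_can_reorder(f"http://127.0.0.1:{port}"))  # True
server.shutdown()
```

Because each side only touches the agreed interface, either container can be swapped for the real thing without changing the test — which is exactly why, as the next paragraph argues, no one needs access to anyone else’s code.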
The sheer scale of IoT has killed manual testing, and the need to collaborate has killed code-level testing. You can’t expect to have access to other people’s code. For example, the milk manufacturer is not going to share its source code, but it will share a container. Put simply, IoT is the final nail in the coffin for code-level testing.
The new rules of engagement
So what does testing look like in this brave new world? In a hyperconnected world of digital user experiences, you must adopt a user-centric approach to testing.
I believe there are five keys to success:
- Test through the eyes of the user and APIs
- Test all aspects of the user experience
- Monitoring is now testing
- Extend automation and analytics beyond test execution
- Report test status in terms of the user experience
With IoT, the days of testing what you build are over. You need to focus on the value proposition and test everything that is part of the user experience — test everything that you touch. For IoT to realize its dreams it must adopt a user-centric approach to testing.
I sometimes Google a word or a phrase just to see what the search engine’s algorithms are de facto defining it to be these days. USA Today even published an article once about the value of Googling yourself, but that’s a topic for a different post.
In any case, I decided recently to Google “internet of things” because for me, and potentially for a lot of people, the general definition was getting cloudy.
So here are the results of the search on July 17, 2017:
I noticed a few things right off the bat:
- Certain organizations are paying A LOT of money to come up first in that search;
- Google’s default definition is appropriately broad and generic, if somewhat unhelpful; and
- It must be a challenge for those who have never really done any research on the topic to actually get to something useful in a search like this — only one of the articles on the first page of search results was from 2017.
I am not disparaging Microsoft and MIT here. It’s just that for a topic like IoT, getting to valuable, actionable information on the topic may require a bit more context and clarity. Certainly, these search results did not provide that. So I wondered: If I only had 30 seconds to tell someone about the value of IoT, what would I say? What “elevator pitch” would I give to get that person to care about this latest technical evolution?
I think I would ask this question: If you could know in advance that something you own, are responsible for or care about — an appliance, a car, an inventory of machines, a body part — was going to have problems, wouldn’t you want to know so that you could fix it? Well, to quote the oft-used but inaccurate Morpheus meme, “What if I told you that you COULD know?” That is the power of IoT — to allow things to “send and receive data,” but more critically, to know something new and valuable that could change a life for the better.
It may be a stretch to say that knowing that my car battery needs to be replaced is the same as knowing when my drinking water has been contaminated. The technological advancements behind one do support the other, though, and the use cases for the tech are only limited by our imagination. As Eric Schmidt of Google said in 2015, “Everyone gets smarter because of technology, and the empowerment of people is the secret to technological progress.” Many who read this site are already implementing complex advanced services like machine learning and artificial intelligence. But Cisco recently reported that almost 75% of IoT projects are failing, and I think that’s in part because those not “in the know” don’t get it.
It is time for us to hone our elevator pitch — think of IoT as a startup and get really good at telling everyone we meet why what we’re doing matters. Tell them about the 200,000 babies that were saved, or the innovations in chemical-free water treatment. The data is out there, but not very easy to find for the average web searcher, so it is up to us to help — or the technology will take longer to “empower” the people and achieve lasting value.
It’s summer. Time for cookouts, kicking back and enjoying that all-American sport, baseball. I love baseball, in part due to its seemingly eternal rhythm. You root for your team. You listen to the same recognizable announcer’s voice calling the shots and grab tickets for your favorite seats. The players are all familiar — their stats, their strengths — enough so that when they come up to bat, you think you know what to expect. But then they face a new pitcher, adjust their stance, hit in a different part of the lineup — and wham! It’s a whole new ballgame.
Baseball has become more exciting since general managers, like Theo Epstein, began using sabermetrics to create miraculous team turnarounds, such as the World Series victories he led for both the Boston Red Sox and the Chicago Cubs. Baseball management tactics and dugout decision-making have changed on the heels of these Moneyball-style wins. Baseball fans everywhere can’t help but take notice that the information derived through analytics can make a major impact on the success of any baseball team — or any business, for that matter.
What can businesses learn from a similar use of analytics? With the advent of the internet of things and all the data being produced by field devices, businesses are rethinking their strategies as IoT gains ground. Operational technology (OT) managers see how shared data acquired by their controlled devices helps improve their own business decision-making. They also recognize how a common architecture that spans their now disparate OT and IT infrastructures could improve efficiency, while reducing costs. Like sports sabermetrics, OT managers see IoT eventually leading them to a walk-off grand slam win.
It helps to be open-minded, even in the outfield
Traditional operational technology has a lot in common with IoT. It derives data from sensors, meters and other devices to monitor and manage the health of resources and machines. You’ll find it often in manufacturing plants, healthcare, transportation settings and other industrial control processes — the systems one finds outside of the data center.
Simply put, OT is the use of computers to monitor or alter the physical state of a system, such as the control system for a power station or the control network for a rail system. The term has become established to highlight the technological and functional differences between traditional IT systems and industrial control system environments, the so-called “IT in the non-carpeted areas.”
IT and OT implementations evolved independently over time to solve different problems. In IT environments, the need to have different applications and systems interoperate with one another forced the requirement for open standards. Not so with OT. Working within the parameters of their narrowly focused proprietary systems, OT has continued to operate relatively sheltered from this pressure.
Operational control systems were specifically designed to be standalone entities. Often, they were not originally intended to be connected or even accessed remotely. But as OT managers begin to recognize the benefits data sharing provides, much like baseball’s GMs, they have become more open standards-minded. They see how unifying OT and IT systems through new IoT projects can eliminate inefficiencies and accelerate innovation.
Isolation inhibits innovation
An interesting article by Rany Jazayerli explored how taking a more curious approach to collecting and using data led to better performance by major league baseball teams. He pointed out that the team managers who preferred to do things the way they had always done them isolated themselves from a changing world. The effects were personally catastrophic, resulting in losses both on and off the field.
Operational technology managers don’t want to make the same mistake. They see that there’s a lot to be gained by adopting standards-based OT solutions and converging disparate OT and IT environments. With open IoT architectures, enterprises can reduce infrastructure cost and complexity, as well as improve performance. A common view of the intelligence gathered through the operations equipment can help businesses make more informed decisions. It can give them the visibility needed to help improve how they manage and deliver services. The data becomes more valuable when it becomes more accessible by applications that can make full use of it.
OT managers recognize that uniting OT and IT implementations can:
- Improve business decision-making by applying control data to business intelligence applications
- Enable them to reap the benefits of newer technologies, such as mobile communications or cloud-based services, for their OT systems
- Optimize industrial control processes and business flows, including introducing more automation
- Lower operating expenses by not having to build, manage and support parallel systems
- Accelerate business results by streamlining development projects
- Improve OT processing scalability
IoT will accelerate the pace of change in OT
That isn’t to say that OT managers will drop their legacy systems and hop on the IoT bandwagon immediately. Until now, industrial networking professionals have been quite conservative. The stakes are high — industrial networks can’t be allowed to fail due to their critical nature. Concerns around security and availability are still at the forefront of their minds. Their resistance to change can also be explained in part by the proprietary technologies that have locked them in, which tend to make change prohibitively expensive.
But IoT is acting as a catalyst for change. IoT initiatives present a perfect opportunity to unite these isolated, parallel technology disciplines. Many operations are starting to deploy standards-based operational control systems, replacing isolated meters, sensors and actuators with smart, IP-enabled devices as part of their upgrade processes. Businesses are beginning to unify their disparate OT and IT solutions as a result of IoT efforts. As they introduce the use of common protocols and building blocks, they work towards eliminating redundancies and improving business efficiency.
As pointed out by Mike Fahrion in his blog, “Merging OT and IT on the Internet of Things,” “…while the OT world was still busy getting different kinds of machines to communicate at all, the IT world was making incredible progress. Ethernet networking standards were developed and adopted. Cost points fell to amazingly low levels … OT can’t ignore IT anymore. The opportunities are too great. When attaching a remote sensor to a battery-powered node on a wireless mesh network can make the sensor data available to an analytics application on another continent, it’s time to stand up and take notice. That remote sensor has just become a node on the internet of things, and the value of both the sensor and the data it collects has suddenly increased exponentially.”
Like baseball GMs, OT teams are hoping IoT will improve their chances of hitting a home run at every at-bat.
There is no doubt that we have moved into the growth stage of the internet of things, as devices, things and connected services have very much become a part of the consumer/commercial conversation. Today — driven by the success of Google Home, Alexa, Nest and other systems — IoT-related devices are now something you can purchase at Best Buy, Bed, Bath & Beyond or the U.K.’s John Lewis.
Today’s devices, services and underlying machine learning and artificial intelligence are very much pushing the envelope on what is possible with connected cars, the smart home and wearable technology. However, one of the main threats to the success of the IoT phenomenon is the potential lack of security and privacy protection — and the lack of knowledge around these issues. Do end users, device manufacturers and cloud service providers understand that the data they handle could be at risk of theft, misuse or manipulation? The answer apparently is no.
A security study published earlier this spring documented several cases of poor security, especially relating to pacemakers. The report highlighted a situation where vulnerabilities could be exploited in order to send malicious commands to the pacemaker directly, or intercept and alter data being sent from it. Here was a direct and clear example of multiple stakeholders in a healthcare IoT ecosystem not understanding the risks at hand. Indeed, one could go as far as saying the situation was a life-or-death case in point for the need for improved IoT safeguards and privacy control.
Healthcare IoT has outpaced cybersecurity planning
How did it come about that healthcare IoT was developed with so many glaring security weaknesses, weaknesses made all the more alarming by the fact that there are 15 million installed healthcare IoT devices in the U.S. alone? As with many IoT devices, security is often seen as an inhibitor to application and services development, with security and privacy practices evolving as “bolt-on” features long after the device ecosystem was designed. This is both costly and dangerous.
In an effective healthcare IoT development ecosystem, devices themselves, along with the services, cloud infrastructure and applications they interact with, need to have clear infosec, identity and privacy controls embedded from the beginning. To achieve this, full data lifecycle analysis needs to be completed, along with the correct level of risk mitigation and protection. Where there are transaction points or areas that require protection, here is where developers need to find ways to upgrade privacy controls or security measures; it’s essential to maintain the confidentiality, availability and integrity of any data, function or service for which the healthcare device is responsible.
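As one minimal illustration of such an embedded control, the sketch below shows a device that refuses commands unless they carry a valid message authentication code, which would have blunted the transmitter-to-pacemaker attack path described below. The shared key and command strings are hypothetical; a real device would also need per-device key provisioning, replay protection and vetted cryptographic hardware:

```python
# Sketch: authenticate commands to a device with an HMAC tag.
# Key, command names and framing are illustrative assumptions.

import hashlib
import hmac

DEVICE_KEY = b"per-device-secret-provisioned-at-manufacture"

def sign_command(command: bytes, key: bytes = DEVICE_KEY) -> bytes:
    """Transmitter side: compute an HMAC-SHA256 tag for the command."""
    return hmac.new(key, command, hashlib.sha256).digest()

def accept_command(command: bytes, tag: bytes, key: bytes = DEVICE_KEY) -> bool:
    """Device side: execute only if the tag verifies (constant-time compare)."""
    expected = hmac.new(key, command, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

cmd = b"SET_PACING_RATE:70"
tag = sign_command(cmd)
assert accept_command(cmd, tag)             # legitimate transmitter
assert not accept_command(b"DISABLE", tag)  # forged command is rejected
```

The design choice being illustrated is that the check lives on the device itself, so a compromised transmitter or network link cannot inject commands the device will act on.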
If we analyze the specific pacemaker case above, what exactly did the study find? First word of these vulnerabilities came via cybersecurity notices from the U.S. Food & Drug Administration in January 2017: “Cybersecurity Vulnerabilities Identified in St. Jude Medical’s Implantable Cardiac Devices and Merlin@home Transmitter: FDA Safety Communication.” This report detailed that vulnerabilities were not with the pacemaker device itself, but with the in-home transmitter used to monitor the devices and share data with doctors and caregivers. These transmitters collect data from the pacemaker and upload it to a private cloud where clinicians can monitor heart activity. The vulnerability? That the transmitter can also send signals and commands to the pacemaker, potentially enabling a bad actor to disable the device. Further study shows that the pacemaker/transmitter systems are vulnerable to programming hacks as well. The FDA report offers a clear example of where the entire device ecosystem needs to be analyzed to identify data entry and exit points, device and user identity and access management requirements, as well as how the device itself is managed. For example, how are vulnerabilities patched or firmware updated?
An ounce of prevention: Steps in defending healthcare IoT
Given the vulnerabilities exposed in the pacemaker report, what would be a logical path toward securing critical medical devices and systems?
- Admit the dangers. As a first step, device makers and healthcare providers together need to commit themselves to taking action, as another report shows that organizational preparedness for dealing with these threats is lacking. ISACA’s “State of Cyber Security 2017” report cited the fact that healthcare ransomware attacks are on the rise, with 62% of respondents indicating they’d experienced ransomware in 2016, but only 53% had any kind of formal process in place for dealing with it. Indeed, many of the device and cloud vulnerabilities that came to light during last year’s Dyn DDoS attack are likely to remain unresolved years into the future.
- Improve awareness and training. Both at the executive level and in cybersecurity departments, there’s a constant need for training. The bad guys and black hats thrive on innovation, so investments in focus, time and money must happen to keep up. The ISACA report found that the vast majority of cybersecurity professionals are allotted $2,500 or less per year for ongoing training, with one in four receiving less than $1,000. Considering the risks involved in healthcare IoT security, these are scarily inadequate budgets.
- Act cooperatively. By joining with other providers in setting the best practices and benchmarks for secure IoT adoption, device makers and healthcare providers can be on the same page in addressing security concerns. Sharing updates and responses to security threats, whether via forums or distributed alerts to cybersecurity experts working within these organizations, can help preempt problems before they become disasters. It’s an effective model being followed by cybersecurity entities in other industries, most notably financial services, and the FDA has made clear recommendations about how to implement it for healthcare IoT. The National Health Information Sharing and Analysis Center (NH-ISAC) is a prototypical example of just such a trusted community.
- Move beyond existing infrastructures. Even healthcare providers and organizations with cybersecurity systems in place that have been “good enough” to date are courting catastrophe. Older IT and security infrastructures lack transparency and are fragmented, siloed, inflexible and incapable of scaling. The very nature of IoT, however, demands agility and scalability from cybersecurity measures, and legacy systems limit a provider’s ability to respond to these new requirements.
- Adopt unified, scalable, future-proofed identity verification systems. Centrally managed identity platforms with state-of-the-art encryption and other safeguards already exist, and can scale to manage millions of patients, doctors, providers and healthcare professionals and the IoT devices they use, as well as other services and equipment. Such systems can register people and devices, link them together, authorize and de-authorize their access to data, guard against attacks, and apply policies on security and privacy practices and personalization. This way, high-level authentication, data security and cybersecurity are bolstered through an integrated ecosystem that’s also able to support the new technologies continually evolving in tandem with IoT.
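The register, link, authorize and de-authorize lifecycle such a platform manages can be sketched as a toy registry. The class and method names below are illustrative assumptions, not any vendor's API, and a real platform would add authentication, encryption and audit logging around every call:

```python
# Toy sketch of an identity registry linking people and devices and
# gating access to device data. Names are hypothetical.

class IdentityRegistry:
    def __init__(self):
        self.devices = {}   # device_id -> owner
        self.grants = set() # (party, device_id) pairs allowed to read data

    def register(self, device_id, owner):
        """Enroll a device and link it to its owner."""
        self.devices[device_id] = owner
        self.grants.add((owner, device_id))  # owners may always see their data

    def authorize(self, party, device_id):
        """Grant a clinician or service access to a registered device."""
        if device_id in self.devices:
            self.grants.add((party, device_id))

    def revoke(self, party, device_id):
        """De-authorize access, e.g., when a caregiver changes."""
        self.grants.discard((party, device_id))

    def may_read(self, party, device_id):
        return (party, device_id) in self.grants

reg = IdentityRegistry()
reg.register("pacemaker-042", "patient-jane")
reg.authorize("dr-smith", "pacemaker-042")
assert reg.may_read("dr-smith", "pacemaker-042")
reg.revoke("dr-smith", "pacemaker-042")
assert not reg.may_read("dr-smith", "pacemaker-042")
```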
In an age when hackers are frighteningly intent on disrupting or manipulating everything from the electoral process to power grids, healthcare IoT is too ripe a target for them to ignore. Which is why proactive planning, rather than passivity or reactivity, is the key to fending them off and protecting patients and providers alike.
I’ve written about some of the changes coming to the workplace, workforce and overall economy because of the continued proliferation of IoT devices. One area I’ve found especially interesting as of late is how IoT is helping something near and dear to my heart: food. Well, food production technically. My father-in-law is a farmer in rural Indiana and, through conversations with him, I’ve learned about all the pressures being placed on today’s farmers. They include:
- Availability of water: One of our customers, a large seed company, is investing heavily in crop varieties that need less and less water to grow, because farmers must rely on increasingly unpredictable weather and compete with urban demand for underground aquifers.
- Land: As Mark Twain quipped, “Buy land, they’re not making it anymore.” As urban sprawl continues, the price of land continues to rise. Sadly, many farmers are being pressured to sell their land as subdivisions and businesses move in around them. If they don’t sell now, who will want to buy their property when it’s land-locked between a neighborhood and a shopping center?
- Government/environmental regulations: The federal government (and often foreign governments as well) is putting more and more restrictions on how food can be produced. This can impact things like pesticides, genetically modified organisms, land management, workforces or animal welfare. Even customers are demanding their own standards from farms these days.
- Managing pests: Controlling insects and disease is something that has plagued (pun intended) humans since the first days of farming. It takes time and money to ensure that whatever measures are used to drive away or exterminate pests are safe for the crops and those who consume them.
- Maintaining a profit: Like every business, farmers struggle with maintaining a profit in a market where prices are almost always stable or falling and expenses continue to rise.
- Increasing yields: Improving the yield (amount of output per acre) is important because it not only offsets increasing costs, but also helps accommodate an increasing population on less and less land.
While the average farmer may not be held in high regard by the layperson, the truth of the matter is that these are individuals who often run multimillion-dollar businesses and handle all the complexities and challenges that come with them.
Increasingly, both the agriculture industry and individual farmers must rely on technology to overcome business challenges. IoT is a key part of these plans — connected devices are being implemented by millions of farmers across America.
Looking at the challenges listed above, IoT and smart farming can help farmers in all these areas:
- Availability of water: Deployed field sensors can create data points that monitor things like rainfall in specific areas or water requirements for crops to help drive irrigation strategies and reduce water consumption.
- Land availability: While IoT isn’t able to control land prices, it can help make farms better neighbors. IoT technologies can look for things like disease outbreak in livestock, allowing sickened animals to be separated from the herd and treated.
- Government/environmental regulations: Increased regulation means farmers must now provide data points from farm to fork and every step in between. Capturing that data also lets local producers demonstrate compliance with various governments’ import requirements, opening their products to a wider market.
- Managing pests: Sensors can monitor and scan the environment for infestations to pinpoint pest hotspots, allowing for more targeted applications of insecticides and other pest controls. This not only controls costs, but also minimizes potential negative environmental impacts.
- Maintaining a profit: Most people have read about Silicon Valley’s self-driving vehicle obsession by now. Companies like Google, Uber and Tesla are all working on this technology, but people may not realize that self-driving technology and IoT sensors are already being applied to tractors and other farm implements. The ability to reduce the need for labor is creating direct cost savings for farmers, allowing them to spend resources on other aspects of their business instead.
- Increasing yields: As mentioned above, having access to real-time data that aids crop and animal monitoring allows a farmer to quickly identify and resolve problems, improving their overall yield. Data points that can be drilled down to a very specific location help immensely. Even tractors are helping by monitoring real-time yields as they plow, fertilize and harvest.
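The irrigation idea above can be sketched as a simple decision rule that combines a soil-moisture reading with forecast rainfall. The thresholds and field names are illustrative assumptions; a real system would calibrate them per crop and soil type:

```python
# Sketch: decide per field whether to irrigate, using a soil-moisture
# sensor reading and the rain forecast. Thresholds are assumptions.

MOISTURE_TARGET = 0.30  # volumetric water content below which crops stress
RAIN_OFFSET_MM = 5.0    # skip irrigation if this much rain is forecast

def should_irrigate(soil_moisture: float, forecast_rain_mm: float) -> bool:
    """Irrigate only when the soil is dry AND rain won't cover the deficit."""
    return soil_moisture < MOISTURE_TARGET and forecast_rain_mm < RAIN_OFFSET_MM

fields = {
    "north-40": (0.22, 1.0),      # dry, little rain coming -> irrigate
    "creek-bottom": (0.41, 0.0),  # already moist -> skip
    "south-20": (0.25, 12.0),     # dry, but a storm is forecast -> skip
}
to_water = [name for name, (m, r) in fields.items() if should_irrigate(m, r)]
print(to_water)  # ['north-40']
```

Even a rule this crude shows where the savings come from: water goes only where sensor data says it is needed, and forecast rain substitutes for pumping.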
As we demand more and more from our farmers, harnessing IoT technology for smart farming appears to be the only conceivable way they’ll be able to succeed. With a world population projected by the United Nations to be 9.8 billion in 2050 and 11.2 billion in 2100, IoT in agriculture is an absolute necessity.