IoT Agenda


September 10, 2018  12:48 PM

When worlds collide: Understanding emerging IT/OT convergence

Erick Dean Profile: Erick Dean
Big Data, IIoT, Industrial IoT, information technology, Internet of Things, iot, IoT analytics, IoT data, IT, IT convergence, Machine learning, Operational technology

Industrial IoT continues to fundamentally change the manufacturing industry, and smart companies across the globe are discovering the benefits of combining data from information technology (IT) and operational technology (OT) entities. This is known within the industry as IT/OT convergence, which involves integrating data from IT and OT systems into a single, information-driven environment.

The problem? IT and OT technologies are segmented and often managed by different teams, with IT well ahead in analytics capabilities and OT teams still exploring technologies capable of delivering that level of real-time insight. The convergence of IT and OT is an adjustment worth making for the industry, as unifying data ultimately increases efficiency and productivity in industrial environments.

The good ol’ days

Prior to the concept of IT/OT convergence, management of IT and OT entities was completely segmented. The two networks were developed independently of one another — IT was used solely for business purposes, while OT was built for monitoring industrial machinery. Different job titles were required to oversee each, without any coordination between the two. Despite the lack of integration, the divided entities functioned well on their own, as IT and OT data provided effective solutions in their respective areas.

It takes two to make a thing go right

With the rise of big data, machine learning and smarter industrial equipment, a shift in industry norms started to occur as IT/OT convergence became more advantageous. In fact, a Gartner survey reported that organizations are increasingly eager to integrate their entities, as advancements in technology have demonstrated that integrated systems produce optimized data results. So, what does that mean for organizations looking to modernize their IT and OT strategies?

In order to adapt to the needs of the industry, forward-looking organizations have begun integrating their IT and OT networks by deploying IoT sensors on machinery to monitor valves, pumps, gauges and other equipment. By converging these systems, companies are able to monitor and harness data like never before — live dashboards, anomaly detection, process automation, generation of relevant key performance indicators (KPIs) and hundreds of other features are possible with IT/OT convergence. Analyzing integrated data is cost-effective, increases transparency and efficiency, and ultimately supports worker safety.
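
To make that concrete, here is a minimal, hypothetical Python sketch of what a converged analytics layer might do with OT sensor readings: flag values that drift from a rolling baseline and compute a simple uptime KPI. The field names and thresholds are illustrative, not drawn from any particular product.

```python
# Illustrative sketch: an IT analytics layer consuming OT sensor readings.
from statistics import mean, stdev

def detect_anomalies(readings, window=10, threshold=3.0):
    """Flag readings that deviate strongly from the recent rolling baseline."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append((i, readings[i]))
    return anomalies

def uptime_kpi(status_log):
    """KPI: fraction of samples in which the machine reported 'running'."""
    return sum(1 for s in status_log if s == "running") / len(status_log)

# Example: pump vibration in mm/s with one obvious spike at the end.
vibration = [2.1, 2.0, 2.2, 2.1] * 6 + [9.7]
print(detect_anomalies(vibration))                     # [(24, 9.7)]
print(uptime_kpi(["running"] * 95 + ["stopped"] * 5))  # 0.95
```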

The good, the bad, and the ugly

While the benefits are clear, handling IT/OT convergence is no easy undertaking — this change is complex and requires a number of criteria to execute effectively. Since IT and OT departments are often not cohesive, it can take a village to create a single, well-oiled machine. It doesn’t help that the operational maturity of IT and OT differs dramatically — IT is a well-established, analytics-rich field, while OT, though advanced on the equipment side of the house, is considered the laggard in analytics technologies. Combining the two is a daunting task, and there is no established best practice to follow since IT/OT convergence is an emerging concept.

So, how can you integrate systems without disrupting current operations?

  1. Educate yourself on IT/OT convergence and the challenges and benefits that come with the change. Putting your leaders in positions where they feel enabled to take on a change can lead to higher success rates.
  2. Communication with key players is essential — make this unification as simple as possible by being transparent and open to adjustment. Because IT and OT entities are vastly segmented, effective internal communication between teams is essential to ensure success.
  3. Create a plan where all parties refer and contribute as needed. Make effective use of this plan by reviewing it often with team members from the top and the bottom of the team organizational chart. Most importantly, evaluate and make changes to your plan as needed.
  4. Be open to change and accept new processes and trainings that will inevitably come. Keep a positive mindset and work together to make this change work. While this change is elaborate, it is both inevitable and necessary given the rapid evolution of the industry — so while it is no easy feat, embrace the shift and adapt to challenges as they come.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

September 10, 2018  11:50 AM

Predictive maintenance can make vehicle recalls a thing of the past

Ruban Phukan Profile: Ruban Phukan
ai, Artificial intelligence, Cognitive computing, Connected car, Connected vehicles, IIoT, Industrial IoT, Internet of Things, iot, Machine learning, manufacturers, Manufacturing, Predictive Analytics, Predictive maintenance

With 64 million units taken off the market, 2014 was one of the worst years for the automotive industry in terms of the number of vehicles recalled. That the number dropped to 10 million in 2017 was an improvement, but still no cause for celebration. In March 2018, Tesla recalled 123,000 Model S cars over faulty steering components, and shares of the company fell 4% right after the announcement. Vehicle recalls continue to make headlines, costing car manufacturers millions of dollars and great reputational damage. This is all despite the fact that the automotive sector is one of the industries where manufacturing processes are most stringent in terms of quality checks, regular maintenance and monitoring.

With technology turning vehicles into platforms of innovation through advances in security, efficiency and computing performance, it’s time for manufacturers to become “smarter” when it comes to deploying the right technologies throughout the production and post-purchase lifecycle. This is where AI, machine learning and predictive analytics can enable manufacturers to gain full visibility and control of manufacturing processes.

Predictive maintenance from the factory floor to the field

They say the best time to correct an error is yesterday, and that holds true in the automotive sector. While AI and machine learning cannot — yet — turn back time, cognitive technologies can analyze data in ways that were previously unattainable. Manual modeling of past failure patterns by data scientists is nothing new, but AI-powered platforms build cognitive learning that not only learns from past failure patterns but, more importantly, learns to detect issues never known or seen before. This matters because the majority of issues in the manufacturing process that eventually cause a recall are not repeat issues but new ones. Cognitive applications can teach themselves, from sensor data, the various normal operating conditions and the environmental influences on a machine at a fine-grained level, going beyond the macro-patterns the human brain tends to spot. This means micro-anomalies and small changes that would slip through the quality check process can now be identified automatically as they occur and responded to before they lead to current or future defects. In this way, breakdowns and faults can be predicted before they occur, spark recalls or cause downtime.
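
As a rough illustration of the unsupervised idea described above (learning what “normal” looks like from the data itself, so that never-before-seen issues still stand out), here is a minimal sketch using scikit-learn’s IsolationForest on synthetic sensor data. It stands in for the cognitive platforms described here; it is not any vendor’s actual method.

```python
# Hedged sketch: learn "normal" operation from sensor data alone, so
# deviations are flagged even if the failure mode has never been seen.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Normal operation: temperature (C) and vibration (mm/s) cluster tightly.
normal = rng.normal(loc=[70.0, 2.0], scale=[1.5, 0.2], size=(500, 2))

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

samples = np.array([[70.5, 2.1],    # typical reading
                    [75.0, 3.4]])   # drift outside the learned envelope
print(model.predict(samples))       # expected [ 1 -1 ]: 1 normal, -1 anomaly
```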

To be most effective, predictive maintenance systems should be deployed across different production touchpoints. During the manufacturing process, cognitive predictive maintenance can identify and share alerts on in-line defects, which can then be corrected early on, while providing information to industrial engineers on the nature of the defect. In this way, defects are addressed before the product reaches the market and costly recalls can be avoided. This can also dramatically reduce scrap and rework in the product lines and thereby save significant costs too.

With industrial IoT and sensors, we now have the ability to understand the health of a machine in minute detail. This is achieved through the creation of digital twins, granular digital representations of the parts and the dynamics in the manufacturing process, which are enabled by machine learning and artificial intelligence. These digital models of physical assets provide a representation of their physical counterpart’s current status, working condition and functionality to help understand and predict the health and readiness of these parts. By creating digital twins, insights can be garnered to automatically detect and address the tiniest of issues that would otherwise be missed during a manual inspection process.
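
At its very simplest, a digital twin can be sketched as a software object that mirrors one physical part and is updated from sensor readings. The class below is a hypothetical toy with made-up names and thresholds; production twins model far richer state, physics and history.

```python
# Hypothetical toy digital twin: mirrors one physical part, updated from
# sensor readings and queried for a health verdict. Names are made up.
from dataclasses import dataclass, field

@dataclass
class BearingTwin:
    part_id: str
    max_temp_c: float = 90.0           # rated limit from the spec sheet
    history: list = field(default_factory=list)

    def update(self, temp_c: float, rpm: float):
        self.history.append((temp_c, rpm))

    def health(self) -> str:
        if not self.history:
            return "unknown"
        latest_temp, _ = self.history[-1]
        margin = (self.max_temp_c - latest_temp) / self.max_temp_c
        return "ok" if margin > 0.15 else "inspect"

twin = BearingTwin("bearing-42")
twin.update(temp_c=71.0, rpm=1800)
print(twin.health())   # "ok": comfortable margin under the rated limit
twin.update(temp_c=84.0, rpm=1800)
print(twin.health())   # "inspect": thermal margin below 15%
```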

Once the vehicle goes to market, AI and predictive maintenance can work their magic. Service engineers can use cognitive analysis of multiple sources of data collected from sensors and service records to identify and solve problems in vehicles well before the “check engine” light goes on, eliminating safety risks and costly repairs and delivering a superior customer experience. Knowing about problems ahead of time also helps engineers plan the inventory of parts required for maintenance. Both the driver and the service engineers are empowered with proactive alerting systems that flag potential problems before they escalate into safety risks or costly damage, moving beyond schedule-based preventative maintenance to need-based, timely predictive maintenance. Manufacturers can also gather useful insights from field data to improve product design as well as the manufacturing process.

Cognitive-first technology saves the day for the automotive industry

Predictive maintenance is not a layer of monitoring and checks that is added on current control systems. It is, in fact, an integrated cognitive and machine-first technology that runs end to end in the manufacturing and post-purchase lifecycle, ensuring that these processes can run like clockwork. With smart and connected cars changing the way we see vehicles, automotive manufacturers need to embrace what AI and machine learning have to offer to break a new record — that of zero car recalls!

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


September 7, 2018  4:42 PM

IoT security at Black Hat 2018: The insecurity of things

Ofer Amitai Profile: Ofer Amitai
Black Hat, Cloud Security, Internet of Things, iot, iot security, medical device security, security in IOT, smart city, vulnerabilities

It is a long-known fact that most IoT manufacturers neglect security while designing their devices and machines. To understand this issue further, I recommend catching up on the 2015 Jeep hack and the St. Jude cardiac device hacks that started occurring in 2014. The St. Jude hacks proved that even companies dedicated to life-saving technologies often neglect to build the necessary security measures into them. The same neglect was clearly on display in the massive October 2016 Dyn cyberattacks that affected large chunks of the United States and Europe. And this year, we found out from the FBI that we lost control of our home routers to Russian hackers.

At Black Hat 2018, I attended, along with many other cybersecurity professionals, a few jaw-dropping demonstrations of crucial IoT security flaws. One of these demonstrations showed ATM break-ins. One might expect a machine containing money to have a more robust security system protecting the cash inside, and yet the machines were broken into right before my eyes. I also attended demonstrations of hacks into crucial medical devices and medical networks containing patient data, systems that are instrumental in keeping people alive.

It was astonishing to find out that companies manufacturing medical devices, such as implants, insulin therapy devices (radio-based devices) and pacemakers, completely ignore current security research. One example of this research is the extraordinary work done by Billy Rios and Jonathan Butts (in their free time, I might add), in which they discovered many IoT vulnerabilities. This research will no doubt make our world a much safer place.

It was no less appalling to discover the deep contrast between cloud security standards and IoT security standards — or rather, the lack thereof. Cloud-based enterprises apply major security standards such as SOC 2 to ensure the security of cloud infrastructure, turning certain working procedures into standard requirements for all. When it comes to IoT devices, meanwhile, we are living in the proverbial Wild West: there are currently no official industry security standards for IoT. In the healthcare industry, physicians prescribing the use of these devices have no understanding of their lack of security — and I don’t believe that they should be required to. At this point in time, however, it is a life-preserving piece of information to know that these devices have feeble security mechanisms in place and are therefore targets for hackers.

During the Black Hat conference, IBM X-Force Red researchers described 17 different IoT vulnerabilities, including nine critical flaws in four common smart city devices. Daniel Crowley, the research team lead, said that his team had been exploring vulnerabilities that could open doors to “supervillain” attacks. After infiltrating incident command system components, smart car devices and other IoT connections, X-Force researchers found that a certain device was open to attack via the internet and another by cracking hardcoded credentials. At Black Hat, the team demonstrated an exploit of an IoT gateway connected to a dam, resulting in a flooded road.

Notwithstanding the fact that the vulnerabilities included in the abovementioned research have been patched by manufacturers, Crowley cautioned regarding the lack of IoT security standards with many other manufacturers and advised organizations to carefully research IoT risks before adopting new technologies.

All of this is taking a positive turn. IJay Palansky, a trial lawyer in Washington, D.C., and partner with Armstrong Teasdale, spoke about the 220,000-member federal class action suit related to the Jeep hack disclosed in 2015 by researchers Chris Valasek and Charlie Miller. Palansky is the lead counsel in the suit, the first IoT-related lawsuit to be launched against a company, and discussed it in his Black Hat presentation. Valasek and Miller remotely connected to a 2014 Jeep Grand Cherokee through its connected entertainment center and took control of the car’s steering and brakes.

“IoT products have certain characteristics; they have a wide variety of code that is often proprietary and makes detection and patching of code more difficult,” Palansky said. He advised organizations to “be paranoid and allocate risk. There needs to be a clear process involving hazard identification, design response, risk assessment and testing.”

The impressive part of this lawsuit is that while no car was damaged or controlled by attackers beyond the proof of concept, there is still a legal basis on which to build the case. Even if FCA US LLC, Jeep’s brand owner, successfully defends itself on the question of damages, the case will cost the company dearly in reputation and in dollars lost.

This lawsuit should be seen as a clear warning to companies manufacturing IoT devices while simultaneously ignoring security vulnerabilities. This practice will no longer go unnoticed. Manufacturers will have to take responsibility for securing these devices or face the consequences. Hopefully, we are at the beginning of a new security revolution for IoT devices, leading eventually to a healthier and device-secured world.

The IoT security vulnerabilities presented at Black Hat showed beyond any doubt that infiltrated medical monitoring devices or scrambled commercial aircraft communications would create dire consequences. The many vulnerabilities in smart city IoT technologies highlighted the widespread fear of what X-Force’s Crowley nicknamed “supervillain attacks” — state-sponsored attacks with the potential to significantly disrupt human life and safety in increasingly connected communities.

Clearly CISOs are taking notice of these IoT security risks, and I believe that these vulnerabilities should be handled by a combination of organizations implementing security best practices along with IoT manufacturers adhering to basic industry security standards.

Together with the understanding that IoT device manufacturers must embrace security standards, organizations must control the risk by implementing visibility platform technologies, network segregation, authentication enforcement, compliance enforcement, encryption and network access controls. The current “insecurity of things” does not have to remain that way — and I see tremendous improvement potential in the future of IoT security.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


September 7, 2018  12:45 PM

The unlikelihood of ‘IoT, secure by design,’ and what we can do about it

Alec Rooney Profile: Alec Rooney
Automatic Updates, Botnet, DYN, Internet of Things, iot, IoT devices, iot security, Network security, Passwords, patch, patching, Regulations, Updates, vulnerabilities, WannaCry

The concept “secure by design” encompasses a number of recommendations to make IoT devices more secure. The most important of these is the ability for the user to update the firmware on these devices. Unfortunately, this recommendation does not go far enough, as evidenced by user behavior in the WannaCry ransomware attack. Instead, IoT devices must be backed by a service that updates their firmware and monitors their behavior; but this is an expectation that fundamentally clashes with IoT manufacturers’ business models and operations.

In October 2016, Dyn fell victim to a massive distributed denial-of-service attack, among the largest seen at the time. The source of all that traffic was mostly tens of thousands of webcams that had been infiltrated by the Mirai botnet. We recently analyzed the webcam models involved in that attack and discovered that most of those manufacturers are still selling the same camera models and continuing to support them. However, the majority of those cameras do not support automatic updates.

Why is this significant? In March of this year, the UK government published a report, “Secure by Design: Improving the cyber security of consumer Internet of Things.” In that report, the authors list 13 recommendations for the industry to adopt in order to make IoT devices more secure. The three most important recommendations are:

  1. No default passwords
  2. Implement a vulnerability disclosure policy
  3. Keep software updated

Most of the devices that participated in that 2016 Dyn attack were infiltrated because of default passwords, so we can see the reason for recommendation number one above. However, we know that the latest generations of malware and botnets have gotten much more sophisticated: They are now not only targeting default credentials, but also targeting known vulnerabilities.

A recent example of one such sophisticated attack was VPNFilter, wherein the primary attack vector was through known vulnerabilities in router and network-attached storage devices. That is why recommendation number three is so important: We know that no matter how secure we think the software we release today is, the odds are that a security vulnerability will soon be discovered in that software or, more likely, in one of the libraries we have used to build that software.

Still, giving a user a way to manually update the software (including firmware) on an IoT device is not enough.

On our network, we have identified hundreds of devices from major manufacturers that are vulnerable to the libupnp buffer overflow vulnerability discovered by Rapid7 in 2012. The vulnerability was in the underlying libupnp library and was patched by January of 2013; yet, more than five years later, we still see vulnerable devices! The bottom line: Many of these devices do have firmware updates available, but manually applying those updates is hard.

For example, how many Samsung TV users would know that they needed to initiate the update process for their Samsung Smart TV? For routers, it is even worse. The typical update instructions require you to:

  1. Determine which model you have. (If you get it wrong you could damage your router — oh, and there could be a specific hardware version, too.)
  2. Find the firmware for that model on the router manufacturer’s website. (If you get this wrong, you could damage your router.)
  3. Download the firmware locally to your computer.
  4. Unzip the file.
  5. Go to your router user interface, for example, http://192.168.0.1 (Does the average person even know how to do that?)
  6. Log into your router. (Does anyone know the credentials to log into your router? Oh, and they shouldn’t be fixed and available in the manual; otherwise, that violates rule number one!)
  7. Go to the firmware update page on your router.
  8. Upload the firmware file from your computer to your router and upgrade. (Oh, and most manufacturers don’t recommend doing it over wireless; who has an Ethernet port on their laptop these days?)
  9. Finally, this upgrade may reset all your router settings back to the factory settings, so you must be ready to reconfigure the router with your SSID and password.

So what can be done? The only answer: automatic updates for the lifetime of the device. IoT devices need to be managed by a back-end service that ensures security issues are patched as soon as a patch is available. That service should also monitor the behavior of the IoT device to affirm it’s operating within the bounds of its normal parameters.
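
To illustrate what the device side of such a back-end service might look like, here is a minimal sketch: poll an update endpoint, compare versions and verify the image digest before flashing. The URL and manifest fields are hypothetical, and a real updater would also verify a cryptographic signature on the manifest, not just a digest.

```python
# Hypothetical device-side update check: poll a manifest, compare versions,
# verify the image digest before flashing.
import hashlib
import json
import urllib.request

DEVICE_VERSION = "1.4.2"
UPDATE_URL = "https://updates.example.com/router-x100/manifest.json"

def check_and_fetch():
    with urllib.request.urlopen(UPDATE_URL) as resp:
        manifest = json.load(resp)   # e.g. {"version", "url", "sha256"}

    if manifest["version"] == DEVICE_VERSION:
        return None                  # already up to date

    with urllib.request.urlopen(manifest["url"]) as resp:
        image = resp.read()

    # Refuse to flash anything whose digest doesn't match the manifest.
    if hashlib.sha256(image).hexdigest() != manifest["sha256"]:
        raise ValueError("digest mismatch; aborting update")
    return image
```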

Now, for the next question, are consumer electronics companies capable of building and maintaining an effective back-end service? I think not. While established consumer electronics companies are now enabling their products to connect to the internet, they lack the expertise to build back-end services that can scale to millions of IoT devices. In addition, their current business model of selling an IoT device for a one-time-low-margin-profit does not support the cost of running a back-end service for the lifetime of that device.

Due to consumer focus on security, regulatory pressure and standards-based initiatives (like the Open Connectivity Foundation), we can expect companies to release more and more IoT devices that are fully managed by a back-end service. However, this process will likely take years, and even then, there will still be millions of insecure devices in homes.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


September 7, 2018  11:19 AM

IoT edge computing: Let’s get edgy

Jason Shepherd Profile: Jason Shepherd
CDN, Cloud Computing, Data Center, Distributed computing, Edge computing, FOG, fog computing, GATEWAY, Internet of Things, iot, IoT devices, IOT Network, ISP, telcos

This is the third of a five-part blog series. Read part two here.

Continuing my series on digital transformation, I now want to turn my attention to the network edge. You know, when you look at volume projections for connected devices versus how most people are doing things today, there’s a pretty major disconnect.

Plethora of online devices

A few years back, we hit a point where there were more devices online than people. In fact, it’s estimated that in 2019 we’ll cross over to more “things” online than traditional end-user devices. By “things,” I mean devices generally not associated with an end user and “headless” in operation. Estimates vary widely on the total number of things over time, but in any event, it’s starts-with-a-B big. Clearly, these things represent new actors on the internet and a huge catalyst for digital transformation.

Tick tock

Looking back at the history of computing, the pendulum inevitably swings every 10 to 15 years between centralized and distributed models. Given the sheer volume of networked devices going forward, it’s inevitable that we need distributed architectures because it’s simply not feasible to send all data directly to the cloud. In fact, I think distributed is here to stay for quite some time, if not from here on out.

When in doubt, talk cat videos

Confession time — my wife and I have three cats. Sad but true — we bought phones with higher capacity storage purely to have space for all our cat pictures and videos and send them back and forth. But, who doesn’t like a good cat video? A colleague of mine, Greg, came up with this cat video analogy to describe how things are different with IoT workloads.

When you upload a cat video to a video-sharing service, their servers stream that video down to other people. This content may need to be cached on multiple servers to support demand, and if it goes viral with millions of people wanting to access it, it may then be moved to servers at the provider’s cloud edge — as close to end users as possible to minimize latency and bandwidth consumption on their network. In a nutshell, this is the concept of multi-access edge computing.

However, with IoT you now have millions of devices that might want to hit the same server, all wanting to stream data. This turns the whole paradigm upside down and new architectures are needed to buffer that data so only meaningful traffic is going across networks. We need better ways to support all those connected cat collars out there.
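
As a toy illustration of that buffering, here is a sketch of a gateway-side filter that forwards a reading upstream only when it changes meaningfully; everything else stays local. The class and threshold are hypothetical.

```python
# Toy gateway-side filter: forward a reading only when it moves by more
# than min_delta since the last forwarded value; drop the rest locally.
class EdgeFilter:
    def __init__(self, min_delta: float):
        self.min_delta = min_delta
        self.last_sent = None

    def ingest(self, value: float):
        """Return the value if it is worth forwarding upstream, else None."""
        if self.last_sent is None or abs(value - self.last_sent) >= self.min_delta:
            self.last_sent = value
            return value
        return None

gw = EdgeFilter(min_delta=0.5)
samples = [20.0, 20.1, 20.2, 20.9, 21.0, 21.6]
sent = [v for v in samples if gw.ingest(v) is not None]
print(f"forwarded {len(sent)} of {len(samples)} samples: {sent}")
# forwarded 3 of 6 samples: [20.0, 20.9, 21.6]
```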

Gateways aren’t just for turning A into B

So, enter the concept of the edge gateway. Gateways aren’t just about enabling connectivity, protocol normalization (A to B) and data aggregation — they also serve the important functions of buffering and filtering data as well as applying security measures. In fact, Gartner estimates that 90% of traffic will go through edge gateways.

There are key technical reasons for increased edge computing:

  • Latency — I don’t care how fast and reliable your network is; you just don’t deploy something like a car airbag from the cloud.
  • Bandwidth — There’s an inherent cost associated when moving data, which is especially bad when transporting over cellular, and even worse via satellite.
  • Security — Many legacy systems were never designed to be connected to broader networks, let alone the internet. Edge nodes like gateways can perform functions such as root of trust, identity, encryption, segmentation and threat analytics for these as well as constrained devices that don’t have the horsepower to protect themselves. It’s important to be as close to the data source as possible so any issues are remedied before they proliferate and wreak havoc on broader networks.

The kicker — the total lifecycle cost of data

However, beyond those technical reasons there’s also the consideration of the total lifecycle cost of data. People starting with a Pi and the sky (a prototype board talking straight to the public cloud) often realize that chatty IoT devices hitting public cloud APIs can get super expensive. And on top of that, you then must pay to get your own data back.

A few years back, Wikibon published a study of a simulated wind farm 200 miles from a cloud data center, promoting a balanced edge-plus-cloud approach. The results showed that the approach reduced total operating cost by over 60%, assuming a 95% reduction in traffic passed over the network thanks to edge processing.

Many edges

Of course, it’s not just about edge gateways. There are many different edges:

  • To a telco, the “edge” is the bottom of their cell towers. You might have noticed that MEC originally stood for mobile edge computing but was recently changed to multi-access edge computing, and for good reason, because it’s not just about mobile devices. 5G has a key tie-in here, and of course I like MEC because it helps serve up my cat videos faster!
  • To an ISP or content delivery network provider, the edge is their data centers on key internet hubs — they might call this the “cloud edge.”
  • Of course, there are also traditional on-premises data centers, plus we’re seeing an ever-increasing rise of micro-modular data centers to get more server class compute closer to the producers and consumers of data. Then comes more localized servers including hyper-converged infrastructure and edge gateways sitting immediately upstream of sensors and control systems. All different types of edges.
  • And to an OT professional, the edge means the controllers and field devices (e.g., sensors and actuators) that gather data from the physical world and run their processes.

In effect, the location of edge computing depends on context; the net effect is moving compute as close as both necessary and feasible to the users and devices that need it.

The fog versus the cloud

And to ensure that my buzzword count is solid in this post, there’s the term fog, which is really all the edges combined with the networks in between — effectively everything but the cloud, or “the things to cloud continuum.”

The bottom line is that regardless of how we label things, we need scalable systems for these distributed computing resources to work together along with public, private and hybrid clouds. There are also no hard and fast rules as to who owns which environment between OT, IT or otherwise — we just need to make sure technologies meet needs in all cases.

Watch out for my next blog, which talks about ways to get past the AOL stage to advanced-class IoT. In the meantime, I’d love to hear your comments and questions.

Keep in touch. Follow me on Twitter @defshepherd and @DellEMCOEM, and join our LinkedIn OEM & IoT Solutions Showcase page here.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


September 6, 2018  12:51 PM

Pinpoint IoT PCB wire bonding reliability

Zulki Khan Profile: Zulki Khan
Internet of Things, iot, IoT hardware, PCB, wire, wiring

IoT devices are spreading into a variety of markets well beyond the consumer area. These small to medium-size devices are finding their niche in industrial, military and aerospace, medical and other associated areas.

In such mission-critical applications, the top requirements are for reliability and ruggedness. IoT devices must be well-built and able to withstand considerable wear and tear, while still being able to operate efficiently over long periods of time.

Wire bonding reliability is at the heart of these demands. As noted in my last blog, wire bonding is increasingly being used in IoT printed circuit boards (PCBs). It’s used to connect bare chips or dies to a substrate or to a small rigid or rigid-flex board. Device packaging takes up too much valuable space on these tiny boards, hence bare chips must be placed and connected directly on the board.

There are 10 key principles that must be adhered to with pinpoint accuracy in assembly and manufacturing so that IoT PCB wire bonding maintains high reliability. Some are interrelated. However, as standalones, each one is critical.

  1. First and foremost, metallization is extremely important. This means creating a bond between the substrate on the PCB and the die. If the surfaces aren’t clean and free of oxidation, the strength of the wire bond is highly questionable.
  2. Different nuances are associated with gold, aluminum, copper and silver wiring. There must be assurances from wire bonding and PCB assembly experts that using a certain type of wire is compatible with a given application.
  3. Flip chips or chip scale packaging must be mounted properly on the board for wire bonding to work, and must be securely attached before wire bonding starts. If not, the created joint will not be reliable. When pull strength testing is applied to the wire bonded joint, the wire will come off.
  4. Virtually all wire bonding requires a fixture to lay the substrate flat. Knowledge and expertise are critical here for precisely creating those fixtures for ultra-fine devices to be wire bonded. Tolerances must be extremely tight to properly hold the substrate before it gets wire bonded.
  5. An experienced programmer is required to deal with an array of wire bonding variables. That programmer must also be savvy enough to perform wire bonding accurately and calculate and verify bonding strengths, bond integrity and bond placement before doing the actual bonds.
  6. Get assurances the electronics manufacturing services provider knows to perform plasma cleaning using argon gas to clean substrates as needed. The surface of those substrates must be 100% defect- and oxidation-free for perfect wire bonding.
  7. The chip and the wire itself must be 100% clean and free of oxidation. Otherwise, a wire bond joint will be unreliable.
  8. A fixture must be used to hold the substrate rigid, especially for flex circuits.
  9. An experienced wire bonding technician who knows the fixture ensures that the substrate remains rigid and co-planar, avoiding bends, twists and/or uneven surfaces.
  10. Most often, gold wiring is used. A caveat when using gold wire is to ensure stabilized gold is used. Stabilized gold has a different chemistry, with beryllium mixed into the gold to increase the life of the gold wire.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


September 6, 2018  11:21 AM

The ‘Jurassic World’ of the internet of things

Francisco Maroto Profile: Francisco Maroto
ai, Blockchain, Digital transformation, Internet of Things, iot, Partnerships, transformation

Those who regularly read my articles know that I like movies and TV series. Just remember my article, “About IoT platforms, super powers methodology, superheroes and supervillains.”

This time my article is dedicated to the two trilogies: Jurassic Park and Jurassic World (the latter still pending the third movie).

Sure, millions of years have not passed since the appearance of the first telemetry species and its evolved machine-to-machine cousins. But tempo in technology is measured differently: the unit of time for tech has to do with the Gartner Hype Cycle. For Gartner, technologies pass quickly from “innovation trigger” to “productivity.” Companies that want to appear in a Gartner Magic Quadrant have to adopt these technologies successfully or they are condemned to disappear.

Large companies that have been in the IT world for more than 15 years may seem like dinosaurs, afraid of disappearing because of a meteorite (IoT, metaphorically).

Here I present some technology companies that we could consider dinosaurs undergoing a cloning, or transformation, to adapt to the new world of IoT, AI or blockchain. As usual in this type of article, the choice of companies and their classification are subjective. Therefore, not all dinosaurs are represented (47 species of cloned animals have been portrayed in the novels and films), nor can every company be captured by the dinosaur I have chosen for it.

Welcome to my “Jurassic World” of IoT.

The threats to these cloned dinosaurs are constant. Despite a dinosaur’s size and strength, many predators lurk to take down the giant — take Uptake’s digital safe package over GE Digital, for example.

Some species go in packs to survive — just look at Google and Ayla Networks or Microsoft and Electric Imp — while others seek alliances with cousin giants, like Rockwell Automation taking a $1 billion stake in software maker PTC, as the best way to reign in their territory.

There are also dinosaurs in the “Westworld” of the telcos, as well as in the world of industrial companies that are adapting or cloning — but that is another story.

Your comments and suggestions may change my “Jurassic World” table of IoT.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


September 6, 2018  10:22 AM

Professional services industry and IoT: The tech problems this industry faces

Josh Garrett Profile: Josh Garrett
algorithm, Authorization, Automation, breach, cloud, Data, digital, Encryption, firewall, Internet of Things, iot, IT, Mobile, Network, SaaS, Security, service provider, Services, Strategy, Technology

Today, more than half — 59% — of the professional services sector doesn’t have a single digital business strategy. For most professional services organizations, that means IoT initiatives are, even in best-case scenarios, still stuck in the earliest stages of adoption.

And, since many of these companies feature small employee headcounts, a large portion of the professional services industry is still struggling to implement the internet of things. Unfortunately for these enterprises, a lack of technical expertise and/or limited IT resources makes it difficult to understand the business value of an IoT system — much less successfully deploy one.

For now, professional services organizations perform most data management tasks within company-owned data centers and public cloud storage networks. However, as the desire for real-time trends and insights continues to grow, the industry will become more and more motivated to make IoT work, because the technology can perform these activities at the edge of enterprise networks instead. In fact, almost half of the industry already uses such flexible technologies to perform IoT data analysis, aggregation and filtering closer to each original data collection source.

As IoT penetrates professional services, expect this sector’s tech-savvy leaders to use endpoints for more than just an expanded variety of network edge tasks — many organizations will look to create competitive advantages with these endpoints, too. If this industry’s enterprises can overcome the significant investment IoT requires, don’t be surprised to see automated facilities maintenance and supply chain workflows within the next year or two — not to mention other business benefits like brand-new market insights, more efficient processes, additional sales opportunities and more data feedback than ever regarding client habits and product/service quality.

Before any professional services business can capitalize on the value IoT creates, however, it must overcome five significant challenges:

Outdated strategy

Like any other major mobile technology deployment, IoT gives professional services organizations a variety of new endpoint devices to use every day at work. While this creates obvious benefits for the business implementing these innovations, it also opens new avenues for cyberattackers to invade corporate networks. If a company invests in IoT, it’s important that it also reexamines all existing digital security strategies to make sure they’re up to date and include systems to deal with IoT’s most serious threats.

Separate systems

For professional services IT leaders, IoT isn’t as scary as one might think. In fact, many organizations already use most — if not all — the data collection systems IoT needs. The hurdle for these enterprises is centralizing these systems, which have almost exclusively been maintained on separate networks to this point. So, the industry doesn’t need to start from scratch to assemble its IoT infrastructure — it just needs to find an efficient way to unify existing systems and streamline data communications across a single network.

Lack of resources

Some of the sector’s largest, resource-rich companies can construct their own customized, centralized IoT management platforms. For everybody else, IoT isn’t quite that straightforward. Fortunately, affordable cloud-based SaaS offerings can help. Professional services enterprises are already using these technologies to integrate disparate systems, enhance IoT security and perform data analytics in a cost-efficient manner.

Data communication

While implementing an IoT program is impressive, it’s only the first step of these advanced projects. The more difficult challenge is collecting and communicating the vast amount of raw data in a way that satisfies organizational stakeholders and benefits the overall business. After all, even something that appears to be the most inconsequential data point possible could unlock and fuel future innovation if it’s combined with another data set or handed off to the right employee.

Scalable security

After IoT has been implemented and integrated into existing enterprise workflows, program security and endpoint safety become paramount. These advanced mobile technologies not only require multiple layers of protection, but also need to be flexible enough to grow and evolve as the needs of professional services companies change. To protect sensitive data from hackers and cybercriminals, ensure your IoT initiative uses these four tools:

  1. Firewall — Firewalls have been used in enterprise technology for years and are considered an essential part of today’s digital security efforts. By blocking unauthorized network access in a way that still allows external communication, these advanced products are designed to work seamlessly with the millions of unique IoT device types out there.
  2. Encryption — Data is too valuable in today’s digital business environment to leave unprotected — whether it’s at rest on a mobile device or being transported across a global network. Encryption algorithms provide that protection by scrambling IoT endpoint feedback, rendering it useless to any recipient that isn’t authorized to hold a decryption key (see the sketch after this list).
  3. Authorization — Before the first IoT device is ever deployed, organizations need to have a policy in place to control user behavior and the inevitable surge of mobile network traffic. Professional services workplace policies should clearly define user permissions and company-wide systems access permissions. Some of today’s most advanced authorization projects even include outlined task management automations for future implementation initiatives.
  4. Network segmentation — Because IoT requires multiple networks and interconnected enterprise systems, a single external breach carries the potential to create problems for a wider-than-ever range of connected business endpoints. Companies should review how their IoT infrastructure is connected and supported to intentionally separate systems wherever critical technologies, systems and devices are housed. That way, these mission-critical mobile endpoints are left unaffected even in the most widespread intrusion attempts.
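
As a minimal sketch of the encryption point above, here is what authenticated encryption of an IoT payload can look like using the Python cryptography package’s Fernet recipe. Key handling is deliberately simplified; in practice, keys belong in a secrets manager or hardware security module, never alongside the data.

```python
# Minimal sketch of encrypting an IoT payload with the "cryptography"
# package's Fernet recipe (AES-based authenticated encryption).
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice: per device or tenant,
cipher = Fernet(key)                 # held in a secrets manager or HSM

reading = b'{"device": "hvac-07", "temp_c": 21.5}'
token = cipher.encrypt(reading)      # safe to store or transmit

# Only a holder of the key can recover (and tamper-check) the payload.
assert cipher.decrypt(token) == reading
```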

As the professional services sector begins to increase its IoT investments, enterprises need to prepare for the myriad of strategy and security risks this technology can create. A managed mobility software partner not only eliminates this burden from any company considering an advanced IoT deployment, but also increases the likelihood of impactful and successful initiatives going forward.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


September 5, 2018  2:08 PM

Enhancing the customer experience by automating processes and outcomes

Jason Kay Profile: Jason Kay
Business model, business-outcomes, Collaboration, Customer satisfaction, Digitization, Enterprise IoT, Internet of Things, iot, IT, IT convergence, Legacy systems

The growth of the IoT industry is impressive, with examples of new use cases continuing to develop almost daily. However, although the industry may be booming, if organizations move too far away from the objectives digitization was designed to address — enhancing day-to-day experiences — the market risks stagnation and undoing the rapid progress that has been achieved so far.

Jason Kay, CCO at IMS Evolve, explains that while a priority for IoT is creating more efficient processes, ultimately those efficiencies should also address issues that are key to the customer experience. From food quality to environmental standards and minimizing product waste, these objectives will not be met solely by automating processes, but by automating outcomes.

Digitization progress

There are several possible theories as to why businesses are not making the projected progress toward successful digitization. Overwhelmed by the number of choices when it comes to innovative technologies? Projects with unrealistic goals? A belief, encouraged by some IT vendors, that digitization requires high-cost, high-risk rip-and-replace deployments? Although worthwhile considerations, these are the symptoms, not the cause. The real reason for the lack of progress toward effective digitization is that the outcomes businesses are working toward have no correlation with core business purposes.

Isolated digitization projects with siloed objectives deliver limited value that can be easily absorbed by unavoidable business costs shortly after project completion. These short-term enhancements make ROI difficult to prove and make projects highly vulnerable under C-suite scrutiny.

In order to accelerate the digitization of an organization, a new approach is required. Although disruptive strategies may appear threatening to current business models, when these projects are executed correctly they not only deliver aggressive ROI and efficiency gains, but present the opportunity to explore new revenue streams and business models. By prioritizing the core objectives of the organization, tangible business value aligned with clearly defined outcomes can be identified, as well as opportunities to reduce costs.

Digitization collaboration

In some ways, the IT industry has inadvertently incited this situation by offering new technologies that on face value appear forward-thinking and innovative, such as artificial intelligence and augmented reality, but don’t deliver tangible value to customers. IoT systems and propositions with weak ROI, because they require significant investment and a total infrastructure rebuild that massively disrupts day-to-day business, also waste valuable time and money. The resulting confusion makes it a challenge for businesses to establish viable, deliverable and future-proof digitization strategies. Furthermore, if the focus does not progress beyond single, process-led goals, this confusion will only continue and the perception of IoT technologies will deteriorate.

Through business-wide collaboration that focuses on the organization’s primary objectives, the true potential of digitization can be achieved. Without this outcome-led approach and with a misplaced focus on digitization projects that fail to add up to a consistent strategy, organizations will not be able to capitalize on the opportunity to use existing infrastructure to drive business value.

Consider the deployment of an IoT layer across refrigeration assets throughout the supply chain to monitor and manage temperature. A process-based strategy would prioritize creating efficiencies, potentially using rapid access to refrigeration monitors and controls, together with energy tariffs, to lower costs and energy consumption. However, limiting an IoT project to one single energy-reduction initiative may fail to demonstrate the full potential of ongoing benefits to management.

By collaborating with multiple teams, such as food safety and compliance, maintenance and merchandising, the scope of the technology can drastically increase. Real-time data from refrigeration assets can be contextualized with merchandising data to automatically match case temperatures to the specific produce type within the case. With produce kept at optimum temperatures, shelf life improves, waste falls and basket size may even increase thanks to better aesthetics and availability. In another example, real-time monitoring of refrigerated asset performance can inform and improve maintenance productivity, moving from reactive to predictive regimes, again improving asset availability and performance and, therefore, product quality and customer experience.
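
A toy sketch of that contextualization: merchandising data says what is in the case, and live readings are checked against a produce-specific target. The temperatures, names and tolerance below are illustrative, not IMS Evolve’s actual logic.

```python
# Toy sketch: pick a temperature target from merchandising data and
# check the live reading against it. Values are illustrative only.
OPTIMAL_TEMP_C = {"leafy_greens": 1.0, "berries": 0.5, "bananas": 13.0}

def check_case(case_contents: str, reading_c: float, tolerance: float = 1.5):
    """Compare a live case reading against the produce-specific target."""
    target = OPTIMAL_TEMP_C[case_contents]
    if abs(reading_c - target) > tolerance:
        return f"adjust setpoint to {target}C (reading {reading_c}C)"
    return "within range"

# Merchandising says case 12 now holds berries; an old setpoint lingers.
print(check_case("berries", reading_c=4.2))   # adjust setpoint to 0.5C ...
print(check_case("bananas", reading_c=12.6))  # within range
```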

There are now multiple opportunities to reduce waste, improve productivity and increase sales from one single IoT deployment, going far beyond incremental energy cost reduction and using the same existing assets and data used for energy management.

Sustainable digitization strategy

In order to move toward a future-proof, strategic realization, businesses must consider how digitization will address wider organizational outcomes, including the impact on customer experience, sustainability requirements and macroeconomic impacts. Collaboration across multiple teams is key to achieving this, as confidence in the project grows from a stronger business case and a business-wide, long-term commitment to the project.

Placing a focus on using legacy infrastructure provides an opportunity for numerous business wins. Digitization can be achieved quickly, without disruption and at a significantly reduced rate. The risks are reduced and return on investment is delivered rapidly by using proven and scalable technologies, offering the chance to release value that can be reinvested into additional technological advancements. Furthermore, with an outcome-led approach, digitization gains the corporate credibility required to further boost investment and create a robust, consistent and sustainable cross-business strategy.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.


September 5, 2018  12:44 PM

Integrating IoT and ERP data to achieve performance excellence

David Gustovich Profile: David Gustovich
ai, Artificial intelligence, Enterprise IoT, ERP, ERP business intelligence, IIoT, Industrial IoT, Internet of Things, iot, IoT analytics, IoT data, IT convergence

Implementing an IoT program is far from a one-stop purchase. In fact, by some estimates, it can take up to 25 partners to drive a complete IoT customer system.

That stat sheds light on a story defining the IoT space right now — the increasing number of partnerships between information technology (IT) and characteristically operational technology (OT) vendors. Across the board, from GE and Oracle to Rockwell Automation and PTC, global tech giants are partnering with one another, systems integrators and startups to deliver deeply integrated systems that will ease the process of deploying IoT projects.

Why? Because linking IoT and transactional data is the gateway to digital transformation, despite the complexities involved. In a recent IFS survey, only 16% of the 200 manufacturing and contracting executives surveyed said they consume IoT data in their ERP systems. One of the study’s key findings was that enterprise software must help facilitate IoT projects. In fact, it’s doing the opposite, with the majority of those surveyed blaming inflexible, legacy ERP software or challenges in selling the value internally.

Cloud ERP eases IoT integration

In addressing that first challenge, there’s growing recognition that a robust, IoT-enabled ERP system must be cloud-based. Because cloud ERP is built on open standards with open APIs, those with cloud ERP have a significant advantage in integrating data generated by operational technology.

Enabling data access by plant managers and line-of-business leaders alike will unlock the true value of the data generated by things. Enterprise software can automatically operationalize IoT data, issuing work orders when certain equipment conditions are present, scheduling technicians in the field according to their proximity to a call, or adjusting production schedules depending on fault reports and overall equipment effectiveness calculations, according to the IFS report.
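
To make “operationalizing IoT data” concrete, here is a hedged sketch of a rule that turns a fault condition into a work order via a REST call. The endpoint, payload shape and threshold are hypothetical and do not reflect any specific ERP product’s API.

```python
# Hedged sketch: turn a sensor fault condition into an ERP work order.
# The URL, payload fields and threshold are hypothetical placeholders.
import json
import urllib.request

ERP_WORK_ORDER_URL = "https://erp.example.com/api/work-orders"

def maybe_create_work_order(asset_id: str, vibration_mm_s: float):
    """File a high-priority work order when vibration exceeds its limit."""
    if vibration_mm_s <= 7.0:            # rule of thumb; tune per asset
        return None
    payload = json.dumps({
        "asset": asset_id,
        "priority": "high",
        "description": f"Vibration {vibration_mm_s} mm/s exceeds limit",
    }).encode()
    req = urllib.request.Request(
        ERP_WORK_ORDER_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(req)   # the ERP's response, if accepted
```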

IT and OT convergence key to success

Easier-to-deploy technologies, prebuilt integrations, and optimized software and hardware are the keys to deeper IT/OT convergence. But as with most technology projects, it’s not only about the technology. To ensure the integration of software delivers the most value, it’s important to focus on strategies for bringing IT and OT teams together who have typically done their jobs in separate spheres.

The benefits are huge — significant improvements in business and financial performance can be achieved by linking real-time IoT and industrial IoT information to your ERP system.

For instance, traditional ERP systems lack a critical ability: identifying operational anomalies and uncovering their root cause. Real-time IoT and IIoT data can help identify anomalies and outliers in the moment — enabling decisions that have an immediate impact on business processes. For example, if operating efficiency is lower than planned or conversion cost for a specific SKU is higher than targeted, properly combining IoT and IIoT data with ERP data can help identify which specific attribute or element of the process correlates most strongly with the outcomes of interest, i.e., cost, quality and service. This allows targeted improvement initiatives to be launched and minimizes the impact of variation. Less variation means lower operating costs, higher profits and better attainment of performance objectives. Moreover, integrated IoT and ERP can have a positive effect on the end customer as well, from improved service levels thanks to sensor monitoring or predicting equipment performance to seamlessly accounting for outages or repairs in the billing cycle.
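
As a minimal illustration of that correlation analysis, suppose IoT process attributes and ERP cost outcomes have been joined per production run; a simple correlation then points to the attribute that tracks cost most strongly. The data below is synthetic, and a real analysis would need to control for confounders.

```python
# Synthetic sketch: join IoT process attributes with ERP cost outcomes
# per production run, then ask which attribute tracks cost most strongly.
import pandas as pd

runs = pd.DataFrame({
    "line_speed":      [1.00, 1.10, 0.95, 1.20, 1.05, 0.90],  # IoT data
    "oven_temp_c":     [180, 183, 179, 190, 181, 178],        # IoT data
    "conversion_cost": [4.1, 4.5, 4.0, 5.3, 4.3, 3.9],        # ERP, per SKU
})

corr = runs.corr()["conversion_cost"].drop("conversion_cost")
print(corr.sort_values(ascending=False))  # strongest correlate first
```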

When we have more data, we can use technologies like artificial intelligence to analyze, predict and even automate action. Only then can ERP systems move from being systems of record to enablers — predicting and guiding organizations to be more proactive in their pursuit of performance excellence.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.

