Bluetooth has been around for decades. And while its everyday applications in cellphones, media players and other communication devices might come to mind first, there are some unique and complex applications of the technology. Bluetooth, and perhaps its lesser-known sibling, Bluetooth Low Energy (BLE), have seen success when applied to transportation, healthcare and security systems. At Exadel, we’ve been working with Bluetooth on mobile devices for many years and have completed many successful projects, both with standard Bluetooth and with BLE. We’re currently developing an app for a major Fortune 500 high-tech company that pairs a mobile app with wearable hardware for biometrics and health awareness, and we wanted to share some best practices and challenges of developing a BLE app based on this experience.
One of the significant challenges of this project has been to use BLE to transfer data that is:
- Near-real time
- High volume
We are also working with hardware and firmware that is new, which means we’re co-developing with other teams and defining the protocol for data on top of Bluetooth.
Because of the data requirements listed above, optimization counts. Packet overhead must be eliminated or reduced as far as possible while still operating reliably. And while all overhead is waste in some sense, keeping some overhead preserves maintainability, adaptability and testability.
So, the key has been to balance the client’s need for adaptability (i.e., the development team must be able to make changes at a reasonable cost in both time and money) against the client’s need for the throughput to handle high data volumes.
Establishing the initial protocol was relatively straightforward; it needed to be flexible enough to handle the current batch of sensor data, handle changes on this project and handle future needs — possibly sensor types not on the horizon today.
As part of our collaboration with the client, we decided to move from BLE 4.2 to BLE 5 in order to take advantage of some of its new features, chiefly the 2 Mbps data rate and the extended data length capability.
Once the basics of the protocol were established, we needed to make further optimizations to handle all the requirements.
Working side-by-side with client firmware developers, the fine-tuning of the BLE interaction could begin. Together, the two teams worked on optimizing the interaction to handle the data. Optimizations to the interaction included fine-tuning:
- Connection intervals — the length of time between data interactions. The interval needs to be long enough that all data transfers complete without overlapping the next connection event, but not so short that the radio is woken up when unnecessary (causing undue battery drain).
- Maximum transmission unit (MTU) size. The team needed to collaborate to find an MTU that supported passing all the data we needed while maximizing the ratio of payload to overhead. If the MTU is too small, you consume too much bandwidth with overhead and slow your transmission of real data. If your MTU is too large, you will consume more bandwidth on resends, which can also slow transmission.
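To make that trade-off concrete, here is a rough back-of-the-envelope model — a sketch with simplified, assumed overhead figures (a 3-byte ATT notification header and a 4-byte L2CAP header), not a full accounting of link-layer framing — showing how a larger MTU improves the payload share of each transfer:

```python
# Rough payload-efficiency model for choosing an MTU (illustrative only;
# actual BLE overhead depends on the PHY, link-layer framing and the stack).
ATT_HEADER = 3      # assumed ATT notification header bytes (opcode + handle)
L2CAP_HEADER = 4    # assumed L2CAP header bytes

def payload_efficiency(mtu: int) -> float:
    """Fraction of each MTU-sized transfer that is application payload."""
    payload = mtu - ATT_HEADER
    total = mtu + L2CAP_HEADER
    return payload / total

def packets_needed(data_bytes: int, mtu: int) -> int:
    """How many notifications are needed to move data_bytes of sensor data."""
    payload = mtu - ATT_HEADER
    return -(-data_bytes // payload)  # ceiling division

# A larger MTU amortizes header overhead across more payload:
for mtu in (23, 185, 247):
    print(mtu, round(payload_efficiency(mtu), 3), packets_needed(1000, mtu))
```

Under these assumptions, moving 1,000 bytes of sensor data at the default 23-byte MTU takes 50 notifications at roughly 74% payload efficiency; at a 247-byte MTU it takes 5 notifications at over 97%.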
In order to keep the payload-to-overhead ratio as high as possible, the team has been using BLE 5’s 2 Mbps connection for maximum throughput. The team has worked closely together, iterating through changes to all these settings, as well as handling the normal challenges that come along with Bluetooth development: adding new sensor types, revising features, debugging, evaluating data and absorbing hardware updates.
It will be exciting to see this app in production and see how Bluetooth and BLE technologies can continue to expand their significance in a range of industries.
This article was co-written by Travis Bolinger, a senior software engineer at Exadel, based in Boulder, Colo. In his role, Bolinger helps Fortune 500 clients solve some of their toughest digital transformation initiatives with mobile, Bluetooth and other leading-edge technologies. Bolinger has a master’s degree in computer science from the University of Wyoming, bringing a strong technical focus in mobile and research and development.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.
The global demand for IIoT equipment is growing rapidly. The importance of IIoT is also reflected in the resources major technology brands like Microsoft, Amazon and IBM are pouring into enterprise IoT platform development. Analysts calculate the market for industrial sensors for remote monitoring and control of everything from factory systems to goods tracking and office heating and lighting will be worth $21.6 billion worldwide by 2023.
Last year, a milestone was reached as the number of IoT-connected systems surpassed mobile devices for the first time. Before long, the installed base of smart industrial machines will exceed the number of remote workers. However, advances in IIoT technology are fast outpacing security-by-design standards. While enterprises spend billions of dollars giving employees usernames and passwords to keep their networks safe, not enough is done to protect machine identities.
Currently, there are no recognized industry standards for IIoT device manufacturers to follow. In fact, many device makers don’t believe it is worth their while to build in a high level of security. A McKinsey & Co. and GSA survey found that just 15% of smart equipment manufacturers thought customers would be willing to pay higher prices for more built-in security.
This means customers must assume responsibility for protecting their own smart systems. The first priority must be to secure the identity of each machine. Establishing an assured identity is essential for trusted data communications between remote IIoT devices, mobiles, cloud-based apps and centralized management points.
In a 2018 Forrester and Venafi study, 80% of IT decision-makers confessed to struggling with the issue of machine identity protection. While the global identity and access management market is worth over $8 billion, the bulk of it is focused on human identity protection. Unfortunately, enterprises spend nearly nothing to protect the keys and certificates that machines use to identify and authenticate themselves.
Cybercriminals know this. To make matters worse, they can even buy a digital persona on the dark web for about $1,200 that allows them to impersonate another device. In other words, cybercriminals can hide in plain sight.
Authentication with certificates
For effective management and protection of machine identities, organizations need detailed insight into all machine identities across their networks. Most enterprises already have strong, detailed authentication processes like Active Directory Certificate Services built into their networks.
Certificates are used in place of passwords to authenticate trusted connections between multiple network endpoints, be they on-premises systems, mobile workers or remote cloud-based servers. It makes sense to expand the scope of certificate services to include authentication of IIoT systems.
Put simply, a certificate is an assurance of identity and authorization: the holder proves possession of a secret private key, which anyone can validate with the corresponding known public key. Unlike passwords or other methods based on shared secrets, the private key is never transmitted, which makes a certificate far harder for an impostor to steal or otherwise maliciously appropriate.
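A toy sketch of that idea (textbook RSA with deliberately tiny, insecure numbers — real certificates use X.509 with 2048-bit RSA or ECDSA keys and a proper padded signature scheme) shows the asymmetry: only the private-key holder can produce a valid signature, but anyone with the public key can check it.

```python
# Toy illustration of asymmetric authentication. NOT real cryptography:
# the key below is a textbook example and offers no security whatsoever.
import hashlib

# Tiny RSA key: n = 61 * 53 = 3233, with e * d ≡ 1 (mod 3120)
N, E, D = 3233, 17, 2753

def sign(message: bytes, d: int = D) -> int:
    """Only the private-key holder (d) can produce this value."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(digest, d, N)

def verify(message: bytes, signature: int, e: int = E) -> bool:
    """Anyone holding the public key (e, N) can check the signature."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(signature, e, N) == digest

# Challenge-response: a device proves its identity by signing a nonce.
challenge = b"device-42:nonce-8271"
sig = sign(challenge)
assert verify(challenge, sig)                 # genuine device passes
assert not verify(challenge, (sig + 1) % N)   # forged signature fails
```

In a real deployment the public key is bound to the device’s identity by a certificate authority’s own signature, forming the chain of trust the article describes.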
Secure industrial processes
To securely monitor and manage the data communications of authenticated remote IIoT devices, implementing professional, enterprise-grade virtual private networks (VPNs) is critical. Modern VPN software gives IT administrators the ability to remotely manage IIoT security elements such as privacy and authentication in real time and at scale.
A VPN can help protect the IP connection of every IIoT machine by encrypting all digital communications passing over the internet between innumerable devices and the remote administration center. Encrypted connections allow smart systems to send data over the web while being shielded from any outside third parties who might wish to monitor these online activities.
As analysts forecast compound annual growth rates of more than 15% through 2022, enterprises have a responsibility to put proper measures in place to sufficiently authenticate remote IIoT systems. In combination with remote access controls and certificate-based authentication, VPNs can provide robust protection against cyberthreats and criminal activity.
Product development is an incredibly demanding, exacting process. Not only are there numerous opportunities for errors to crop up along the way, but years can pass before the first product is even introduced to the market. That process can be even more complex when you are introducing connectivity features to your product.
In the connected home and building market, continued growth represents an important opportunity for OEMs that develop devices, appliances, security and lighting to differentiate their products. But, with the expanding number of devices in the market, it will take domain expertise to help stand out.
The good news is that there is an easy way to minimize the inconvenience and potential missteps of this development process and ultimately further the success of your IoT products and services: identify technology partners.
Enlisting expert help can be a critical way to solve challenges within a solution ecosystem. In fact, nearly all respondents to a recent survey, “Connected Home and Building Technology Trends,” identified partners as an important piece of their strategy. What’s more, companies consider technology partners with experience in data management to offer the highest potential value. As many as 60% of the survey respondents emphasized that partners with manufacturing expertise in connected devices are key to achieving their goals.
A diverse set of technology partners can greatly benefit global companies. So, what are the critical questions you should ask yourself in identifying the right partner for your IoT products and services?
What is the partner’s domain of expertise?
The options for technology partners are as vast as the number of connected systems available. And the success of your company and your collaboration depends on thorough research and a clear understanding of your potential partners’ domain of expertise, at both the functional and market sub-segment level.
That research shouldn’t rely solely on what the company shares with you. It’s important for you to explore online reviews if possible and arrange conversations with some of their current customers. Equally critical is to review the latest coverage of the company in the media. That alone can speak volumes about the authenticity of the company. You need a company that can deliver on its promises, so your search needs to be targeted yet thorough.
What is the partner’s global reach?
Whether or not your company’s current plans involve international expansion, it is crucial to be prepared. A global technology partner typically makes the most sense, as you can start locally and then scale as needed. You wouldn’t want to spend all this time identifying a partner only to find it can’t accommodate your growing geographical needs down the road.
At the same time, a global company can also bring a more diverse workforce and broader perspectives into their work with you. This, in turn, can help to increase the effectiveness of your organization.
How compatible are your cultures?
In your search for technology partners, you will come across numerous companies that appear to have adequate domain expertise. However, what will truly set them apart will be taking a close look at their company culture and their values, and deciding if those are compatible with yours.
For example, if your company prioritizes flexibility and rapid response, ensure that you examine that in your potential relationship with a partner. After all, the cultural fit can be the guiding light for your business relationship with a partner.
The right partner, now — and in the future
As the connected home and building landscape continues to evolve, new opportunities for technology partners will arise for your business. Going through a thoughtful, thorough process of identifying the right partner is critical for any enterprise that wants to increase its capabilities and develop its resources with ease and efficiency.
After decades of lagging digitization, the construction industry has entered a new digital era, and IoT is playing an integral role in the transformation. The heightened demand for new construction is helping drive this interest, as firms look for ways to do more with less in an industry that is challenged by worker shortages, tight margins and traditionally low productivity rates. Against this backdrop, contractors have turned to new digital technologies to help improve operations and performance.
Challenges of the traditional jobsite
While contractors have been making strides in evaluating and implementing IoT technologies over the past few years, it will take some time for them to become mainstream in an industry that has been doing things the same way for decades. Even today, it’s not unusual to see supervisors walking around the site to locate workers or equipment, using air horns to signal an emergency or relying on paper logs and other point methods to document and share important project information. These processes are inefficient, ineffective and hinder a contractor’s ability to aggregate and use project data, which becomes compounded across the firm’s project portfolio.
One of the reasons manual processes persist is that, until recently, it has been difficult to develop platform systems that can overcome the rigors of the challenging, constantly changing work site. In addition to contending with remote locations, heavy building materials and the scarcity of power, traditional IT networks have had difficulty meeting the scale and flexibility requirements of an active construction site, which may require connection to thousands of devices and the need for long-range communication. As a result, technology development has largely focused on specific functions (e.g., payroll, estimating) or specific assets (e.g., equipment, tools), which only connect parts of a project.
IoT changes the game
A new type of communication platform technology was needed to overcome these barriers. Fortunately, record outside investment in construction tech startups has fueled innovation, and a group of forward-thinking contractors and solutions providers have been working together to develop robust systems that are transforming the way the industry approaches, manages and executes projects.
By focusing on developing the network, or communication platform, that harnesses and sends data from all the resources on site using IoT devices, emerging technologies are filling an unmet need and providing critical visibility into construction processes and projects. With the ability to handle hundreds or thousands of IoT devices at high densities, these IoT offerings are helping contractors keep track of — and better manage — the many moving parts on a jobsite: people, equipment, tools, information, safety incidents and more. Combined with intelligent cloud software, this unprecedented visibility and real-time data insights are available to stakeholders off site as well.
Wearable devices, equipment sensors and more are replacing anecdotes and assumptions with objective data, showing how workers and machines interact in real time and helping identify and eliminate waste. This is important since the average construction worker spends about 20% of her time waiting for materials, equipment or information, according to a study by the Department of Construction Science and Management at Clemson University. By gathering information from multiple sensors, contractors can see how hoist elevators are being used on site — how long they’re being used and how long workers are waiting for them — so firms can minimize downtime by better managing resources or increasing availability by installing more elevators.
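As a simple illustration of that kind of analysis (the event format and numbers here are invented for the sketch), averaging request-to-boarding times from hoist sensor events pinpoints where capacity falls short:

```python
# Illustrative aggregation of hypothetical hoist-usage sensor events:
# each event records when a worker requested the hoist and when they boarded.
from datetime import datetime

events = [  # (requested_at, boarded_at) — made-up sample data
    ("08:00", "08:06"),
    ("08:10", "08:12"),
    ("08:30", "08:41"),
]

def wait_minutes(requested: str, boarded: str) -> float:
    """Minutes a worker spent waiting for the hoist."""
    fmt = "%H:%M"
    delta = datetime.strptime(boarded, fmt) - datetime.strptime(requested, fmt)
    return delta.total_seconds() / 60

waits = [wait_minutes(r, b) for r, b in events]
avg_wait = sum(waits) / len(waits)
print(f"average hoist wait: {avg_wait:.1f} min")  # flags where to add capacity
```

A rising average wait is the kind of objective signal that tells a contractor to reschedule deliveries or install another elevator, rather than relying on anecdote.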
IoT is also having a big impact in making jobsites safer by detecting and documenting worker falls and other site hazards and helping to quantify safety behaviors on site. By identifying trends and providing real-time insights, technology is helping turn safety and risk management from a historical look back and lagging indicators exercise to one focused on real-time behavioral modification, rapid response and predicting — and in time, preventing — safety and risk-related issues.
Taking IoT to the next level
Thanks to new IoT platforms, construction is entering the age of digital transformation, which has the potential to radically change the way the industry operates today. By providing critical visibility into what’s happening throughout and across jobsites, contractors can create safer workplaces and better use resources to keep projects on track and on budget. IoT technologies are already impacting contractors and project teams on site, helping them manage and respond to situations in real time. But the best is yet to come. As technologies and services evolve, more data is collected and more resources are put in place to support it, and contractors discover new ways to solve business problems, the value of IoT platforms will only increase.
In February 2018, my family of four — wife, 6-year-old boy and 3-year-old girl — was thrust quite suddenly into the world of Type 1 diabetes. We spent about five days in the hospital trying to stabilize my daughter and learning a whole lot about how to manage the life-threatening disease that would be with her forever.
It is tough to articulate, without tearing up, the intense moments we lived through in those early days as we administered finger pricks and multiple daily insulin injections to a crying toddler. We still live with it; we’re in the early days of a long journey. It is a tough one at that, but we’re much appreciative of all of the wonderful advances that have been made, both medically and technologically.
Type 1 diabetes: managing a life-threatening disease
The challenge of managing Type 1 diabetes lies in the real-time nature of daily monitoring — constantly keeping tabs on my daughter’s food intake and blood sugar levels to decide whether to act. In fact, that’s what a normal pancreas — and the rest of your body — is doing right now: constantly tweaking its biochemistry in order to keep things, in this case blood sugars, in a normal, healthy range. Throw whatever you want at it — an In-N-Out Double-Double, a night out eating pizza and drinking beer, a big hangover brunch at the neighborhood diner — and your pancreas handles it with aplomb, quickly determining exactly how much of its carb-processing power — insulin — to dispense in order to convert the food into energy your body can use. Charting blood sugar over time, even for the craziest of meals, looks like a perfect bell curve with a relatively small peak.
Doing the same for a person with Type 1 diabetes would yield quite a different experience. Since their pancreas cannot produce insulin, the sugars from their meals would just continue to pile up in the bloodstream — this is what creates both short-term emergency risk as well as long-term health deterioration. A chart of their blood sugars gives just a glimpse of the chaos that they deal with on a daily basis, especially without proper, attentive management.
What I set out to solve
It is against this chaos that my wife and I fight each and every minute of the day. That said, life for diabetics has changed tremendously, for the positive, with the advent of technological advancements like the insulin pump and the continuous glucose monitor (CGM). The latter is life-altering for sure; the CGM is the thing that allows my wife and me to see our daughter’s blood sugar levels at any moment in time, even when we’re at work or otherwise far away. The device is firmly planted on my daughter’s belly and has a fine filament inserted into her tissue. Every five minutes, the CGM transmitter sends the current number to a phone nearby, which further sends the data to the cloud. That’s right — our family is a walking IoT use case!
However, there’s a challenge.
The data never gets to us as fast as we would like and, perhaps as importantly, never exactly where we need it to be. For the former, as a parent, I have unrealistic expectations. It can never be fast enough, although in reality, the downstream recipients of said data — me, my wife and other caregivers — can’t and shouldn’t act immediately. But still, I want it fast! On the delivery side, I can’t be picking up my phone constantly. Or staring at a webpage somewhere. Or casually glancing at my smartwatch. I need that singular value, my daughter’s most recent blood glucose level, accessible via many channels, from phone to smartwatch to iPad to computer to whatever other devices I want to connect.
The maker of our CGM, Dexcom Inc., has done a great job with its product, and it has gotten better with each release. The current version does not require a finger-prick calibration, which means less poking of our 3-year-old child. But despite the progress, and its basic tracking applications and historical reporting, Dexcom hasn’t invested heavily in the instantaneous delivery of blood glucose readings to various types of apps and interfaces.
To deal with this limitation, I set out to solve faster delivery to any number of devices and interfaces.
Luckily for me, I haven’t been alone in this — Type 1 diabetes has a robust, advanced do-it-yourself community. Makers and developers around the world have decoded Bluetooth transmissions to create completely new management applications. They have reverse-engineered communication protocols for insulin pumps and have created advanced algorithms, resulting in the first really usable artificial pancreas implementations. And they have created great mostly real-time visualization applications so parents and others can follow their patients and know what the blood glucose numbers are.
The (poor) programmer inside me cried a bit when I stumbled across the great efforts of several major players in the Type 1 diabetes DIY world — the Nightscout Foundation, OpenAPS and Loop. What could I tinker on, placating my technical curiosity, all the while making our day-to-day disease management a bit easier? The problem had been solved already!
As I dug deeper, though, I saw opportunities. And after I joined real-time API platform leader PubNub later in the year, things started to click. I needed everyone involved in my daughter’s care to have the same view of the data at the same moment in time. And I needed to recognize that not everyone was going to have the same application up and open at the same time. I needed the critical data — for myself and for others — as soon as it streamed off the CGM, and visible to us all, whether in real-time charts, SMS, push notifications or Slack messages. There was a real-time aspect of the problem space that even some of the DIYers were missing or had codified in closed applications.
Working with a real-time application platform allows you to think much differently about solution architectures. Integrated fast messaging, serverless extension and integration points, inline security and multiples of distribution models and user interfaces all allowed me to focus on the “business” problem that I was trying to solve:
- Where was data being sent from?
- To whom should it go and by which method?
- Should I augment the data with, in this case, additional statistical aggregations?
- Should I store it in a database, as well, for longer historical views?
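A minimal in-memory sketch of that publish/subscribe fan-out (the channel name and handlers are hypothetical; a real-time platform adds actual delivery, persistence and security) shows how one CGM reading reaches every interface at once:

```python
# Minimal in-memory publish/subscribe fan-out, mimicking the pattern a
# real-time platform provides: one CGM reading, many delivery channels.
from collections import defaultdict
from typing import Callable

subscribers: dict = defaultdict(list)

def subscribe(channel: str, handler: Callable) -> None:
    """Register a handler (chart, SMS, push, ...) on a named channel."""
    subscribers[channel].append(handler)

def publish(channel: str, message: dict) -> None:
    """Deliver one message to every subscriber on the channel."""
    for handler in subscribers[channel]:
        handler(message)  # every subscriber sees the same reading

delivered = []
subscribe("glucose", lambda m: delivered.append(("chart", m["mg_dl"])))
subscribe("glucose", lambda m: delivered.append(("sms", m["mg_dl"])))
subscribe("glucose", lambda m: delivered.append(("push", m["mg_dl"])))

# One reading off the CGM fans out to all interfaces simultaneously.
publish("glucose", {"mg_dl": 112, "trend": "flat"})
print(delivered)
```

The design choice here is decoupling: the CGM side publishes once and never needs to know how many caregivers, charts or notification channels are listening.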
I spent way less time thinking about servers, API frameworks, logging, monitoring and so forth, and more time answering the questions that inevitably made the day-to-day management of my daughter’s Type 1 diabetes that much easier.
What we’ll cover in this series
Now that I’ve introduced you to my project and what I set out to solve, I’ll dive into the details of it all in subsequent parts. I’ll discuss the core components of the project — the hardware, the IoT network and communication — and the end-user applications. Keep an eye out for part two, coming soon!
The industrial internet of things and digitized smart factories have outsized potential to meaningfully and permanently improve the efficiencies and capabilities of countless operations. But despite this hype, IIoT adoption and progress has lagged — and I believe this is due to a series of misconceptions that keep IIoT-benefitting companies and developers from pursuing and implementing the technology (or doing so in the most effective ways).
IIoT does demand completely new requirements for real-time data management and analysis. That said, when properly understood and approached correctly, these requirements really shouldn’t be the obstacles to adoption that they are still too often made out to be.
Here are four persistent misconceptions about IIoT, along with more accurate interpretations of the challenges businesses in this space do encounter.
Misconception #1: IIoT doesn’t require unique or new database needs
It very much does. IoT analysts, including those at Gartner, point out that IoT poses entirely new challenges in terms of data volume, data and query complexity, and integration. In reality, the difference between a machine working in standalone mode and one within a networked IIoT remote monitoring system is stark. Unfortunately, organizations are making the mistake of trying to implement an IIoT infrastructure on existing, traditional databases such as Microsoft SQL Server and Oracle. These databases are — in addition to being expensive — usually technically incapable of meeting the increased requirements created by the massive data volume that must be processed for IIoT success. While traditional SQL databases are easy to use, they are not built to query machine data streams in real time.
Misconception #2: IIoT implementations require a NoSQL database
Even database experts often assume — erroneously — that high volumes of unstructured data amount to an inevitable NoSQL use case. It is true that NoSQL databases are particularly well-suited to support complex and flexible queries, thanks to their efficient scaling and distributed architectures. However, the infrastructures of NoSQL databases are often very complicated, requiring a great deal of attention paid to their planning, operation and administration. At the same time, in industrial practice there is almost always relational data that must also be stored, including topologies, firmware information, and ERP or article data. Using relational and non-relational databases means that two different systems must be run in parallel and synchronized. Another challenge is that there is no standardized query language for NoSQL databases; each has its own. To use NoSQL databases like Apache Cassandra, Elasticsearch or MongoDB also means enlisting specialized and experienced programmers, who are expensive — if you can find them at all. An alternative that avoids these challenges is to replace pure NoSQL databases with newer, more advanced SQL-based systems that combine the familiarity of ANSI SQL with the scalability and flexibility of NoSQL.
Misconception #3: Time-series databases are the answer
Specialized time-series databases are always in fashion. However, it remains a common mistake to choose a time-series database as the basis for an IIoT platform. These databases are often limited in both their functionality and their scalability with intensive parallel usage. In addition to the visualization of data streams, IIoT necessitates support for frequent analysis operations and data model changes. For instance, these processes may be used to properly diagnose and understand the causes of abnormalities within factory production. An IIoT database must also allow for interactive work with real-time data, including simultaneous reading, writing and execution of ad hoc queries, for use cases such as machine learning, under heavy load.
Additionally, the need for agile processes requires that an IIoT database be able to adapt or extend data schemas at runtime. This means that raw sensor data, ERP data, quality data and so forth are combined to examine production anomalies. For example, anomalies may be associated with certain jobs or traced to specific raw materials from certain suppliers.
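A small sketch of that combination (using Python’s built-in sqlite3 purely for illustration; the table and column names are invented) shows relational supplier data living next to sensor readings in one SQL store, so an ad hoc join can tie anomalies back to a raw material:

```python
# Sketch: relational context (suppliers) alongside sensor readings in one
# SQL store, queried with an ad hoc join. Names are invented for this example.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE suppliers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE readings (
        ts INTEGER, machine TEXT, temp_c REAL, supplier_id INTEGER
    );
""")
db.executemany("INSERT INTO suppliers VALUES (?, ?)",
               [(1, "Acme Alloys"), (2, "Borealis Steel")])
db.executemany("INSERT INTO readings VALUES (?, ?, ?, ?)", [
    (1, "press-1", 71.2, 1),
    (2, "press-1", 98.7, 2),   # anomalously hot batch
    (3, "press-1", 72.0, 1),
])

# Which supplier's material coincides with out-of-range temperatures?
rows = db.execute("""
    SELECT s.name, COUNT(*) FROM readings r
    JOIN suppliers s ON s.id = r.supplier_id
    WHERE r.temp_c > 90 GROUP BY s.name
""").fetchall()
print(rows)
```

At IIoT volumes a production system would need far more scale than this sketch implies, but the point stands: keeping both kinds of data in one SQL-queryable system avoids the parallel-database synchronization problem.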
Data model changes of this nature often require teams to completely rebuild their time-series databases, which is (somewhat ironically) time-consuming, not to mention extremely costly. To solve this, many enterprises will use a time-series database alongside a separate relational database handling non-time series data. While this solution is quick to implement, growth will rapidly make the database expensive and increase the difficulty of keeping all the data in those different databases in sync.
Misconception #4: AI can only be achieved with better, cleaner data than you have
IIoT developers sometimes assume they lack the data or data hygiene to set up successful AI systems. And it may be the case that inadequate data leads to poor AI-controlled automation.
However, the fear that inadequate data automatically means no useful results can be obtained, or that wrong decisions will be made, is simply unfounded. In practice, most companies pursuing IIoT will build a real-time data store to optimize — not replace — human decision-making with AI technologies and machine learning.
A practical approach here is to monitor analysis results and then gradually and automatically clean up the data as it goes through your process. Trying to completely clean all historical data — and thus delaying development and implementation of intelligent IIoT systems until your data reaches perfection — will backfire by leaving your AI systems with a quantity and depth of data that is too small to move forward properly. It’s usually better to simply get started, collect raw data and develop the use cases along the way.
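A minimal sketch of that clean-as-you-go approach (the thresholds and sample data are hypothetical): validate each reading as it arrives, quarantine the implausible ones for later review and keep the pipeline moving:

```python
# Sketch of incremental "clean as you go" validation: plausible readings
# pass through; out-of-range or missing values are quarantined for review
# rather than halting the pipeline. Thresholds are hypothetical.
def clean_batch(readings, low=0.0, high=150.0):
    """Split a batch of raw sensor values into clean and quarantined lists."""
    clean, quarantined = [], []
    for r in readings:
        if r is not None and low <= r <= high:
            clean.append(r)
        else:
            quarantined.append(r)  # keep for root-cause review, don't discard
    return clean, quarantined

raw = [71.3, None, 9999.0, 68.4, -3.2, 70.1]
clean, bad = clean_batch(raw)
print(clean, bad)
```

Each processing cycle improves data hygiene a little, so AI models keep receiving a growing stream of usable data instead of waiting on a (never-finished) historical scrub.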
Developing an accurate perspective
Organizations in a position to benefit from IIoT should understand that implementing this technology will require wholly new data management and analysis capabilities. Pipelines of sensor data — delivering thousands or even hundreds of thousands of readings per minute in dozens of message formats — must be integrated and analyzed in real time in order to properly monitor, predict and control the behavior of the things in the system. Fast acquisition and analysis of machine data is a prerequisite, while data-driven automation is key to the success of a future-proof IIoT project. IIoT-empowered facilities require data management systems able to:
- Ensure rapid development and time-to-value;
- Enable real-time data analysis;
- Maintain consistent uptime; and
- Ensure low IT operating costs for hosting, integration and administration.
The means to implement the IIoT applications and smart factories of the future are available to businesses today, if they are able to recognize and transcend the above-mentioned misunderstandings.
In the age of digital transformation, all executives should be asking themselves, “How will our business be disrupted?” Some businesses are taking a wait-and-see approach, often because they aren’t sure how digital technologies like IoT apply to them. This conservative approach is understandable because in the Industrial Revolution era, innovation was often built on product-based business models and incremental improvement: adding new features and capabilities to products.
And in the past, value creation (profit) was a function of economies of industrial scale: mass production and the high efficiency of repeatable tasks. This old business model banks on customer loyalty and on customers being compelled to buy more products. The problem with the old model is that products are quickly being commoditized. In the new digital era, incremental innovation falls short and price becomes the competitive advantage, leaving unprepared companies to play catch up as they attempt to react to digital transformation emerging in their industry.
Today, the most innovative companies realize that to stay competitive they must shift mindsets by looking at entirely new business and revenue models with connected devices and product lines. This means reimagining, reinventing and evolving how their businesses operate to develop new value creation. Leaders must innovate revenue models by integrating exponentially advancing digital technology to drive new revenue streams.
And, to ensure value creation with connected devices, IoT leaders need to make holistic and integrated customer service a top priority at every point along the customer lifecycle — all the way from pre- to post-purchase. This positions customer service as a key differentiator that can drive enhanced revenue and profitability. In fact, with over 20 billion connected things expected by 2020, connected products can not only provide proactive versus reactive service, but also provide some of the best opportunities to create new revenue streams and business models.
To do this, business leaders must clearly understand the impact of integrating IoT data with connected devices and customer service applications. Value creation, by combining and incorporating new technologies into customer service, allows for the development of new business models and revenue streams.
In a connected world, products can add new revenue streams after the initial product sale, including value-added services, subscriptions and apps, which can easily exceed the initial purchase price. The result? Shifting the profit model from simply selling more products to the same customer to enabling a recurring revenue stream with existing customers can be much more lucrative. Businesses can not only vastly increase their size, but also stop competitors in their tracks.
Take Samson Rope, for example. This 140-year-old company supplies rope for marine, mining, forestry and rescue. It is implementing a high-tech rope threaded with IoT sensors to monitor its condition to know when it needs replacing. Samson services 8,000 lines of rope throughout the life of the product. And according to the company’s director of IT, Dean Haverstraw, combining IoT and field service capabilities will help the company identify when maintenance is needed, creating entirely new revenue streams for their business.
This type of proactive service enables brands to deliver enhanced trust and loyalty with customers. How might this work? Consider a product which sells 300,000 units per year at $1,000 each. At just 1% per month of service revenue, this product could generate $36 million of high-margin, recurring revenue. Integrating IoT and field service with customer service provides a clear path to move out of the old business model paradigm of being a cost-center to a profit-center.
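The arithmetic behind that $36 million figure is worth making explicit. Using the numbers from the example above:

```python
# Worked version of the service-revenue example from the text.
units_per_year = 300_000
unit_price = 1_000            # dollars per unit
monthly_service_rate = 0.01   # 1% of unit price, charged monthly

# Each unit yields $10/month in service revenue, or $120/year.
service_per_unit_per_year = unit_price * monthly_service_rate * 12
annual_service_revenue = units_per_year * service_per_unit_per_year
```

At those numbers, service revenue alone adds 12% of annual product revenue, and unlike the one-time sale, it recurs every year the units remain in service.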
The takeaway? It’s not a matter of if, but when your industry and business model will be disrupted by new, service-oriented, recurring digital technology-based business models. The next question to ask yourself is, “Will you sit back and be disrupted or be the disruptor in your industry?”
Supply chain automation has been the primary focus of many companies for well over a decade. While the production line, storage protocols and loading procedures have advanced at lightning speeds, the decision-making process has remained more or less the same.
A small number of issues are identified that can be handled by preset procedures at the ground level. Anything that happens outside this set is passed up the chain of command to be analyzed and dealt with. The time lost in conveying this information, even with IoT devices in place, waiting for a response and then acting on that response, eats up considerable chunks of otherwise productive time.
Logistics is a critical supply chain component of the manufacturing industry. Road transport in India constitutes the bulk of the logistics chain. Its optimum and efficient utilization governs supply chain profit margins for many companies. On-time delivery by vehicles also helps in customer satisfaction and retention. To control and improve the utilization of these mobile assets, it’s imperative to monitor and manage their movement and performance.
The reports linked to GPS-based location tracking are useful for post-mortem analysis of delivery performance; however, the data flows in only after the mistake or issue has already taken place. Developing action points after the fact is reactive and detrimental to business in the long run.
Why does this happen despite the readily available technology and mountains of processed data at your disposal? There are several reasons.
First, the lack of coherent data. Several parameters are monitored separately under separate reports. It’s difficult to relate them to each other to arrive at a concrete conclusion. Normally, individual truck data is available, but cumulative data clustered at a logical hierarchical level is not.
Second, the absence of advanced analytical tools. There is a limit to the manual analysis that can be performed on a spreadsheet.
Third, the presence of functional silos with conflicting objectives in the same organization. For example, the dispatch team focuses on volumes, but the safety team focuses on incident-free delivery. Add to these the challenge of maintaining multiple service providers across different functions and a basic understanding of the problem starts to form.
A control tower addresses all these problems.
Imagine all your data points being captured coherently and analyzed against your entire database. You can determine the best transporter in a plant or the worst-performing plant in a region without having to muddle through piles of reports. Imagine advanced analytics providing you with live, definite and measurable action points. The system will identify actions that can be taken to rectify mistakes on the fly. While doing this, it is keeping an eye on every other aspect it’s plugged into and making sure they clock over nicely. The centralized nature of this system means that all relevant stakeholders will be using a single tool to measure and meet their key performance indicators while keeping the focus on the overall organizational objective.
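A toy version of that hierarchical roll-up (truck deliveries grouped by region, plant and transporter; all names and data invented) might look like this:

```python
# Hypothetical sketch: roll per-truck delivery data up a hierarchy so a
# question like "which transporter performs best at a plant?" is one lookup,
# not a trawl through separate per-truck reports.
from collections import defaultdict

deliveries = [
    # (region, plant, transporter, delivered_on_time)
    ("north", "plant1", "acme", True),
    ("north", "plant1", "acme", True),
    ("north", "plant1", "zeta", False),
    ("north", "plant2", "zeta", True),
]

def on_time_rates(rows):
    """Return {(region, plant, transporter): on-time fraction}."""
    totals = defaultdict(lambda: [0, 0])  # key -> [on_time_count, total]
    for region, plant, transporter, ok in rows:
        bucket = totals[(region, plant, transporter)]
        bucket[0] += int(ok)
        bucket[1] += 1
    return {k: hits / n for k, (hits, n) in totals.items()}

rates = on_time_rates(deliveries)
best_at_plant1 = max(
    (k for k in rates if k[:2] == ("north", "plant1")), key=rates.get)
```

The same aggregation generalizes upward: summing the buckets over transporters gives plant-level performance, and over plants gives region-level performance, which is exactly the "cumulative data clustered at a logical hierarchical level" that separate per-truck reports cannot provide.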
The control tower relays automatic warnings to the relevant process or task owner before a performance parameter is breached. The suggested action helps stem the issue before it gets out of hand. Measurements of user response time, effect of the response and timeframe to implement are used to further enhance the system. Ultimately, this helps improve performance, set benchmarks and fine-tune your logistics chain into a well-oiled machine.
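The early-warning behavior amounts to a two-threshold check: a "warn" level fires before the hard limit is reached, giving the task owner time to act. The parameter, thresholds and values below are illustrative:

```python
# Illustrative two-threshold alerting: a "warn" level fires before the hard
# "breach" level, so the owner can act before the parameter is breached.
# Parameter names and threshold values are invented.

def check(value, warn_at, breach_at):
    """Classify a performance reading against warn/breach thresholds."""
    if value >= breach_at:
        return "breach"
    if value >= warn_at:
        return "warn"   # early warning: relay to the process owner now
    return "ok"

# e.g. transit delay in minutes: warn at 30, hard limit at 60
statuses = [check(v, warn_at=30, breach_at=60) for v in (12, 41, 75)]
```

Logging when each warning fired and when the owner responded gives exactly the response-time and effect measurements the text describes for tuning the system.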
Not too long ago, IoT meant something completely different than it does today. Machine-to-machine used to be the preferred vernacular to describe connected devices, which relied on modest cellular connectivity to maintain equipment and ensure proper functionality. Fast-forward to 2019, and now IoT is starting to play an ever-greater role across U.S. industries. With use cases growing in every vertical, we are entering an era of “massive IoT,” which is ideologically similar to the IoT that preceded it, but requires much more wireless capacity.
Consider the popularity of smart cities and the amount of connectivity required just to monitor vehicular traffic. In order to obtain an accurate understanding of traffic patterns and points of congestion, large volumes of data will have to be communicated back to a source frequently throughout the day. IoT use cases including the aforementioned example will require support from both 5G and LTE networks to satisfy requirements of a massive IoT world.
Achieving the needed capacity is easier said than done, and requires improvements in both in-building and outdoor networks. Many in the telecom industry believe that large-scale deployments of compact small cell radio access points in the regions lacking coverage will support the necessary growth of IoT and drive 5G implementation across the United States. While it’s true that small cells should have a substantial impact on 5G networks, as well as LTE, there will be other wireless hardware involved in building smart city infrastructure.
Smart infrastructure paves the way to smart city future
A decade ago, deploying extensive wireless connectivity systems into transit infrastructure wasn’t as common. Now, it’s expected that any new venue built in major cities, such as Los Angeles, Seattle or New York, will have strong cellular connectivity support. For example, a new double-decker tunnel running through downtown Seattle, the SR-99 tunnel, is being coined “the smartest tunnel ever built” and includes a high-powered distributed antenna system (DAS) and remote radio units to provide incredible connectivity for all passengers and drivers across the two-mile stretch. It’s the development of these kinds of projects which will greatly assist the proliferation of IoT across an entire smart city. While 5G networks are likely to provide the capacity that future IoT deployments will need, 4G/LTE networks should also provide robust blanket connectivity for those use cases.
Massive IoT support will initially comprise a mix of network architectures
Many telecom industry analysts and key players see small cell deployments as a key factor in 5G growth. Small cells are miniature (comparatively speaking) radio access points with antennas that transmit wireless data. Their size allows them to attach to poles every few blocks, instead of every few miles as with cellular towers, which is an amazing asset for boosting connectivity in the dead zones that hinder blanket connectivity around the United States and, in turn, supports IoT and smart city development. Parts of Asia, notably South Korea, are more advanced in their rollouts of small cells, laying the blueprint for what the United States is beginning to implement.
However, there is a notable difference in places like South Korea and the United States that could affect how the U.S. leans on small cells for every application. South Korea has three main mobile carriers — Olleh, SK Telecom and LG U+ — that use considerably fewer bands to power their networks over a smaller area. In contrast, the U.S. has four main telecom providers that must provide capacity over much larger distances. Both scale and small cells’ current limitations create a unique challenge.
Currently, small cells can work with one band and one carrier. As a result, small cell deployments that want multi-carrier, multi-band support can become very expensive. They will predominantly be secured to poles, which can quickly become congested when networks require a system that supports the four major carriers and different bands for LTE and 5G. Tormod Larsen, CTO at ExteNet Systems, echoed this point in a recent article. At least initially, massive IoT will be built on a collection of different network setups, such as hybrid DAS, using a combination of active DAS, repeaters, small cells and even Wi-Fi to cost-effectively build networks capable of meeting diverse needs including indoor networks of all sizes. DAS has the advantage of supporting all highly used frequency bands and carriers in a single unit. While DAS is mostly for in-building connectivity solutions, outdoor DAS can be a preferred multi-carrier option over small cells to improve connectivity in preparation for massive IoT and smart cities.
With all four major carriers vying for the most impactful initial 5G deployments and further densification to support burgeoning IoT use cases, expect plenty of creative uses of all wireless connectivity hardware to support their ambitions. Between upgrading existing networks and venues coupled with adding substantial connectivity systems to new infrastructure projects, it will create a gateway to a smart city future.
We’ve all heard how the internet of things is taking over the world, but what has its impact been on software development? It means today’s product expectations are higher than ever. Users want products that are feature-rich, can be accessed remotely, are easy to upgrade and offer solid security. Take the medical device industry, for instance. There is a proliferation of wearable devices available now that help us monitor and understand patient behavior. Making sure these devices and the data being collected from them are secure can literally be a life-or-death task. This means software projects are becoming increasingly complex and require more expertise than ever. For some companies, that has led to outsourcing either some or all their software development.
So, what are some reasons you might outsource your software development? As programming becomes more complicated, it requires specialized skills. You may not have in-house expertise and it can be time-consuming, difficult or expensive to hire. Most companies tend to specialize in either cloud/mobile applications or embedded software. You might have embedded developers, but are working on a mobile application and they lack the skills necessary to execute the project. Or maybe you do have qualified developers in-house, but they are tied up working on other projects. Outsourcing is a great way to get the technical strength you need when you need it without adding to your headcount.
How do you find ‘the one?’
Once you’ve made the decision to outsource, how do you decide which software development company to choose? It’s important to remember that the quality of your software ultimately depends on the provider you hire. The first step is to determine exactly what you need and ensure any providers you consider have the applicable engineering expertise versus general software development experience. It takes a different skill set to develop a GUI than, say, a database.
Understanding the desired product features will allow you to determine what skill set your software provider needs to successfully complete the project. You should look for developers who have successfully developed products at least as complex as yours. Similarly, if you’re looking for a quick prototype versus a production build, find a company whose experience matches that goal.
Next, it’s important for a provider to understand not only the technical requirements involved in building your product, but also your business processes. A provider should be interested in what you are trying to accomplish, what problem your product is trying to solve and how their role impacts the overall project. If a company is focused on trying to fit your project into a cookie-cutter offering, that should be a red flag. A good developer will approach each new project from ground zero and build a truly custom system that meets the project’s objectives. Think of it as building a partnership versus “buying” software.
But it doesn’t end when the code is written. Always ask potential providers about QA and testing. Testing is a critical component of the software development lifecycle. Even the best programmers introduce bugs into their code. Developers that don’t have a rigorous, defined testing process in place cannot produce quality software.
Once you’ve found your ideal software development partner and you’ve signed the contract, now what? Like any relationship, there are some fundamentals that will determine its success.
Communication is key — and it’s a two-way street. Sharing project background, goals, objectives and a clear plan will help your software provider understand the big picture and determine the approach that best fits your needs. On the other hand, your provider should give regular updates on their progress, inform you of any schedule changes and be willing to discuss their processes with you. Open communication on both ends ensures the product you expect is the product they deliver.
Honesty is the best policy, so no budget hiding! Both parties need to be upfront about cost. Your budget will determine the approach your developer chooses and the final features of a product. Cutting corners at this stage can add technical debt to a project that will take five times longer to fix in subsequent stages. Be prepared to collaborate and compromise on the final deliverables and what you are willing to spend.
Outsourcing software can help a company innovate and grow. But it’s important to take the time to find the right partner for your organization. Look for a company that has the technical aptitude you need, cares about your business objectives and is open to honest communication. Finding a qualified, reliable software partner you can trust can be the start of a valuable relationship.