The key focus of the two-day Smart Tech 2019 event, organized by the Smart Mobility Association, was the present and future of vehicle tracking technology, and most of the talks and panel discussions revolved around various aspects of the topic. As an industry symposium, the event drew a high level of industry presence, not only from the government but also from the public and private sectors. A clear message emerged: modern-day automotive cannot survive without the technology, telecom, and mobility sectors; only with them can it evolve into smart automotive. The event was presented by iTriangle Infotech, India’s largest manufacturer of vehicle telematics devices. A symposium on vehicle tracking technology was very much the need of the hour in order to bring together and synchronize all industry leaders in this ecosystem, and the Smart Mobility Association was quite successful in its initiative.
Smart Tech 2019
A look at the developments taking place across the global vehicle tracking technology spectrum reveals a very dynamic situation. In a way, what is happening today is laying a proper foundation for the future of navigation, connectivity, telecom, technology, and smart transport, all of which lead further to smart cities and a smart nation. There is a lot of thrust in the country on digitization: spending of almost Rs. 2 trillion is in the pipeline for the development of smart cities in India. A lot of focus is thus on getting the basics in place, such as road construction, traffic management, and fleet tracking. A large chunk of public transport is in the process of having vehicle tracking technology installed. Smart Tech 2019 is a one-of-its-kind event in this regard, covering the topic from a 360-degree perspective including promises, perspectives, challenges, scope, and risks.
There were more than 250 delegates from various industry segments such as automotive, telecom, IT, and mobility at the recently concluded Smart Tech 2019, with a key focus on vehicle tracking systems. Participants included the Ministry of Road Transport and Highways of the Government of India, the Automotive Research Association of India (ARAI), the International Centre for Automotive Technology (ICAT), Delhi Integrated Multi-Modal Transit System (DIMTS) Limited, and a number of big names in the industry. Rajiv Arora, General Secretary, Smart Mobility Association, says, “The economic scenario of today demands increased productivity while driving down costs, and many businesses are now looking for innovative ways to refine their processes. We are hopeful that this forum shall offer a cross-industry perspective towards this direction, and put forth best practices of vehicle tracking technology in India as well as across the globe.”
Dinesh Tyagi, Director, ICAT, said, “Of late, a dire need has been felt to have platforms where discussions can be initiated about new-age smart devices and technologies. I am sure Smart Tech 2019 will address this issue and provide a comprehensive knowledge platform for all the industry experts and professionals alike.”
Creating a voice bot has never been easier: Ozonetel has launched its very own Voice Bot Platform. Technology is continuously changing and redefining the way businesses interact with their clientele, delivering customized and tailored recommendations, and the latest trend in business communication is the use of voice bots. Assistants such as Alexa and Google Assistant are great examples of voice bots, having secured a special place in customers’ homes and retail stores.
Voice bots make life more convenient, so much so that they have become all-pervasive. Their popularity continues to grow, with more and more businesses choosing to employ voice bots to communicate with their customers.
Ozonetel, a leading provider of on-demand cloud communication and telephony solutions, has launched its very own Voice Bot Platform. The platform enables easy development of voice bots for various voice endpoints, i.e., telephony, mobile apps, websites, and digital assistants. All existing chatbots can effortlessly be ported to the Voice Bot Platform, which provides a new voice channel for all chatbots.
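Conceptually, voice-enabling an existing chatbot means wrapping its text interface with speech-to-text on the way in and text-to-speech on the way out. The sketch below illustrates that general idea only; every function name in it (`speech_to_text`, `text_to_speech`, `chatbot_reply`, `handle_voice_turn`) is a hypothetical placeholder, not part of any Ozonetel API:

```python
# Minimal sketch of voice-enabling a text chatbot.
# All function names are hypothetical placeholders, not a real API.

def chatbot_reply(text: str) -> str:
    """Stand-in for an existing text chatbot."""
    if "hours" in text.lower():
        return "We are open 9 a.m. to 6 p.m., Monday through Friday."
    return "Sorry, I did not understand that."

def speech_to_text(audio: bytes) -> str:
    """Placeholder: a real STT engine would transcribe audio here."""
    return audio.decode("utf-8")  # pretend the 'audio' is already text

def text_to_speech(text: str) -> bytes:
    """Placeholder: a real TTS engine would synthesize audio here."""
    return text.encode("utf-8")

def handle_voice_turn(audio_in: bytes) -> bytes:
    """One conversational turn: STT -> existing chatbot logic -> TTS."""
    user_text = speech_to_text(audio_in)
    reply_text = chatbot_reply(user_text)
    return text_to_speech(reply_text)
```

The point of the adapter pattern is that `chatbot_reply` is untouched: the same chatbot logic serves a new voice channel, which is the porting idea the platform describes.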
Chaitanya Chokkareddy, Chief Innovation Officer, Ozonetel, says, “We are very excited about this latest innovation from Ozonetel. We believe voice bots are ushering in a new era in customer support worldwide. The Ozonetel Voice Bot Platform will make it very easy and cost-effective for organizations to voice-enable their existing chatbots and build new ones. Ozonetel has been a pioneer in cloud communication solutions, and this new platform will make it incredibly easy for any organization, be it a start-up, a mid-sized company, or a large enterprise, to adopt and experiment with voice bots easily.”
Over 2.7 billion deskless employees are part of a global employee engagement crisis causing great pain for enterprises across the globe. Deskless workers make up as much as 80 percent of the global workforce, and disengagement among them causes billions of dollars of loss annually to organizations, largely because of communication channels with limited reach. To address this severe issue, StaffConnect, a pioneer in mobile employee engagement solutions for the deskless workforce, has just released a new eBook, ‘How Can Enterprises Overcome the Global Employee Engagement Crisis That Impacts 2.7 Billion Deskless Employees,’ which is available to download now. The eBook addresses some very basic questions, such as why individuals often become disengaged. It is enterprises with workforces deployed remotely and offsite that face these issues most, and the eBook provides important information in this regard.
The eBook ‘How Can Enterprises Overcome the Global Employee Engagement Crisis That Impacts 2.7 Billion Deskless Employees’ not only touches on the basic pain areas but also provides solutions to them. It is mobile technologies that can inspire employee engagement and turn this trend around. Deskless employees face greater limitations than their onsite peers in communication and in access to company systems, and these limitations drastically impact their level of engagement, which in turn hurts bottom-line success. According to Bureau of National Affairs reports, organizations lose more than $11 billion each year because of employee turnover, a direct consequence of employee disengagement. The demand and necessity for a more mobile workforce have increased tremendously for two main reasons, the gig economy and a steady increase in remote working, making it harder to reach deskless employees working in varied locations.
What many enterprises lack is a prioritized, systematic approach to facilitating timely and reliable two-way communication with their deskless employees to ensure high levels of engagement. Businesses should consider the right mobile technology tools and platforms to keep deskless workers in the loop and enhance communication and collaboration. As a matter of fact, most deskless employees are not even aware of the basics when it comes to internal communication; that is why a vast majority (as high as 84 percent) of deskless staff does not receive sufficient communication to perform their job effectively, as stated by Tribe Inc. The eBook explains how, using the StaffConnect platform’s mobile app, deskless employees can have access to company information 24/7, irrespective of role or location. That includes real-time updates from the company and the CEO.
Geraldine Osman, CMO, StaffConnect says, “In a world that’s increasingly digitally driven and focused—combined with a shift toward a workforce that is now primarily deskless—the key to increasing employee engagement is integrally connected to technology. To effectively drive engagement across the entire organization, businesses need to implement mobile-enabled apps that are capable of reaching every employee and delivering an engaging user experience. This prevents the silos between office and field-based employees and facilitates a more unified and positive culture that ultimately leads to better performance, retention and customer satisfaction.”
To download and read, “How Can Enterprises Overcome the Global Employee Engagement Crisis That Impacts 2.7 Billion Deskless Employees,” visit: https://www.staffconnectapp.com/download-the-deskless-workforce-ebook/
Betsi Cadwaladr University Health Board (BCUHB) is the largest health organization in Wales, delivering a full range of primary, community, mental health, and acute medical services from three main hospitals. In addition, there is a network of 38 community hospitals, health centers, clinics, offices, mental health units, and community team bases, with a workforce of over 17,000 employees serving more than 760,000 people across all six counties in North Wales. BCUHB has attained tremendous success in engaging its deskless staff through unified two-way communication with the StaffConnect Mobile Employee Engagement Platform.
Aaron Haley, Communications Officer at BCUHB says, “While email works for our desk-based staff, there’s a big contingency of our workforce who just can’t find the time to get on to a computer as part of their working day. We wanted the new internal communications tool to be completely voluntary, and we wanted to demonstrate our commitment to improving internal communications with a platform that meets the needs of all our employees, regardless of their role or location.”
Stateless Luxon is a new software-defined interconnect product from Stateless Inc. This software-based product creates seamless connectivity between data centers and between the data center and the hypercloud. Before going further into the details of the product, let’s understand a bit about the rich background of the company that delivers it. Stateless Inc. was founded by Eric Keller, CTO, and Murad Kablan, CEO, both Ph.D.s on the academic front. Through their work with AT&T and IBM, they noticed first hand how network virtualization had failed to a large extent despite being an overhyped subject, and they realized the need to develop a better approach. They also authored ‘Stateless Network Functions.’ The founders of Stateless Inc. have a very simple yet powerful vision. Their vision statement says: ‘Give the world the power to simply create customized networks on-demand.’
If developing Stateless Luxon had been simple, it would have been done a couple of decades ago. Even the tech giants in this field had been striving to achieve it, but with no success; perhaps they didn’t have the right direction. Let’s look at some interesting figures. The average number of clouds used per enterprise is 5 (Source: https://www.cio.com/article/3267571/it-governance-critical-as-cloud-adoption-soars-to-96-percent-in-2018.html). 80% of enterprises will shut down their data centers by 2025 (Source: https://blogs.gartner.com/david_cappuccio/2018/07/26/the-data-center-is-dead/). Interconnection bandwidth will grow at a 48% CAGR through 2021 (Source: http://phx.corporate-ir.net/External.File?t=1&item=VHlwZT0yfFBhcmVudElEPTUyNzIyNTl8Q2hpbGRJRD02OTU4MDA=). This clearly indicates a rapidly rising global trend: multi-site + hybrid. In fact, it is already here. Monetizing multi-site + hybrid is the biggest challenge among colocation and cloud managed service providers. They have started focusing on monetizing links into the data centers; in fact, they need to monetize all connectivity links.
Stateless Luxon Is The Future
The situation is like chasing a moving target. In today’s dynamic, multi-hybrid cloud ecosystem, dedicated L1 + L2 network appliances are not able to deliver; it is a complex situation of new and unique applications, new network connections, and new vectors to monetize. Software-defined interconnect (SD-IX) delivers, controls, and monetizes composable Layer 3+ network services to interconnect points through software. That is the Luxon software platform. Four key features of Stateless Luxon are multitenancy, automation, API-driven visibility, and a consolidated, evolvable architecture. Basically, with Luxon, you don’t need to buy new appliances; new functionality can be deployed through software. General availability is slated for Q3 2019.
Philbert Shih, Managing Director, Structure Research says, “Colocation and cloud service providers are set to experience growing demand as enterprises move away from on-premise data centers. Stateless is poised to capitalize on a growing opportunity as outsourced infrastructure revenue is expected to accelerate significantly through 2022, when it is forecasted to reach $382.63 billion up from $138 billion in 2018.”
Stateless Luxon is the first step in the mission
Bob Laliberte, Practice Director & Senior Analyst, ESG says, “The business expectations of colocation and data center providers have changed as enterprises continue to decentralize workloads, and these providers must now position themselves as a primary hub for this tenant traffic. These providers can now leverage the Luxon platform to effectively scale their business and easily monetize additional connections in the data center.”
Murad Kablan, CEO and co-founder, Stateless says, “Luxon is the first step in Stateless’ mission to provide users with the power to create simple, customized networks on demand. The platform provides colocation and cloud MSPs the real-time agility they need to adapt to ever-changing business requirements and priorities.”
This is the concluding post of the three-part interview series with Sandeep Kapoor, EMEAI & Marketing Manager India, Keysight Technologies.
6. What is Six-in-one instrument integration?
The six-in-one integration combines an oscilloscope, a function generator (20 MHz with modulation), hardware-based serial protocol trigger and decode, a frequency response analyzer, a digital voltmeter, and a frequency counter.
The 1000 X-Series offers an integrated 20 MHz function generator with signal modulation capability. It’s ideal for educational or design labs where bench space and budget are at a premium. The integrated function generator provides stimulus output of sine, square, ramp, pulse, DC, and noise waveforms to devices under test, and modulation can be added to the signal with customizable AM, FM, and FSK settings. In short, it is an oscilloscope with an integrated function generator.
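To give a feel for what an amplitude-modulated stimulus looks like numerically, the sketch below computes samples using the standard AM formula, a carrier scaled by a (1 + depth·cos) envelope; the frequencies and modulation depth are arbitrary example values, not instrument settings:

```python
import math

def am_sample(t, fc=1000.0, fm=100.0, depth=0.5):
    """One sample of an AM signal: (1 + m*cos(2*pi*fm*t)) * cos(2*pi*fc*t)."""
    envelope = 1.0 + depth * math.cos(2 * math.pi * fm * t)
    return envelope * math.cos(2 * math.pi * fc * t)

# Sample one modulation period (fm = 100 Hz -> 10 ms) at 10 kHz.
fs = 10_000.0
samples = [am_sample(n / fs) for n in range(int(fs / 100.0))]
peak = max(abs(s) for s in samples)
print(f"peak amplitude = {peak:.2f}")  # envelope peak is 1 + depth = 1.5
```

FM and FSK follow the same pattern, except the modulating tone perturbs the carrier’s frequency rather than its amplitude.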
The 1000 X-Series is also a powerful protocol analyzer, with decode and hardware-based triggering that enable specialized serial communication analysis. Other vendors’ oscilloscopes use software post-processing techniques that slow down the waveform and decode update rates, but the 1000 X-Series decodes faster by using hardware-based technology, which enhances scope usability and the probability of capturing infrequent serial communication errors.
The 1000 X-Series’ frequency response analyzer capability is the perfect tool to help students understand the gain and phase performance of passive LRC circuits or active op-amps. This capability is achieved with a gain and phase measurement versus frequency (a Bode plot). Vector network analyzers (VNAs) and low-cost frequency response analyzers are typically used for these measurements, but easy-to-use gain and phase analysis is now possible using the 1000 X-Series’ built-in WaveGen.
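The gain-and-phase sweep a Bode plot captures is easy to illustrate numerically. For a first-order RC low-pass filter the textbook transfer function is H(jω) = 1/(1 + jωRC); the sketch below evaluates its gain in dB and phase in degrees, with arbitrary example component values:

```python
import cmath
import math

def rc_lowpass_response(freq_hz, r_ohms=1_000.0, c_farads=1e-7):
    """Gain (dB) and phase (degrees) of H = 1 / (1 + j*2*pi*f*R*C)."""
    h = 1.0 / (1.0 + 1j * 2 * math.pi * freq_hz * r_ohms * c_farads)
    gain_db = 20 * math.log10(abs(h))
    phase_deg = math.degrees(cmath.phase(h))
    return gain_db, phase_deg

# At the cutoff frequency fc = 1/(2*pi*R*C), gain is -3 dB and phase is -45 deg.
fc = 1.0 / (2 * math.pi * 1_000.0 * 1e-7)
gain, phase = rc_lowpass_response(fc)
print(f"fc = {fc:.0f} Hz: gain {gain:.2f} dB, phase {phase:.1f} deg")
```

Sweeping `freq_hz` over a logarithmic range and plotting both outputs reproduces exactly the kind of Bode plot the instrument generates with its built-in WaveGen.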
The 1000 X-Series has an integrated 3-digit voltmeter (DVM) inside each oscilloscope. The voltmeter operates through probes connected to the oscilloscope channels, but its measurement is decoupled from the oscilloscope triggering system, so both the DVM and triggered oscilloscope measurements can be made with the same connection. AC RMS, DC, and DC RMS can be quickly measured without configuring the oscilloscope capture. The voltmeter results are always displayed, keeping these quick characterization measurements at your fingertips.
There is an integrated 5-digit frequency counter inside each oscilloscope. It operates through probes connected to the oscilloscope channels, but its measurement is decoupled from the oscilloscope triggering system, so both the counter and triggered oscilloscope measurements can be made with the same connection.
· More information about the Keysight InfiniiVision 1000 X-Series oscilloscopes is available at https://connectlp.keysight.com/FindYourLocation_1000Xoscilloscope.
· Electronic media kit, including images and additional supporting quotes, is available at https://about.keysight.com/en/newsroom/mediakit/infiniivision-1000x/.
· Information about the company’s complete line of oscilloscopes is available at https://www.keysight.com/find/scopes.
This is part 2 of the series. You can read part 1 here.
4. What are the roles of comprehensive lab guide and oscilloscope fundamental slide set?
The Educator’s Oscilloscope Training Kit provides an array of built-in training signals so that electrical engineering and physics students can learn what an oscilloscope does and how they can perform basic oscilloscope measurements. Also included in the kit is a comprehensive oscilloscope lab guide and tutorial written specifically for the undergraduate student. Keysight also provides a PowerPoint slide-set that professors and lab assistants can use as a pre-lab lecture on oscilloscope fundamentals. This lecture takes about 30 minutes and should be presented before electrical engineering and physics students begin their first circuits lab. Note that this PowerPoint slide-set also includes a complete set of speaker notes.
5. What do you mean by bandwidth upgradable via software license?
The oscilloscope base model starts at 70 MHz, but it can be upgraded to 100 MHz or 200 MHz since the hardware capability is already there. Customers can purchase the upgrade option at any time and enable it via a software license key. It is an investment-friendly feature.
Bandwidth is often regarded as the single most important characteristic of an oscilloscope. Measured in hertz, the bandwidth of your oscilloscope is the range of frequencies that it can accurately measure. Without enough bandwidth, the amplitude of your signal will be incorrect and details of your waveform might be lost. We help customers get fast, accurate answers to their measurement questions; that’s why we offer the largest range of compliance and debugging application-specific oscilloscope software. These applications are engineered to work with the oscilloscope to quickly and easily provide exceptional insight into the signals.
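Two widely used rules of thumb make the bandwidth requirement concrete: a scope with a Gaussian frequency response has a rise time of roughly 0.35 divided by its bandwidth, and for accurate amplitude measurements the bandwidth should be several (commonly five) times the highest signal frequency of interest. A quick sketch of the arithmetic:

```python
def scope_rise_time_s(bandwidth_hz):
    """Approximate rise time of a Gaussian-response scope: 0.35 / BW."""
    return 0.35 / bandwidth_hz

def recommended_bandwidth_hz(signal_freq_hz, factor=5.0):
    """Common rule of thumb: pick a scope with ~5x the signal frequency."""
    return factor * signal_freq_hz

# The 70 MHz base model versus the software-unlocked 100 MHz and 200 MHz tiers.
for bw_mhz in (70, 100, 200):
    rt_ns = scope_rise_time_s(bw_mhz * 1e6) * 1e9
    print(f"{bw_mhz} MHz bandwidth -> rise time ~ {rt_ns:.2f} ns")
```

By this rule, upgrading the base model from 70 MHz to 200 MHz via the license key cuts the measurable rise time from about 5 ns to about 1.75 ns with no hardware change.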
X-Series measurement applications increase the capability and functionality of Keysight Technologies, Inc. signal analyzers to speed time to insight. They provide essential measurements for specific tasks in general-purpose, cellular communications, and wireless connectivity applications, covering established standards and modulation types. Applications are supported on both benchtop and modular instruments, with the only difference being the level of performance achieved by the hardware you select. We provide a range of license types (node-locked, transportable, floating, or USB portable) and license terms (perpetual or time-based).
Concluding post link: https://itknowledgeexchange.techtarget.com/quality-assurance/1000-xseries/
In the next three posts, we are in discussion with Mr. Sandeep Kapoor, AD MSM – EMEAI & Marketing Manager India, Keysight Technologies.
1. What are the key features of the new models of the InfiniiVision 1000 X-Series oscilloscopes?
InfiniiVision 1000 X-Series oscilloscopes are entry-level instruments with professional-level capabilities. They offer bandwidths from 50 MHz to 200 MHz, 2 or 4 analog channels, and a 50,000 wfms/sec update rate that lets users see more signal detail. They can make professional measurements, including mask, math, FFT, analog bus, and protocol triggering/decode. InfiniiVision 1000 X-Series oscilloscopes are engineered to deliver quality, industry-proven technology at unbelievably low prices, giving professional-level functionality with industry-leading software analysis and 6-in-1 instrument integration.
Further, the two models, EDUX1002A and EDU1002G, provide a quality education for students and prepare them for industry with professional-level instruments. The 1000 X-Series leverages the same technology as higher-end oscilloscopes, allowing students to learn on the same hardware and software being used in leading R&D labs. The built-in training signals enable students to quickly learn to capture and analyze signals.
2. How is it able to improve overall efficiency?
By virtue of multiple unique hardware specifications (fast update rate, deep memory, bandwidth) and software analysis tools (serial trigger, decode, offline software analysis, frequency response, etc.), it becomes one of the most user-friendly tools for basic debugging as well as an application-specific test and measurement tool. Users can make automatic measurements without spending too much time, which helps improve efficiency.
3. How is it able to bring high-end technology in an affordable price range?
Keysight has its own R&D and fab wherein custom ICs and MMICs are designed and fabricated. This helps in leveraging specific technologies across multiple platforms. Also, Keysight has a special focus on academia, wherein we work closely with universities worldwide and offer different solutions at an affordable price range under special initiatives.
The discussion continues in the next two posts. Next post link: https://itknowledgeexchange.techtarget.com/quality-assurance/oscilloscope/
Japan strives for quality in every aspect of life. Quality comes first in any kind of production there; no other factor is allowed to surpass it. A lot of organizations follow Japanese quality management techniques and philosophies, but how many of them do so ethically and aesthetically? No Japanese company will compromise on the quality of its products. China, on the other hand, believes in mass production and the price-war game. It wants to be on top in a price war, and to get there, it doesn’t mind compromising on the quality of the product. While a Japanese product is built for life-long use, a Chinese product is use-and-throw: you use it as long as it works, and after that you just throw it away.
There is no concept of repair for a Chinese product. Since the price is so low, you can afford to throw it away and buy a new one. But does that really make sense? For instance, say you buy a Japanese product for $100 and it runs for years without any major failure. A similar product from China might cost, say, $25. If that Chinese product goes bad in a couple of years and is hard to repair, or the repair cost exceeds the price of a new unit, then you have no choice but to throw away the old one and buy a new one. So every couple of years you spend $25, which in the long run works out costlier than the Japanese product.
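The long-run arithmetic above is easy to check. Assuming the $100 product lasts the whole comparison period while the $25 product is rebought every two years, the cheaper option overtakes the dearer one after about eight years:

```python
def total_cost(unit_price, lifetime_years, horizon_years):
    """Total spend over a horizon if the product is rebought every lifetime_years."""
    replacements = -(-horizon_years // lifetime_years)  # ceiling division
    return unit_price * replacements

horizon = 10  # years of use being compared (example assumption)
japanese = total_cost(100, 10, horizon)  # assume it lasts the whole horizon
chinese = total_cost(25, 2, horizon)     # replaced every couple of years
print(f"Over {horizon} years: Japanese ${japanese}, Chinese ${chinese}")
```

With these example figures, the durable $100 product costs $100 over ten years while the $25 throwaway costs $125, before even counting the time lost to each failure.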
Quality versus mass production
While the Japanese product works efficiently without any hiccups, saving you time and money, the Chinese product is not only substandard in quality but also wastes your time and money whenever it goes out of order.
Dr. Joseph Juran is known for his quality trilogy, which consists of quality planning, quality improvement, and quality control. The three are sequential, and the process is cyclic, because there is no end to quality improvement. The deeper you go and the more you think, the more ideas you get for improvement. Every improvement therefore needs a proper plan and execution. Mere execution doesn’t add value unless you find ways to sustain those improvements; improvements are not one-time achievements, and their sustenance is more important. As a matter of fact, you can move to the next level only if you are able to sustain the current level. Falling back causes deterioration.
Here are some famous quotes from Dr. Joseph Juran:
“All improvement happens project by project and in no other way.”
“Goal setting has traditionally been based on past performance. This practice has tended to perpetuate the sins of the past.”
“Without a standard, there is no logical basis for making a decision or taking action.”
“It is most important that top management be quality-minded. In the absence of sincere manifestation of interest at the top, little will happen below.”
“Quality planning consists of developing the products and processes required to meet customers’ needs.”
“A good rule in organizational analysis is that no meeting of the minds is really reached until we talk of specific actions or decisions. We can talk of who is responsible for budgets, or inventory, or quality, but little is settled. It is only when we get down to the action words (measure, compute, prepare, check, endorse, recommend, approve) that we can make clear who is to do what.”
Joseph Juran was born in 1904 and died in 2008. He did a great deal of work in the field of quality and quality management, and he is known for the advanced development of Pareto analysis, a concept that originated with Vilfredo Pareto.
Dr. W. Edwards Deming did an astonishing job of encapsulating his complete ideology of management in just 14 points, followed by the 7 deadly diseases of management to cover the other side of the coin. His main focus in life and profession was the state of quality; he began his journey by picking up the concept of quality from where Dr. Shewhart had left it, and he is also known for advancing the state of quality further. In fact, he is the one who explained variation, control charts, and so on, precisely. Edwards Deming also promoted and popularized the PDCA cycle by showcasing its importance in the business world; the quality world knows him for the Deming Cycle.
Some of the great quotes by Dr. Edwards Deming are as below:
“It is not necessary to change. Survival is not mandatory.”
“If you can’t describe what you are doing as a process, you don’t know what you’re doing.”
“It is not enough to do your best; you must know what to do, and then do your best.”
“Experience teaches nothing without theory.”
“People are entitled to joy in work.”
“Learning is not compulsory… neither is survival.”
“The idea of a merit rating is alluring. The sound of the words captivates the imagination: pay for what you get; get what you pay for; motivate people to do their best, for their own good.
The effect is exactly the opposite of what the words promise. Everyone propels himself forward, or tries to, for his own good, on his own life preserver. The organization is the loser.
The merit rating rewards people that conform to the system. It does not reward attempts to improve the system. Don’t rock the boat.”
Dr. Edwards Deming
Dr. Edwards Deming was born on October 14, 1900, in Sioux City, Iowa, United States. He died on December 20, 1993, in Washington, D.C., United States.
Dr. Walter Shewhart is known as one of the pioneers of quality. He is among the few people who made the world understand what quality is and why it is so important to thrive and strive in the business world. Dr. Shewhart, in fact, is the one who developed the concept of the Plan-Do-Check-Act or PDCA cycle. Some organizations alternatively call it Plan-Do-Study-Act or PDSA, but conceptually the two are one and the same. Dr. Shewhart also developed theories of process control and the globally acknowledged Shewhart transformation process.
Here are some of the top quotes by Dr. Walter Shewhart:
“Postulate 1. All chance systems of causes are not alike in the sense that they enable us to predict the future in terms of the past.”
“Postulate 2. Constant systems of chance causes do exist in nature.”
“Postulate 3. Assignable causes of variation may be found and eliminated.”
“Both pure and applied science have gradually pushed further and further the requirements for accuracy and precision. However, applied science, particularly in the mass production of interchangeable parts, is even more exacting than pure science in certain matters of accuracy and precision.”
“Rule 1. Original data should be presented in a way that will preserve the evidence in the original data for all the predictions assumed to be useful.”
“Every sentence in order to have definite scientific meaning must be practically or at least theoretically verifiable as either true or false upon the basis of experimental measurements either practically or theoretically obtainable by carrying out a definite and previously specified operation in the future. The meaning of such a sentence is the method of its verification.”
Dr. Walter Shewhart
Dr. Walter Shewhart was born on March 18, 1891, and died on March 11, 1967, at the age of 76. He was an engineer by occupation. A lot has happened in this field since, but the foundations he established, such as PDCA and process control, remain as valuable today as they were then.
At a recent mega launch of smart home appliances by one of the top global brands in India, I found myself wondering how smart these smart devices really are. Who defines their smartness, and to what extent does it impact our daily life as end consumers? The intelligent devices launched included smart TVs, washing machines, air conditioners, headphones, and more. Take smart televisions: the key features put forward to establish their smartness were huge screen size, excellent picture quality, slimness, the huge amount of content they carry, multi-language support, and so on. All this is good, especially the content that I can choose as per my personal preference. That personalization is welcome, but is it all I would need? Can the device cross certain barriers and become smarter, in fact truly smart? For instance, can it allow bookmarks, content highlights, etc.?
Similarly, when it comes to washing machines, can a washing machine be intelligent to the extent that the water it uses in the first cycle gets purified within the machine in a separate segment, and the same water is used in the second cycle? The same process could purify this water again for the third cycle, and after the final cycle the same water, now detergent-free, could be collected in a bucket for watering my plants. Wouldn’t that be more eco-friendly? On the same note, why can’t my smart headphones charge themselves while I jog or exercise, converting that energy into charge, something like a self-charging phenomenon? Air conditioners, for instance, consume a large amount of electricity. Can there be a mechanism whereby they run with the least consumption of electricity and throw out the least heat?
Smart Devices need to be more intelligent to address global concerns
All the above thoughts, in my view, are what intelligence should mean. In the name of intelligent devices, there is huge scope. But every smart device feature must exist for a cause, addressing a global concern, not merely for the fancy and fantasy of the consumer.
This is a real-life example from the world’s largest e-retail enterprise operating in India. My purpose in quoting this example is to showcase how a small gap in technology can become a big differentiator. Since it is an online store, I just log in to my account on my laptop or smartphone through its desktop or mobile app and do the needful. Everything is smooth: finding what I want, ordering, and getting confirmation of delivery on such and such date. Fantastic. That, in fact, is the reason I stick to this online hyper-retail store. Technology is fully leveraged and things have been flawless so far. The processes are well defined and automated, which is why it is able to handle millions of customers concurrently at any moment, with a similar number of orders placed each day.
Every delivery is recorded immediately by the delivery person. I, as a customer, immediately get an alert that the item has been delivered, asking for my feedback on the experience. Well, the first shortfall of this feedback mechanism is that it only presents preset options: pick a rating and submit. There is no place to enter remarks. The incident in my case was a false delivery report. The delivery person called me one evening to inform me that he was to deliver a package. Nobody was at home, so I requested him to come back the next day. He asked if he could hand it over to somebody a floor down or up, or to the security person, which I declined. The delivery person agreed to come the next day, at least to me over the phone.
Customer Experience and Technology
Surprisingly, within the next two hours, I got an alert that the consignment had been delivered. Shockingly, the status showed it had been received by me in person. Couldn’t technology do a lot more in a case like this, turning the incident into a happy-ending story?
Digital transformation is a buzzword for many CIOs, CTOs, and CXOs in most enterprises. In the name of technology, they might have a lot in place. But, in my opinion, what matters most is the last-mile technology that defines your maturity and success. Having SAP, CRM, and other enterprise apps in place is fine. But what about where technology is needed most: the endpoint, the place where your actual consumer or customer shakes hands with the last face of your business? Let me elaborate a little with a few real-life examples. There is a top government-run milk and milk-products company in India with an excellent technology framework and ecosystem, as its CIO claims. It is a decades-old organization. Besides selling milk and dairy products, it also runs outlets selling vegetables, fruits, pulses, etc. The number of these outlets is huge.
What this organization lacks is a mechanism to automate its billing and inventory. With a little automation, it could easily identify the balance of items at any moment. It has recently done this, though rather late, and the system is still full of glitches. More important is knowing the pulse of the local consumers who come to these stores. That includes me, for the store near my residence. When I ask for a specific vegetable at this store, I am told that either it wasn’t replenished today or it has run out. Both are quite vague reasons in this world of technology. The worst part I have seen is that this local store belonging to the organization does a lot of manipulation. Let’s look at real-life instances.
Where is the last mile technology?
For example, the store owner orders 20 kilograms of fresh, good-quality pumpkins from the warehouse. He receives that quantity and quality, say, in the early morning hours. Within 10 minutes, the entire fresh lot is shifted to the local vegetable vendor and replaced with the same quantity of lower-grade pumpkins. These will now sell at the high-quality rate while, in reality, the buyer gets lower-quality produce. There is no check-and-control mechanism here. Where is the last-mile technology? And where are the CXOs of this organization, living far away from the real fruits that technology could reap for them?
The TensorFlow plug-in recently launched by Quobyte for its distributed file system enhances throughput performance by 30 percent for Machine Learning (ML) training workflows. This is probably the first plug-in of its kind to deliver a proven enhancement in machine learning capabilities. Quobyte is among the top developers of modern storage-system software. The Quobyte Data Center File System is the world’s first distributed file system to offer a TensorFlow plug-in, empowering customers with increased throughput performance and linear scalability for ML-enabled enterprise applications. This, in turn, enables faster and more uniform training across larger data sets, which automatically ensures higher-accuracy outcomes and enhances a business’s decision-making capabilities. Any business decision loses its sanctity if not taken in time; a delayed business decision obviously damages business growth. TensorFlow itself is an open-source library for numerical computation.
TensorFlow fits all businesses because it is an open-source library. Further, with its numerical computation capabilities handling large data sets, it promises to advance large-scale machine learning, which is already taking root across industry segments and institutions such as autonomous vehicles, financial services, robotics, government, defense, and aerospace. The scope is endless; it is only a question of how much an industry can leverage its power to enhance decision-making capabilities. With the help of Quobyte storage and TensorFlow, any industry vertical can simplify and streamline its machine learning operations. The plug-in enables TensorFlow applications to communicate directly with Quobyte, bypassing the operating system kernel. This substantially reduces kernel-mode context switching, which in turn ensures lower CPU usage. The best part is that Quobyte storage works with all stages of ML, increasing GPU utilization through the TensorFlow plug-in.
TensorFlow Is An Open Source Library
Apparently, increased GPU utilization from the TensorFlow plug-in significantly improves model training in ML workflows. Users gain the flexibility to train anywhere, and models can move seamlessly into production. Since the plug-in has nothing to do with the kernel, it works with any version of Linux, providing a high degree of deployment flexibility for use in ML. As a matter of fact, the Quobyte TensorFlow plug-in requires no application modifications at all.
Frederic Van Haren, Lead Analyst HPC and AI Systems of analyst firm Evaluator Group says, “As more and more businesses look to leverage ML to increase innovation, achieve a faster time to market and provide a more positive customer experience, there is an increasing need for storage infrastructures that offer higher performance and increased flexibility that these workloads need. Vendors, like Quobyte, that offer high performance, broad platform support and flexibility of deployment options are well positioned to help companies handle bigger data sets, achieve more accurate results and run ML workloads in any environment.”
Bjorn Kolbeck, Quobyte CEO says, “By providing the first distributed file system with a TensorFlow plug-in, we are ensuring as much as a 30 percent faster throughput performance improvement for ML training workflows, helping companies better meet their business objectives through improved operational efficiency. With the higher accuracy of results, scalability to handle bigger data sets and flexibility to run on-prem to the cloud, and edge, we believe we are providing an optimal experience that allows customers to fully leverage the value of their Machine Learning infrastructure investments.”
Cynet is among the global leaders in automated threat discovery and mitigation. If you are an organization with 500 or more endpoints, it becomes important for you to understand this threat assessment program. One of the program’s agendas is to offer a free cybersecurity threat assessment. Since there is no upper limit on endpoints for this free assessment, even the largest organizations can take advantage of it. Of course, the risk for any organization depends on two main factors: the size of the business and the type of business. The larger the business, the larger the risk; the more critical the business, the more vital it becomes to ensure your online systems’ availability. If I compare Amazon and Walmart, Amazon, having no physical stores, can’t afford even a few minutes of system downtime. A few minutes could cost billions.
When it comes to cybersecurity, threat assessment is equally important for all businesses. Every business, whether operating online or offline, has external exposure. The business itself might not be online for many organizations, but their various locations accessing enterprise solutions will have online exposure. That is why checking its true security posture is important for every business. It is equally important to assess its risk-exposure profile relative to industry peers. Obviously, if you are at higher risk than your industry peers, your business runs a higher risk of being diverted to them.
Cynet offers free threat assessment
As every customer wants peace of mind, a business carrying the least risk will be the first choice for any customer. The Cynet Threat Assessment program lets organizations with more than 500 endpoints undergo the assessment for free, evaluating and identifying critically exposed attack surfaces. That’s the first point of realization.
Only once that is done are you in a position to gain actionable knowledge of attacks that are currently live and active in the ecosystem. Compared to the Cynet Threat Assessment program, any other assessment would be costly and time-consuming; the program needs hardly 72 hours to run a complete assessment with zero out-of-pocket expense. A recent report, the ‘Ninth Annual Cost of Cybercrime Study’, published in March 2019 by Accenture and the Ponemon Institute, says, “The cost of malware and malicious insider cyberattacks grew 12% in 2018 compared to the previous year. The former now cost U.S. companies an average of $2.6 million annually and the latter $1.6 million. The combined totals equate to one-third of the $13 million average cybersecurity costs to companies, which is $1.3 million more than in 2017.”
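The figures quoted from the study can be sanity-checked with simple arithmetic. The numbers below are taken directly from the quote above; nothing else is assumed:

```python
# Figures quoted from the Ninth Annual Cost of Cybercrime Study (2019).
malware_cost = 2.6       # average annual cost of malware attacks, in $M
insider_cost = 1.6       # average annual cost of malicious-insider attacks, in $M
total_cost = 13.0        # average total cybersecurity cost per company, in $M
rise_since_2017 = 1.3    # how much more than in 2017, in $M

combined = malware_cost + insider_cost
print(f"combined cost: ${combined:.1f}M")                    # $4.2M
print(f"share of total: {combined / total_cost:.0%}")        # 32%, i.e. about one-third
print(f"2017 total: ${total_cost - rise_since_2017:.1f}M")   # $11.7M
```

The "one-third" claim in the report checks out: $4.2M out of $13M is roughly 32 percent.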
As a matter of fact, Cynet has changed the rules of the game altogether by offering a free threat assessment. This free assessment helps an organization benchmark its security posture against peers in its business stream. With a correct assessment, an organization can implement remedies quickly and accurately, mitigating risk and enhancing business output. The key advantages of the Cynet Threat Assessment program are indications of live attacks, user-identity attack surface, hosts-and-apps attack surface, peer benchmarking, and ranking.
Cynet changes the whole paradigm of enterprise threat assessment
Eyal Gruner, CEO and co-founder of Cynet, says, “It is becoming increasingly common to find that organizations are already playing host to malicious activity at varying degrees of attack when we come to deploy our platform. Typically, organizations underestimate the attacker’s ability to silently operate, helping these criminal operations to be successful. With the Cynet Threat Assessment service, we are taking a proactive approach to discover the active risks and remove them from the environment.”
“A word ‘unprecedented’ seems too weak to convey just how much the dimensionless operational space of digital (r)evolution requires an instantaneous reaction.”
“A central topic of my essays is cybersecurity.
A fundamental and delicate question at the heart of my work is: how to motivate my readers to want to learn more.”
“People and organizations need to trust that their digital technologies are safe and secure; otherwise they won’t embrace the digital transformation.
Digitalization and cybersecurity must evolve hand in hand.”
“I advocate a Systems Thinking approach in educating our readers, followers, friends, business associates on digital transformation, emerging technologies and cybersecurity.
Systems thinking forever changed the way I think about the world and approach issues.
An open immeasurable non-linear system – the Cyber Space, where cyber threats and cybersecurity are two of many (to be defined) elements of this system.”
“The discipline of systems thinking is more than just a collection of tools and methods.
Systems thinking is a philosophy and a methodology for understanding behavior of complex dynamic systems.”
“Cybersecurity is becoming the most important security topic of the future – particularly in the age of digitalization.”
“Why is cybersecurity so hard?
o It’s not just a technical problem
o The rules of cyberspace are different from the physical world’s
o Cybersecurity law, policy, and practice are not yet fully developed
o There’s not enough manpower in the world to make sure networks are 100% secure 100% of the time, especially with the prevalence of a cloud-based infrastructure.”
“With hope to create and scale globally an inclusive ‘authors-publisher-readers’ circle of wisdom and expertise; with channelled determination to gain understanding by carefully selecting the best information sources (Dis Moi où Cherche! Mais où?) and reading between the lines, multiplied by expressed interest for knowledge sharing by the industry experts, and as part of my ‘Top Cyber News’ extended roundtable series; I brought in one-of-a-kind ‘Men on the Arena’: Charles (Chuck) Brooks, Stewart Skomra, Mike Quindazzi, and Scott Schober to create a series of articles ‘The Globality Quotient: Cybersecurity’ published by Dennis J. Pitocco, BizCatalyst360 – an award-winning digital magazine.
The Globality Quotient: Cybersecurity
Cybersecurity – Prevention And Protection.
Cybersecurity – What is Ethical Hacking or a Hacker is a Hacker.
Cybersecurity “Hacked Again” & Women in Digital Universe
Women in Cybersecurity: Why Closing the Gender Gap is Critical via TechNative”
“Persuasion with good intention helps problem-solving; manipulation often serves the ego.”
“Coherence improves business flow; resilience makes business robust and anti-fragile.”
“Taking the multidimensional hybrid models for going digital is all about how to strike the right balance of reaping quick wins and focusing on the long-term strategic goals.”
“It is in the best interest of the talent management to improve employee performance by striking the right balance of ends and means.”
“The challenge of digitalization is to have a harmonized vision and build a customized structure to enforce open communication and collaboration.”
“A hybrid organizational structure can bring greater awareness of intricacies and systemic value of organizational systems, processes, people dynamics, technology, and resource allocation, etc.”
“Digital businesses and their people learn through their interactions with the environment, to keep knowledge flow as well as business flow, and strike a delicate digital balance.”
“The digital paradigm that is emerging is the dynamic organization with hybridity of knowledge, flexible processes, and unique competencies.”
“The paradox is the result of two opposing truths existing side by side, which can be both right”
“The purpose of digitalization is to make a significant difference in the overall levels of business performance and organizational maturity.”
“Digital kaleidoscope shows the ever-evolving dynamic view, hybrid digital patterns, and mixed cultures.”
“The hybrid nature of innovation is a combination of something old with something new, with a mixed portfolio of incremental innovations and radical innovations.”
“The hybrid decision-making style is practical because we live in such a hybrid, networked, and extended modern digital working environment.”
“The digital organization has a hybrid nature with flexibility, agility, and innovativeness.”
“Either at individual or business level, we should follow the “simplicity” principle to handle the over-complex digital reality with “VUCA” characteristics.”
Digital Hybridity by Pearl Zhu
“Inclusivity” is like a gift box, really not about the color outside, but the context inside.”
Digital transformation has a different meaning and perspective for different individuals and organizations. Let us see it from the perspective of leaders in this arena through some classic digital transformation quotes.
Digital Transformation Quotes
“Education has always been a profit-enabler for individuals and the corporation. Education, both conception and delivery, must evolve quickly and radically to keep pace with digital transition. Education is a part of the digital equation.”
― Stephane Nappo
“People and organizations need to trust that their digital technologies are safe and secure; otherwise they won’t embrace the digital transformation.
Digitalization and cybersecurity must evolve hand in hand.”
― Ludmila Morozova-Buss
“Taking the multidimensional hybrid models for going digital is all about how to strike the right balance of reaping quick wins and focusing on the long-term strategic goals.”
― Pearl Zhu, Digital Hybridity
“Coherence improves business flow; resilience makes business robust and anti-fragile.”
― Pearl Zhu, Digital Hybridity
“A weak digital security can jeopardize a robust physical safety.”
― Stephane Nappo
“CISO 2.0 must lead digital transformation efforts. Act no more like a policeman. Be the dietician of the risk appetite and a business differentiator.”
― Stephane Nappo
“As truly successful business decision making relies on a balance between deliberate and instinctive thinking, so does successful digital transformation rely on interconnectedness and interdependence of the state of the art technologies.”
― Stephane Nappo
“A word ‘unprecedented’ seems too weak to convey just how much the dimensionless operational space of digital (r)evolution requires an instantaneous reaction.”
― Ludmila Morozova-Buss
“A central topic of my essays is cybersecurity.
A fundamental and delicate question at the heart of my work is: how to motivate my readers to want to learn more.”
― Ludmila Morozova-Buss
“5 Ways To Build Your Brand on Social Media:
1 Post content that adds value
2 Spread positivity
3 Create a steady stream of info
4 Make an impact
5 Be yourself”
― Germany Kent
Some fabulous Digital Transformation Quotes
“If we all work together there is no telling how we can change the world through the impact of promoting positivity online.”
― Germany Kent
Enterprise Digital Transformation has become a buzzword these days. Attend any technology event and you will find it mentioned there, in one form or another. Many masters are emerging in this field, and organizations are quick to claim their adoption of this journey. It matters a lot whether they are treating it as a fancy journey or have meaningful, business-oriented goals in mind to achieve. Many master concepts are emerging in its name.
A lot of vendors are emerging with very beautiful and lucrative concepts for your organization to adopt in order to become a digitalized organization. A lot of business deals are happening in their name. Whether these deals will ever see the light of day remains to be seen, because after all, it’s your money that is going into the vendor’s pocket. The capability lies not in the solution but in the results.
Enterprise Digital Transformation is a journey
Digital Transformation is also omnipresent these days in any technology publication, be it online or print. But I feel it is more about making noise than achieving something significant in enterprise digital transformation. Let us understand the top 10 killers in an enterprise journey that mislead organizations from the real path of digital transformation. The first and foremost: DIGITAL TRANSFORMATION CAN’T HAPPEN ALONE OR IN ISOLATION.
Most organizations fail in this journey either because their level of understanding is quite low, so they miss the real idea behind it, or because the driver of the journey acts erratically. This is not a journey of haphazard ideas and goals; everything is interlinked. It has to be sequential, with a single major milestone to achieve at a time. There can, though, be micro-segments or parallel sub-milestones within it.
Enterprise Digital Transformation Needs Super Engagement
The top two things that come to my mind for achieving significant success in enterprise digital transformation are data and people.
Digital transformation carries a different meaning for different organizations and individuals. The course of digital transformation also depends on an organization’s current level in this foray. For instance, an organization with zero digital initiatives will have a different course of action from an organization already ten steps ahead of it. At the same time, a number of startups are taking very innovative initiatives and transforming the world in a genuinely digital way. One such example is Escher, an organization with a clear focus on the modernization of global Posts, for which the best route is a superb customer engagement platform. This customer-centric approach definitely helps posts achieve business goals on better terms with their customers, without any compromise in the quality of service or product. What Escher is doing is probably one of a kind.
The core strength of Escher lies in its innovative customer engagement mechanism, which has become a benchmark for others. Its vision is to enable posts to leverage digital transformation technologies and drive the whole ecosystem with greater speed and better economics. As a matter of fact, this is the first customer engagement application for posts; currently, 35 postal and courier customers are using the platform globally. The application is helping them narrow the quality-of-service gap with their counterparts in the private segment. Private shipping companies, in fact, are equipped with better systems because they can afford to pay higher costs. Meanwhile, customers across the globe expect a flawless experience despite all the digital disruption happening around them. Irrespective of location, almost all customers have similar expectations: real-time information and interactivity.
Escher Creates A State-of-the-Art Platform
Nick Manolis, chief executive officer of Escher says, “Today’s postal customers have options for doing business, and we are focused on helping postal operators and couriers meet customers’ high expectations by evolving into digitally-driven, multi-channeled organizations. The Escher platform goes beyond a counter solution in posts’ retail stores. We help posts meet customer demands on their terms.” He further adds, “At Escher, we understand postal and courier operations and the pressures and constraints they face, and we have invested over $80MN in R+D for customer-focused technology. We are dedicated to helping modernize postal and courier operations using the expertise we have gained over two decades in helping to transform over 35 postal and courier operations globally and processing more than two billion transactions annually”.
You can find more details here.
MapR has announced the MapR AI Edge Program. The program gives all NVIDIA Inception startup members a free MapR enterprise license. This is phenomenal, and it means a lot for the startup community: all NVIDIA AI startups can boost their AI development lifecycle with the MapR Data Platform, with no additional investment. MapR, as we all know, is the visionary creator of a next-generation AI and analytics data platform. The MapR AI Edge Program, in fact, is an AI accelerator program that is completely free for this exclusive segment. It enables deployment agility along with data management for any kind of data between and across edge, on-premise, and cloud, and it covers all Machine Learning (ML) and Artificial Intelligence (AI) products. It will be of great help to startups working in these fields, giving them a number of opportunities and enhancements at no cost.
Jack Norris, Senior Vice President, Data and Applications, MapR, says, “Customers will be able to provide more impactful demos of their AI product by running GPUs anywhere and being able to take advantage of all of the features and capabilities built into MapR. MapR AI Edge Program enables faster deployments and the ability to spotlight NVIDIA in mixed-use environments, eliminating barriers and expediting value creation of AI apps from development to testing and demonstration.” In fact, the MapR Enterprise License is a bonus for startups working in this arena. It will help them accelerate their business and open new dimensions for faster deployments. The MapR Enterprise License comes with no storage limitations, which gives startups a free hand to develop, test, and demonstrate products. It’s a boon for the startup fraternity, and a must-grab benefit.
MapR Enterprise License is a boon for startups
The free MapR Enterprise License will bring startups a number of benefits to push their business in the right direction. To know more, visit here.
The whole world knows the value of quality and how much it matters in terms of losses and profits, so why did the top retailers across the world fail to adhere to its standards and procedures, as per the 2019 Retail Quality Report? One thing I always fail to understand is the real cause of failure in quality measurement. Is it due to time limitations? Is there too much to achieve? It might be a lack of knowledge, or an absence of confidence; rather, it might be a result of overconfidence. It could also be the absence of the right set of people to achieve it. There could be other reasons that my readers can pinpoint in the comments section of this article. Let us try to examine these reasons one by one. Whatever the case, these reasons can lead to big losses, blunders, and failures.
The first reason that comes to my mind is time limitations. If there is too much work to do in too short a period, then it is foolish to start; foolish, in fact, at the management’s end to undertake such projects. As a matter of fact, the whole project should be broken into proportions of 80:20. Segregate the goals, sort out the top 20% most important ones, and then spend the whole time achieving those with full dedication in analysis, development, and testing. This top 20% of goals will get you more than 50% of the results you were planning to achieve with the whole set of objectives. Finding the right reason is of utmost importance when it comes to quality. Applause, a global leader in digital quality and testing through crowdsourcing, released its 2019 Retail Quality Report recently.
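The 80:20 split described above reduces to a simple prioritization step. A minimal sketch follows; the goal names and impact scores are entirely hypothetical, purely for illustration:

```python
# Hypothetical project goals with estimated impact scores (0-100).
goals = {
    "checkout reliability": 95,
    "search relevance": 88,
    "inventory sync": 82,
    "UI polish": 40,
    "legacy cleanup": 30,
    "new color themes": 25,
    "animated banners": 15,
    "footer redesign": 10,
    "beta badge": 8,
    "easter eggs": 5,
}

# Keep the top 20% of goals by impact and focus all analysis,
# development, and testing effort on them.
top_n = max(1, round(len(goals) * 0.20))
top_goals = sorted(goals, key=goals.get, reverse=True)[:top_n]
print(top_goals)  # the 20% that deserve the team's full dedication
```

With ten goals, only the two highest-impact ones survive the cut, which is exactly the discipline the 80:20 rule demands.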
2019 Retail Quality Report Is a Big Learning For Global Retail Industry
It is important to get to the root cause of these failures. It is equally important to ensure the same mistakes are not repeated this year; otherwise, the whole purpose of the 2019 Retail Quality Report is defeated.
The Active Archive Platform from Aparavi got a major thrust with some phenomenal enhancements a few days back. The platform is already quite popular among multi-cloud data management customers, and Aparavi is a market leader in this technology segment. The new features give a triple benefit to customers, boosting operational efficiency, multi-cloud data management, and insight. Any improvement, in fact, brings a chain of benefits: better resource management leads to enhanced operational efficiency, which in turn results in better insight, and all of this together enables greater management of archived data. New features that deserve special mention include direct-to-cloud data transfer, next-generation data classification, full-content search, and tagging. For customers, adding any number of new cloud destinations becomes simpler and more manageable, including bulk data migration across multiple clouds.
Aparavi launched its Active Archive platform in May 2018. Within a short span, it has become quite popular among organizations grappling with large volumes of unstructured data. Retaining such data is an obligation for organizations in the wake of compliance, business reporting, analytics, and historical reference. This intelligent multi-cloud data management thus empowers enterprises to actively manage their data for long-term policy-based retention, re-use, and open access, all while providing a hassle-free route to multi-cloud adoption. The new features fall into four major segments: first, direct-to-cloud for improved resource management and operational efficiency; second, next-generation data classification and tagging for better management and simplified access; third, next-generation search for greater insight and access; and fourth, enhanced multi-cloud management and bulk-data migration.
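To make the classification, tagging, and full-content-search features above concrete, here is a toy sketch of what policy-based tagging and search over archived records could look like. Everything here is illustrative: the record contents, the keyword rule, and the function names are invented for this sketch and do not reflect Aparavi's actual APIs or classification logic:

```python
# Toy archive: each record has content and a set of policy tags.
archive = [
    {"id": 1, "content": "Q3 revenue report for compliance audit", "tags": set()},
    {"id": 2, "content": "holiday party photos", "tags": set()},
    {"id": 3, "content": "customer analytics export 2017", "tags": set()},
]

# "Classification" reduced to a keyword rule here: tag records whose
# content suggests a long-term retention obligation.
for rec in archive:
    if any(w in rec["content"] for w in ("compliance", "analytics", "report")):
        rec["tags"].add("retain-long-term")

def full_content_search(term):
    """Return ids of records whose content contains the term."""
    return [r["id"] for r in archive if term in r["content"]]

print(full_content_search("compliance"))                               # [1]
print([r["id"] for r in archive if "retain-long-term" in r["tags"]])   # [1, 3]
```

The point of the sketch is only the shape of the workflow: classify once against policy, then retrieve by tag or by content, which is the combination the new Aparavi features package at multi-cloud scale.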
Aparavi Named Gartner Cool Vendor 2018
Brian Ricci, President, Pinnacle Computer Services says, “Aparavi has helped us greatly with our client’s initiatives to move data to the cloud for long-term retention. With these new features, we can more easily organize and manage the vast amounts of data stored, and find specific data as needed by an organization. It’s a real game-changer.”
Marc Staimer, President of Dragon Slayer Consulting says, “Long-term retention is changing. It is no longer enough to simply store data for lengthy periods of time. Compliance, data migration, search, multi-cloud, and cost control of that data are all difficult problems IT organizations are wrestling with. Aparavi’s latest version tackles these problems head-on in a cost-effective manner.”
Adrian Knapp, CEO, Aparavi says, “We hear from customers all the time about their need to retain data for lengthy periods – often forever – but that they face real challenges in managing it effectively and efficiently. Our Active Archive platform with these enhancements provide the solution. That’s why we tell them, ‘Keep your data! Just do it better.’”
Formulus Black, a venture-backed startup, has created a new landmark in computing by changing the whole paradigm. It is something happening for the first time in the world: while many giants have been struggling with this idea and looking for ways to give it real shape, Formulus Black breaks the barrier and takes a true leap by harnessing the power of in-memory compute for all applications. That is going to be a great boon for large and medium enterprises. The product, ForsaOS, breaks the barriers between memory and storage, giving computing a revolutionary direction. This is phenomenal, and at the same time it is crucial for the technology teams working in enterprises to understand the whole ball game, because only then can an enterprise reap the benefits of this power of in-memory compute.
In-memory compute not only enhances compute operations but also saves enormous costs. This, in fact, was a major issue that all hardware and software giants had been trying to tackle in computing. With the launch of ForsaOS, Formulus Black not only addresses these major issues but also creates a new compute benchmark. ForsaOS is a complete Linux-based software stack designed precisely to run all applications in memory. This revolutionary technology brings a huge gain in compute efficiency, which results in significant benefits in cost-effectiveness, processing speed, memory capacity, and data security with no changes to the application. Everything keeps working as before, with a heap of benefits in cost, speed, memory, and security. No compression or encryption happens in this process; the software stack keeps data in persistent memory.
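The performance gap that ForsaOS targets, CPU-to-memory access versus I/O to external storage, can be illustrated with a toy comparison. This is only a conceptual sketch and does not model ForsaOS itself; the data and lookup functions are invented for illustration:

```python
import os
import tempfile
import time

# The same lookup served from storage vs. from memory.
records = {f"key{i}": f"value{i}" for i in range(1000)}

# Storage-backed path: each lookup re-reads a file (worst case).
with tempfile.NamedTemporaryFile("w", delete=False, suffix=".txt") as f:
    for k, v in records.items():
        f.write(f"{k}={v}\n")
    path = f.name

def lookup_from_disk(key):
    with open(path) as fh:
        for line in fh:
            k, _, v = line.rstrip("\n").partition("=")
            if k == key:
                return v

def lookup_from_memory(key):
    return records.get(key)

start = time.perf_counter()
disk_val = lookup_from_disk("key999")
disk_t = time.perf_counter() - start

start = time.perf_counter()
mem_val = lookup_from_memory("key999")
mem_t = time.perf_counter() - start

assert disk_val == mem_val == "value999"   # same answer either way
print(f"disk: {disk_t:.6f}s, memory: {mem_t:.6f}s")  # memory is typically far faster
os.unlink(path)
```

The absolute timings will vary by machine; the takeaway is only the direction of the gap, which is what running applications entirely in memory exploits.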
Formulus Black Launches ForsaOS
This is the first time in the world that a technology has been launched which keeps data completely safe against power loss. That is a great achievement. Wayne Rickard, Chief Strategy and Marketing Officer, Formulus Black, says, “The challenge within the computing industry continues to be how to achieve the fastest speeds with the lowest latencies needed to satisfy increasing demands of the compute side of the equation while overcoming the expense and limited capacity issues from the memory side required to achieve it. We have designed ForsaOS to address these issues by amplifying the memory. Because CPU to memory is extremely fast while I/O to external storage peripherals is slow, we have developed a software solution that utilizes fast DRAM memory as storage while providing all the necessary management tools and features needed to increase effective memory capacity by up to 24x while improving processing speed as much as 450x.”
This is an interview on SIOS DataKeeper with Jerry Melnick, president and CEO of SIOS Technology. Jerry Melnick is responsible for directing the overall corporate strategy for SIOS Technology Corp. and leading the company’s ongoing growth and expansion. He has more than 25 years of experience in the enterprise and high availability software markets. Before joining SIOS, he was CTO at Marathon Technologies, where he led business and product strategy for the company’s fault-tolerant solutions. His experience also includes executive positions at PPGx, Inc. and Belmont Research, where he was responsible for building a leading-edge software product and consulting business focused on supplying data warehouse and analytical tools.
SIOS Technology provides IT resilience for critical applications like SQL Server, Oracle, and SAP in the cloud, hybrid cloud, or datacenter. Using SIOS high availability clustering software, applications automatically recover from infrastructure and application failures in a matter of minutes with no loss of data, keeping data protected and applications online. SIOS was founded in 1999 and is a subsidiary of SIOS Corporation, a publicly traded company based in Japan (TYO:3744). The company is headquartered in San Mateo. SIOS software runs business-critical applications in flexible, scalable cloud environments, such as Amazon Web Services (AWS), Azure, and Google Cloud Platform, without sacrificing performance, high availability, or disaster protection.
1. What is SIOS DataKeeper all about?
SIOS DataKeeper software is an important ingredient in a cluster solution. It lets users add high availability and disaster recovery protection to their Windows cluster, or create a SANless cluster for complete failover protection in environments where shared-storage clusters are impossible or impractical, such as cloud, virtual server, and high-performance storage environments.
Clusters built with SIOS software protect applications including Microsoft SQL Server, SAP, SharePoint, Lync, Dynamics, Hyper-V, and more from downtime and data loss, using a SAN or SANless cluster in physical, virtual, and cloud environments. They provide enterprise-class protection for all server workloads at a fraction of the cost of array-based replication.
2. How does SIOS DataKeeper ensure high availability for critical applications of an organization?
SIOS DataKeeper eliminates the need for complex and costly hardware SANs when configuring systems for high availability or disaster recovery. It uses fast, efficient, block-level replication to transfer data across both local and wide area networks with minimal bandwidth, delivering incredibly fast replication speeds without the need for additional hardware accelerators or compression devices. This allows companies to flexibly configure the optimal recovery environment to meet their business objectives without the constraints of hardware SANs.
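To make the idea of block-level replication concrete, here is a minimal, hypothetical Python sketch; this is not SIOS code, and the block size and function names are my own assumptions. The core idea is to hash fixed-size blocks on both sides and ship only the blocks that differ:

```python
import hashlib

BLOCK_SIZE = 4096  # replicate in fixed-size blocks rather than whole files


def changed_blocks(source: bytes, target: bytes, block_size: int = BLOCK_SIZE):
    """Yield (offset, block) for every source block whose hash differs from the target's."""
    for offset in range(0, len(source), block_size):
        src_block = source[offset:offset + block_size]
        tgt_block = target[offset:offset + block_size]
        if hashlib.sha256(src_block).digest() != hashlib.sha256(tgt_block).digest():
            yield offset, src_block


def replicate(source: bytes, target: bytearray) -> int:
    """Apply only the changed blocks to the target; return bytes sent over the 'wire'."""
    sent = 0
    for offset, block in changed_blocks(source, bytes(target)):
        target[offset:offset + len(block)] = block
        sent += len(block)
    return sent


# Demo: two 3-block volumes that differ in exactly one block.
target = bytearray(b"\x00" * (3 * BLOCK_SIZE))
source = bytearray(target)
source[5000:5004] = b"beef"          # dirty a few bytes inside block 1
sent = replicate(bytes(source), target)
print(f"replicated {sent} bytes instead of {len(source)}")
```

Because only dirty blocks cross the network, a small change to a large volume costs one block of bandwidth, which is why block-level replication works well over wide area links.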
3. What kinds of organizations are a good fit for SIOS DataKeeper?
SIOS software is used by companies large and small, and for applications ranging from just a few gigabytes of data to many terabytes. The software works transparently with any Windows application, without modification, and is employed across a wide variety of industries. DataKeeper is used wherever a company’s day-to-day operations rely on the availability of an application. With over 70,000 licenses, many of the world’s largest companies use SIOS software to protect the applications their businesses depend on.
SIOS software runs business-critical applications like SAP and databases such as SQL Server, Oracle, and many others in a flexible, scalable cloud environment, such as Amazon Web Services (AWS), Azure, and Google Cloud Platform without sacrificing performance, high availability or disaster protection.
4. How does SIOS DataKeeper edge over similar products in the market in terms of cost-effectiveness and application-agnostic design?
The extreme flexibility to create high availability clusters using existing hardware with or without a hardware SAN in both high availability and disaster recovery configurations is the key differentiator. You don’t need to purchase new hardware or learn new technology. DataKeeper works out of the box with Microsoft Windows Server Failover Clusters (WSFC) as a simple add-on and is installed and configured in minutes. Since it’s based on Windows and works at the operating system level, any application that runs on Windows can be protected by WSFC with SIOS DataKeeper. And you don’t need to change the application – it’s all transparent.
5. What advantage did ALYN Hospital get after deploying SIOS DataKeeper?
ALYN Hospital IT was seeking to use existing hardware and its Hyper-V environment, which was configured and operating in separate server rooms on-premises. They needed high availability with no loss of data and minimal downtime, plus disaster protection, along with ways to maintain uptime during upgrades. SIOS DataKeeper gave them all of these capabilities without costly new expenditures or reconfiguration.
The ability to create 3-node SANless failover clusters with a single active and two standby instances has proven to be especially valuable for ALYN’s needs. They are updating systems and software continuously, and with DataKeeper they can do that without any disruption to operations. Because the data replication supports multiple standbys and enables manual, dynamic assignment of the active and standby instances, the active instance can be moved to any server in a 3-node cluster and remain fully protected during periods of planned hardware and software maintenance.
Other SIOS DataKeeper features that are important to ALYN’s needs include the ability to work with any type of storage and WAN-optimized data replication. The SIOS cluster seamlessly supports any storage volume recognized by Windows, and this substantially simplifies their operations while enabling them to utilize all of their storage resources. Additionally, the WAN optimization will prove useful as ALYN Hospital implements its remote disaster recovery site.
ALYN Hospital is confident the SIOS SANless failover cluster will perform as desired when needed: They test the configuration regularly and routinely change the active and standby designations while redirecting the data replication as needed during planned software updates, and the applications have always continued to run uninterrupted.
6. What were their evaluation criteria?
To evaluate third-party failover clustering software, ALYN Hospital established three criteria: the solution had to work with existing hardware; it had to provide both high availability (HA) and disaster recovery (DR) protection for all of the hospital’s critical applications; and the total cost had to fit within the department’s limited budget. The IT staff quickly narrowed the third-party options to two and, after carefully evaluating both, found that only one met all of its criteria: DataKeeper from SIOS Technology. While they needed a solution that was cost-effective, they were determined not to sacrifice quality or capabilities. With SIOS they found a solution that delivers carrier-class capabilities with a remarkably low total cost of ownership.
7. What are all the platforms SIOS DataKeeper works on?
SIOS DataKeeper software adds disaster recovery protection to a Windows cluster, or creates a SANless cluster for complete failover protection in Windows environments where shared-storage clusters are impossible or impractical, including any combination of physical, virtual, cloud, or hybrid cloud infrastructures. SIOS software runs business-critical applications in flexible, scalable private or public cloud environments, such as Amazon Web Services (AWS), Azure, and Google Cloud Platform, as well as hybrid clouds and on-premises datacenters.
SIOS also offers SIOS Protection Suite for Linux, one of a family of Linux and Windows clustering solutions that use artificial intelligence (AI) to improve IT resilience, maintain uptime, and lower operational costs.
Munson Healthcare is the largest healthcare system in Northern Michigan, currently serving more than 540,000 unique patients across 30 counties through a network of nine community hospitals. To manage such a huge base, an accurate and comprehensive record of patient data had become a prime necessity for flawless operations. After all, healthcare is all about the highest quality of patient care and productivity for users. A new EMR was needed to enhance operational efficiency, and also to ensure compliance with increasing regulations. The Complete Data Accuracy Platform from Naveego was not yet on their procurement list; it was assumed the new EMR would be effective enough to manage their newly acquired hospitals, outpatient facilities, and practice groups. But the new EMR system could not help with data integration. Something was seriously missing.
Munson’s staff was spending a substantial amount of time manually documenting, executing, and validating data; data mapping was becoming a big pain. Despite the investment in the EMR, staff had to perform manual spot checks to validate the successful migration of appointments. This overhead led to escalating costs, lost time, and scheduling complications, and the entire initiative was heading toward failure. That is where the hunt for a Complete Data Accuracy Platform began. Among the many systems evaluated, Naveego’s platform was the only one that met all their requirements for addressing the data accuracy challenges. With the new solution from Naveego in place, the Munson staff was not only able to increase their efficiency manifold, but adherence to processes also improved significantly. Everything was in place within a few days, with automatic creation and monitoring of quality checks.
Complete Data Accuracy Platform from Naveego Is A Revolution
Munson Healthcare is quite happy after selecting the Naveego Complete Data Accuracy Platform. They are able to manage, detect, and eliminate data issues well in advance, which is helping them achieve significant cost savings. Naveego achieves this by connecting multiple data sources into a single EMR system, proactively delivering Global Data Health for Munson Healthcare.
Signalchip has launched semiconductor chips for 4G/LTE and 5G NR modems. These are India’s first indigenous semiconductor chips, fulfilling Prime Minister Narendra Modi’s dream of ‘Make In India’. This is a phenomenal achievement: the silicon-chip innovation by Signalchip places India on the global map, and Bengaluru once again proves to be India’s key technology hub. This fabless semiconductor company has achieved something exemplary that many global companies have been striving for in various parts of the world. Telecom Secretary Aruna Sundararajan was present at the launch. The chips, named ‘Agumbe’, comprise the SCBM34XX and SCRF34XX/45XX series. They are the result of more than eight years of hard work and deep research by engineers with global-level capabilities working at Signalchip. This is a total game changer for the global telecom industry.
Signalchip unveiled four chips. SCBM3412 is a single-chip 4G/LTE modem that includes the baseband and transceiver in a single device. SCBM3404 is a single-chip 4×4 LTE baseband modem. SCRF3402 is a 2×2 transceiver for LTE, and SCRF4502 is a 2×2 transceiver for 5G NR standards. The RF sections cover all LTE/5G-NR bands up to 6 GHz. The chips also support positioning using NAVIC, India’s own satellite navigation system. Interestingly, Agumbe is a small remote village in Thirthahalli taluk, Shimoga district, in the Malnad region of Karnataka, India, known as the Cherrapunji of the South because of its heavy rainfall. The Agumbe series builds upon SCRF1401, India’s first RF transceiver chip, which adheres to high-performance wireless standards such as 3G/4G and WiFi and was designed by Signalchip in 2015.
Signalchip Brings A Revolution in 4G/5G
Aruna Sundararajan, says, “This is a proud moment for India’s Digital Communications industry. I congratulate the Signalchip team for designing India’s first indigenous semiconductor chips for 4G/LTE and 5G NR modems. India aspires to take a leadership role in developing inclusive 5G technologies for economic self-sufficiency and strategic needs of the country. These chips are a significant step in this direction as they have the potential to cater to the growing digital connectivity needs of the next 5 billion users, by enabling high-performance mobile networks at lower cost.” She added, “The Signalchip team’s persistence over the 8-year R&D period and the commitment shown by Sridhar Vembu to believe in this vision is indeed commendable.”
Himamshu Khasnis, Founder and CEO of Signalchip says, “Currently in India, all devices and infrastructure, whether imported or domestically manufactured, use imported silicon chips. Silicon chip design is a very challenging activity requiring high-cost R&D, deep know-how and mastery of multiple complex domains. Hence, this technology is not available in most countries. Given that wireless communication is central to almost all economic, strategic and domestic activities happening today, the ability to indigenously design and develop silicon chips is vital for the security and prosperity of our country.”
Finally, Sridhar Vembu, Founder & CEO of Zoho and Mentor to Signalchip says, “India has always had the talent required to build any technology. We just need to be patient and have enough capital to put it all together. It’s a long-term commitment. Through smart planning and relentless efforts, Signalchip has acquired the capability required to create any complex and globally competitive silicon chip, indigenously from India. I truly appreciate the patience and diligence the Signalchip team has shown to build this chip. Only long-term R&D can make Indian companies globally competitive.”
Data is the new currency, and data security has become one of the top priorities of any organization. The data security landscape of an organization comprises seven components: discovery, classification, prevention, rights management, access governance, database activity monitoring, and loss prevention. Database Activity Monitoring (DAM) is one of the most important components among these. The scope of DAM covers analyzing, monitoring, and recording access and usage for any kind of anomalous activity, along with a strong mechanism for raising alerts on potential attacks and compromises. DAM tools thus monitor SQL traffic to prevent internal abuse, and vendors providing active blocking are in high demand in this spectrum. When auditors talk about compliance and regulations like SOX, HIPAA, GLBA, and PCI, DAM is of utmost importance. DAM tools come in two types.
Database Activity Monitoring tools have two kinds of architectures: network-based and agent-based. Most DAM vendors, in fact, rely on the native audit capabilities of databases. Logically, any good DAM tool supports all the top-range databases, including Microsoft, Oracle, IBM, MySQL, MongoDB, Teradata, and PostgreSQL. As a matter of fact, most DAM tools come with additional features that enhance their data security capabilities, including data discovery, classification, rights and access management, DLP (data loss prevention), and encryption. A few of these also cover security information and event management (SIEM) and log management. Looking at the top players in the DAM space, there are many established vendors and some new-age ones. The stalwarts include Google, Microsoft, Intel/McAfee, IBM, and Oracle.
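As an illustration of the kind of rule-based checks a DAM tool performs on observed SQL traffic, here is a small, hypothetical Python sketch. The rule set, table name, and function names are invented for illustration and are not taken from any product:

```python
import re
from datetime import datetime, time

# Hypothetical rule set: which patterns count as anomalous is an
# assumption made for this example only.
SUSPICIOUS_PATTERNS = [
    re.compile(r"\bdrop\s+table\b", re.IGNORECASE),
    re.compile(r"\bselect\s+\*\s+from\s+credit_cards\b", re.IGNORECASE),
    re.compile(r"\bgrant\s+all\b", re.IGNORECASE),
]
BUSINESS_HOURS = (time(8, 0), time(18, 0))


def audit(statement: str, user: str, at: datetime) -> list[str]:
    """Return a list of alert strings for one observed SQL statement."""
    alerts = []
    for pattern in SUSPICIOUS_PATTERNS:
        if pattern.search(statement):
            alerts.append(f"suspicious statement by {user}: {pattern.pattern}")
    # Writes outside business hours are flagged as potential internal abuse.
    after_hours = not (BUSINESS_HOURS[0] <= at.time() <= BUSINESS_HOURS[1])
    if after_hours and statement.strip().lower().startswith(("update", "delete", "drop")):
        alerts.append(f"after-hours write by {user}")
    return alerts
```

A real DAM product would feed this kind of check from network capture or database audit logs and add active blocking; the sketch only shows the pattern-plus-context shape of the alerts.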
Database Activity Monitoring Is Essential
The new-age Database Activity Monitoring vendors include STEALTHbits, Zafesoft, BlueTalon, Imperva, Datiphy, Protegrity, Huawei (Hexatier), and DB Networks.
A recent study says almost 40% of IT workloads currently run in the cloud, and in the next two years this share, on average, will increase to almost 60%. With this significant shift to the cloud, organizations’ network performance monitoring requirements are changing at a fast pace. In fact, most organizations using traditional network monitoring tools feel these tools are no longer effective or useful, because they were not designed with cloud-related parameters in mind; most were designed for an on-premise architecture. That means, with more than 50% of an organization’s technology workload shifting to the cloud, network performance monitoring vendors need to work on the newly evolved use cases and business scenarios to find newer solutions catering to the latest needs of network monitoring.
The newer network monitoring demands deeper penetration into the networks connecting to cloud applications. As workloads shift to the cloud, insight into the network connecting to the cloud is becoming as critical as monitoring on-premise networks. As pressure from this new set of requirements increases, network performance vendors are busy innovating newer ways to achieve these goals. The interesting thing is that, so far, network monitoring was about networks an organization owned. In the case of the cloud, the requirement changes steeply to gaining insight into networks an organization doesn’t own. But since these external, third-party-owned networks carry important traffic for the organization, insight into them becomes important, irrespective of whether it is a SaaS model, other cloud-based applications, public networks, or private networks.
Changing Paradigms of Network Monitoring
So far, network monitoring tools were deployed in your own networks. That doesn’t hold good anymore; the boundaries are disappearing.
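As a rough illustration of monitoring a network path you don't own, the sketch below times TCP connections to an external endpoint and summarizes the latency. This is a simplistic assumption of how such a probe might work, not any vendor's method:

```python
import socket
import statistics
import time


def tcp_latency_ms(host: str, port: int = 443, samples: int = 3, timeout: float = 3.0):
    """Measure TCP connect latency, in milliseconds, to an endpoint we do not own."""
    results = []
    for _ in range(samples):
        start = time.perf_counter()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                results.append((time.perf_counter() - start) * 1000)
        except OSError:
            results.append(float("inf"))  # unreachable counts as a failed probe
    return results


def summarize(results):
    """Reduce raw probe results to a small health summary."""
    reachable = [r for r in results if r != float("inf")]
    if not reachable:
        return {"reachable": False}
    return {
        "reachable": True,
        "median_ms": statistics.median(reachable),
        "worst_ms": max(reachable),
    }
```

For example, `summarize(tcp_latency_ms("example.com"))` would report whether a SaaS endpoint is reachable and how its connect latency is trending, which is the kind of external-path insight the newer tools aim for.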
Artificial Intelligence and Machine Learning (AI and ML) can be a boon for an organization; it all depends on the appropriate use cases and their deployment. It is yet to be ascertained whether organizations prefer specialists over generalists or vice versa. Does an organization prefer full-time employees responsible solely for IT infrastructure? Or is there a shift from specialists toward generalists? Is a manpower increase directly proportional to the increase in workload, especially in IT organizations? I think not. Despite increasing workloads, the ratio of new specialists is significantly low. That means specialists within organizations are being asked to become generalists, because a generalist can easily handle two or more different kinds of jobs, while a specialist stays stuck to only one specific kind of job. That is where the significant role of AI and ML comes in.
As a matter of fact, tagging people as specialists has its own merits and demerits, and more demerits, I think. In most organizations, IT professionals have to do more with less: deliverables are increasing, timelines are shrinking, and the expectations of top management are on the rise. In that case, the only savior for an organization is to take the help of AI and ML. It is necessary, therefore, to bring in products that are either capable of capitalizing on artificial intelligence and machine learning, or can be made capable with a little tweaking here and there. Once this starts happening, the IT staff will be able to predict problems well in advance, in some cases well before the problem occurs. These products, in fact, provide a simple way to manage infrastructure better. Basically, automation is the only way forward.
AI and ML need to be capitalized fast
To avoid the complexity of finding highly skilled specialists, and to save the organization a substantial amount of recurring cost, AI and ML are the solution.
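A minimal sketch of the "predict problems in advance" idea, assuming a simple rolling-window statistical baseline rather than any specific vendor's ML model:

```python
import statistics
from collections import deque


class MetricWatcher:
    """Flag a metric sample as anomalous when it strays beyond
    `k` standard deviations from a rolling window of recent values."""

    def __init__(self, window: int = 20, k: float = 3.0):
        self.history = deque(maxlen=window)
        self.k = k

    def observe(self, value: float) -> bool:
        anomalous = False
        if len(self.history) >= 5:  # need a minimal baseline first
            mean = statistics.fmean(self.history)
            stdev = statistics.pstdev(self.history)
            if stdev > 0 and abs(value - mean) > self.k * stdev:
                anomalous = True
        self.history.append(value)
        return anomalous


# Usage: steady CPU readings around 50 do not alert; a sudden spike does.
watcher = MetricWatcher()
flags = [watcher.observe(v) for v in [50, 51, 49, 50, 52, 50]]
spike = watcher.observe(500)
```

Even this crude baseline lets one generalist watch many metrics at once; production tools layer learned seasonality and forecasting on top of the same principle.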
Are you joining me this June for the 2nd edition of the Global LiFi Congress in Paris? It is happening on June 12 and 13 at Salons Hoche (near the Arc de Triomphe). If you decide to join and need a special discount, do let me know. All the players in the scientific world, including business people in relevant fields and the global press and media, are quite enthusiastic. The first edition of the World Congress was a big hit, with media coverage from across the globe; within a short span it has already established itself as a unique professional platform of repute. Before going into further detail about the event, let us understand what this technology is all about. LiFi technology transmits data and location information using light, via LED lights. The two days are going to be about collaboration and learning.
Basically, it is a technology parallel to WiFi, which achieves the same using radio waves. The first Global LiFi Congress was very insightful. Expert speakers enlightened attendees on various topics, such as the efficiency of LiFi in modern infrastructure setups like smart cities, and LiFi in relation to 5G communication networks. As a matter of fact, the first edition of the World Congress witnessed all the major players of LiFi, like Renault, EDF, Signify, RATP, and OLEDCOMM. Going by the zeal and response from across the globe, participation this year is definitely going to be much higher than last year. The congress aims to align technology and business experts so that they can collaborate to explore all possible offerings of LiFi technology, which is ultimately bound to open up a large number of opportunities over those two days.
Global LiFi Congress Foresees Huge Response
The second edition of the Global LiFi Congress in Paris this year has many key focus areas, including logistics, transportation, cybersecurity, aeronautics, R&D, robotics, AI, IoT, greenfield projects, and so on. Scientists and scholars will talk about the latest developments and emerging new standards in this field. In fact, the significance of LiFi is not limited to businesses: it can do wonders for communities, as it is a breakthrough technology with ample scope in everyday life. As a matter of fact, LiFi can amplify the benefits of many existing technologies and applications. Hope to see you all there on June 12 and 13. By the way, why don’t you have a look at the following video?
Today’s workforce demands dynamic creation tools; static tools with orthodox features are becoming obsolete. The first three posts on the Zoho Office Suite can be accessed by clicking these respective links: Post 1, Post 2, and Post 3. In this final post, I will touch upon some of the key features of the Zoho Office Suite that make it a set of Dynamic Creation Tools.
Let us start with Zoho Writer. Zoho Writer has built-in automation features: document merging, form-based document creation, fillable documents, and one-click signature collection. The user can also work completely offline in the web, iPad, or mobile versions of Zoho Writer; the document automatically syncs to the user’s account once the connection is restored. There is a distraction-free mode in Zoho Writer, in which all pings and pop-ups are disabled. Similarly, there is a focus mode.
In focus mode, Zoho Writer highlights the paragraph the user is working on while dimming all other text in the document. To bring back old memories of working on a typewriter, the user can enable typewriter sounds. These are some of the dynamic creation tools I have mentioned; there are many more in the Zoho Office Suite. For instance, Zoho Show gets you a user-focused interface that helps authors populate slides faster. It also offers a variety of themes and options to include tables, path animations, charts, and smart elements.
As a matter of fact, Zoho Sheet is the world’s first spreadsheet application that offers data-cleaning. This ensures the fixing of all inconsistencies and duplicate data. Zoho Show can talk to Apple TV and Android TV seamlessly. The user’s mobile becomes a controller in that case. With its help, the user can beam slides on multiple TVs.
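The sort of inconsistency fixing and deduplication described above can be sketched in a few lines of Python. This is a generic illustration of what a data-cleaning pass does, not Zoho Sheet's actual algorithm:

```python
import re


def clean_rows(rows):
    """Normalize whitespace and case inconsistencies in each cell, then
    drop duplicate rows, keeping the first occurrence of each."""
    seen = set()
    cleaned = []
    for row in rows:
        normalized = tuple(
            re.sub(r"\s+", " ", str(cell)).strip().lower() for cell in row
        )
        if normalized not in seen:
            seen.add(normalized)
            cleaned.append(normalized)
    return cleaned


# Usage: two rows that differ only in case and spacing collapse into one.
rows = [("Alice ", "New  York"), ("alice", "new york"), ("Bob", "LA")]
result = clean_rows(rows)
```

Real spreadsheet cleaners add fuzzy matching and type-aware fixes (dates, phone numbers), but normalize-then-deduplicate is the core of the operation.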
Dynamic Creation Tools Are The Core Strength of Zoho Office Suite
Deluge is another classic example of the Dynamic Creation Tools that the Zoho Office Suite presents. It helps users create custom, personalized functions with the help of this proprietary scripting language. Zoho Office Suite pricing suits individuals, SMBs, startups, and mid-sized or large enterprises. For a single user, it is free. SMEs can avail it at INR 99 per user per month, and large enterprises at INR 399 per user per month.
David Smith, founder, and principal of Inflow Analysis says, “The future of work will be characterized by secure, contextual, and intelligent digital workplace platforms that are fully integrated across collaboration, productivity, and business applications to support seamless workflows. The approach Zoho is taking shows deep understanding of this convergence and the critical need for a fully integrated platform that supports how people actually work. We believe this a challenge to major technology providers that need to address serious gaps in their portfolios and add adjacencies.”
This is the third post in this series on the Zoho Office Suite. In the first post, we discussed the Suite, and I talked about its components in the second. In this post, we will elaborate further on those components, and on how well they integrate to make the suite a class apart. Overall, the suite helps in creating better documents, spreadsheets, and notes intelligently with the help of the AI-enriched Zia. The first component is Zoho Writer, Zoho’s document creation application. Zia makes the document creator’s life easier in many creative ways: it not only detects context-based grammar mistakes, but also rates the overall readability score of the document and suggests style corrections to the user in real time. This helps the user improve overall writing quality and end up with a better document.
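Rating the "overall readability" of a document, as Zia does, is classically done with formulas such as the Flesch reading-ease score. The sketch below implements that public formula with a crude vowel-group syllable estimate; the estimator is an assumption, and this is not Zoho's implementation:

```python
import re


def flesch_reading_ease(text: str) -> float:
    """Flesch reading ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    Higher scores mean easier text. Syllables are estimated from vowel groups."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0

    def syllables(word: str) -> int:
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    total_syllables = sum(syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / sentences)
            - 84.6 * (total_syllables / len(words)))


# Usage: short words in short sentences score far higher than jargon.
simple = flesch_reading_ease("The cat sat. The dog ran.")
dense = flesch_reading_ease(
    "Incomprehensibility characterizes multitudinous bureaucratic documentation."
)
```

A real writing assistant would combine such a formula with grammar and style analysis, but the score alone already separates plain prose from dense prose.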
While Zoho Writer is for creating documents, Zoho Sheet is for spreadsheets, and Zia assists here too, to a large extent. In fact, Zia gets you deeper insights into data sets: it automatically shows you the most relevant charts and pivot tables drawn from the available data. Zoho Sheet also supports Natural Language Querying (NLQ); the user can ask Zia questions about their data, and Zia responds intelligently with a relevant function, pivot table, or chart that can be added to the spreadsheet. Next comes Zoho Notebook, the newest entrant to the Zoho Office Suite and one of the best and most advanced note-taking tools in the market globally. Zia Voice, an intelligent conversational AI, assists users in creating customized “smart cards” by pulling visuals, shopping lists, instructions, etc., from their favorite websites, all through voice commands.
Zoho Writer along with Zoho Sheet and Zoho Notebook makes an ideal suite
In the previous post, we talked about the Zoho Office Suite. In this post, we will talk about the Zoho Office Suite components and their integration capabilities.
The good part is that all the Zoho Office Suite components are well integrated among themselves. Not only that, these applications are also integrated with Zoho’s communication tools like Zoho Mail and Cliq, a cross-platform messaging app. So calling them Zoho’s collaboration tools will not be a misnomer. These collaboration tools also include Zoho Projects and Zoho Connect, a private social network for the enterprise, along with a number of Zoho’s other business applications. With the help of these contextual integrations, the user is empowered to merge data from Zoho CRM into a sheet or document, and then send it for signature through Zoho Sign. That creates a flawless workflow mechanism within its own ecosystem. Imagine the amount of work and effort saved by doing this in an automated, integrated manner.
Zoho Office Suite Components Are Well Integrated
In this context, Sridhar Vembu, CEO, Zoho Corp. says, “We built Zoho Office Suite to be the most integrated suite of productivity tools of its kind. For decades, Zoho has provided tools for users to share and work on documents quickly and efficiently. Now, with this new version of Zoho Office Suite—empowered by Zia—Zoho’s integrations are tighter than ever before, providing seamless collaboration across departments and teams. We’ve added features and tools that can’t be found anywhere else, such as Notebook’s smart cards, Sheet’s data-cleansing tool, and Show’s integration with Apple TV. Just like the line between productivity and collaboration applications is fading, we see the line between business, collaboration, productivity, and communication apps fading. It is the combination of these apps, contextually integrated, that makes the modern worker exponentially more productive!” That is a wonderful perspective about Zoho Office Suite Components.
Zoho Office Suite Components help to create a better document, spreadsheet, or Note with the help of Zia. We will talk about various components in the next post. Continued »
Is this the one ultimate office suite capable of catering to all your future needs? Well, you need to experience it and assess it for yourself. But one thing is for sure: it is far beyond what you and your enterprise are living with, whether Microsoft or Google. It has an edge over both these giants in a true sense and in many ways. For that, obviously, the proof of the pudding is in the eating. Experiencing the Zoho Office Suite is not cumbersome; rather, you will be happy to use it. It is, of course, the best enterprise office suite, and the next-generation office suite, in fact. Besides handling routine office suite tasks, it empowers businesses with dynamic AI features. You also need to explore its first-to-market enhancements, which are bound to make life a little more comfortable for your marketing and sales staff.
Zoho Office Suite comes with Zia, Zoho’s AI-powered assistant, and is also capable of integrating with Apple TV and Android TV. The story doesn’t end there: this classy office suite has a lot more in store for an enterprise, including proprietary data cleansing and smart note card functionality. The four key components of the Zoho Office Suite are Zoho Writer, Zoho Sheet, Zoho Show, and Zoho Notebook, and all have a common catalyst: Zia, which has matured a lot by now. The suite thus enables deep contextual collaboration to help users and enterprises meet diverse challenges and end-to-end business requirements. And the Zoho suite is capable of delivering the same set of results irrespective of organization size and number of users.
Zoho Office Suite will surpass all others by 2022?
Zoho Office Suite is equally good for a startup or small business as well as a large enterprise.
We shall continue the same in the next post. Continued »
A. S. Narayanan, President, National Association of Deaf (NAD) says, “This Summit is a reflection of changing times. It is so heartening that the discourse on digital accessibility has expanded to include the entire spectrum of disabilities. Just yesterday the Ministry of Information and Broadcasting has requested all private television channels to include sign language in all programs and that is such a welcome step in the right direction.”
Arman Ali, Executive Director, National Centre for Promotion of Employment of Disabled People (NCPEDP) says, “Just like disability is not homogenous, the accessibility solutions also cannot be homogenous. We must acknowledge the various categories of disabilities and their requirements and expectations from technologies. Platforms such as Microsoft’s Empowering for Inclusion summit spark dialogues around the need for inclusive technology and how solutions need to evolve and be inclusive of people with cross-disability. There is an urgent need for different stakeholders, such as persons with disabilities, government, corporates, and NGOs, to come together and work on these solutions.”
Microsoft’s AI for Accessibility is a five-year program with funding of $25 million. The target is to enhance human capabilities with the help of AI, aiming to benefit the more than 1 billion people around the globe living with a disability. Technological advancements enable AI to see, hear, and reason with increasing intelligence. Some of the best use cases, such as real-time speech-to-text transcription, predictive text functionality, and computer vision capabilities, showcase how AI is playing a vital role in helping people with disabilities. Microsoft’s approach to accessibility can be found on the Microsoft Accessibility website and in Microsoft India’s video on Empowering for Inclusion.
With this, we conclude this three post series on AI for Accessibility. Any thoughts are welcome in the comment section.
We continue from the previous post on AI for Accessibility. Initiatives like Deque System’s accessibility testing tool; Adobe Acrobat for accessible documents; a video relay service for the hearing impaired by Dr. Philip Ray Harper; and Microsoft Office 365 for accessibility are some of the examples of AI for accessibility. Shakuntala Doley Gamlin, Secretary, Department of Empowerment of Persons with Disabilities, Ministry of Social Justice and Empowerment, said while inaugurating the event, “Including people of all abilities in the development process adds to a nation’s social and economic progress. Our vision is to ensure that we empower them with equal access and opportunity, and strong public-private partnerships will go a long way in ensuring this. The Microsoft Accessibility Summit provides an ideal platform to bring together policymakers and influencers to understand the policy environment and chart a direction for making life, experiences, and opportunities accessible to all.”
Dr. Sriram Rajamani, distinguished scientist and Managing Director, Microsoft Research India says, “At Microsoft, we believe there are no limits to what people can achieve when technology reflects the diversity of everyone who uses it. Cloud and AI solutions are opening up a world of possibilities, empowering people with disabilities with tools that support independence and productivity. The Summit is a significant step forward in advancing our efforts towards sensitizing stakeholders and partners on the business and social value of accessibility. As we continue to learn and grow, we hope to inspire other entities and organizations to build and accelerate their accessibility and inclusion programs.”
Inclusivity is not an initiative limited to a handful of persons. It requires the involvement and engagement of every human being; only then can the actual targets be achieved. An interpreter, for example, is a bridge between a person with a disability and a person without one. But in reality, isn’t the interpreter serving both ends?
We conclude AI for Accessibility series in the next post. Continued »
AI is the best possible way to empower people with disabilities. That was the core theme of the two-day “Empowering for Inclusion Summit 2019” hosted by Microsoft. The sole purpose of the summit was to encourage greater collaboration in order to attain optimal empowerment for people with disabilities, and this can only be achieved using AI (Artificial Intelligence). This was the second edition of the initiative, and the beauty of it is that the platform has evolved to include multiple stakeholders: non-profits, developers, enterprises, academia, scholars, and experts, all with the common goal of creating inclusive technology solutions. AI for accessibility technology plays a substantial role in enabling people to access, collaborate, and deliver. The credit for creating this platform goes to Microsoft India, which is taking the lead in enhancing it to every possible extent.
The second edition of Microsoft’s Accessibility Summit – Empowering for Inclusion was held on 15th and 16th February in New Delhi, in collaboration with the National Association of the Deaf (NAD) and the National Centre for Promotion of Employment of Disabled People (NCPEDP). The core theme this year was AI for accessibility. The summit plays a pivotal role in bringing multiple stakeholders together around inclusive technology. The primary aim is to raise accessibility standards and give a new dimension to policies in order to create a better, more accessible India. Everyone in the ecosystem has an important role in achieving these goals, and the initiative can come from any stakeholder in the system: people with disabilities, policy-makers, service providers, people engaged in support systems, CSR, and developers of assistive technologies.
AI for Accessibility is a great mission
The journey of AI for accessibility is, in fact, a never-ending journey. Continued »
DH2i’s New DxOdyssey, Software-Defined Perimeter (SDP) Software, Promises to Ensure Security for Remote User Access to Cloud Services
An Alternative to VPNs, Which Present Management Headaches and Security Vulnerabilities
From the IT department to the C-suite, data security has become a key priority, driven by business and competitive requirements as well as regulatory compliance. Until recently, VPNs were considered one of the most secure methods for the transfer of data. However, it has become abundantly clear that in most cases VPNs can meet neither the security requirements of today’s business environment nor regulatory compliance mandates. And for many IT departments, VPNs have been nothing but an expensive, time-consuming management headache. Today, I speak with Don Boxley, CEO and Co-Founder of DH2i (www.dh2i.com) about this increasingly critical topic.
Q: The undeniable benefits of the cloud have acted as a catalyst for datacenters to expand beyond their physical walls. However, this expansion also introduces potential security issues. Datacenters have typically turned to VPNs – could you discuss the plusses and minuses?
A: Yes, datacenter managers have historically turned to VPNs as their security technology. They did so because VPNs let them give users secure connections to cloud-based services. On the plus side, it’s a legacy perimeter security technology they’re very familiar with. On the minus side, VPNs are obsolete for the new IoT reality of hybrid and multi-cloud: they weren’t designed for those environments, and they create too large an attack surface. These are the issues that surround using traditional approaches such as VPNs to secure hybrid cloud environments.
Q: How are these problems exacerbated for organizations that wish to grant strategic partners access to infrastructure and information?
A: Providing such access represents a critical security risk that can introduce a multitude of security threats to your enterprise. Besides the threat of potentially introducing malware into your systems, there are other possible technical and business dangers of which to be aware. First, granting system access to third parties instantly lowers your security level. If a vendor that you invite in has feeble security controls, they now will become the weakest link in your security chain. If an outside attacker compromises that vendor’s system, this malevolent force can use that as a backdoor into your network. In parallel, as that third party’s risk increases, so does yours. Some of the largest and most publicized retail data breaches in history have been linked back to third-party vendors.
Q: What types of solutions/approaches overcome the limitations just discussed?
A: One approach to securing remote user/partner access to cloud services is to deploy a software-defined perimeter (SDP). An SDP starts with the question: does every remote user/partner really need full access to my network to transact business? An SDP enables organizations to give remote users/partners access to the specific computing services they need without giving them a “slice of the network.” Put another way: if you want to virtually eliminate network attack surfaces, get users off your network by using software-defined perimeters. This would be an essential component of moving the organization’s network to a Zero Trust (ZT) architecture. The analyst firm Forrester defines a Zero Trust architecture as one that abolishes the idea of a trusted network inside a defined corporate perimeter. In the case of remote user/partner access to cloud services, ZT would involve the creation of micro-perimeters of control around computing assets to gain visibility into how remote users/partners use services and data across the cloud network to win, serve, and retain customers.
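The per-service idea behind an SDP can be sketched in a few lines. This is purely an illustration of the concept, not DxOdyssey’s API; the user names, service names, and policy shape are all assumptions invented for the example. The point is the contrast with a VPN: instead of answering “yes” to everything once a user is connected, a Zero Trust micro-perimeter answers per user, per service, with default deny.

```python
# Conceptual sketch of SDP-style, per-service access control.
# All names here (users, services) are hypothetical examples.

ACCESS_POLICY = {
    # identity          -> only the services this identity may reach
    "partner-analyst": {"reporting-api"},
    "remote-dba":      {"sql-primary", "sql-replica"},
}

def authorize(user: str, service: str) -> bool:
    """Zero Trust default-deny: access is granted per service, per user."""
    return service in ACCESS_POLICY.get(user, set())

# A VPN-style model would effectively answer True for any service once the
# tunnel is up; a micro-perimeter evaluates every request individually.
print(authorize("partner-analyst", "reporting-api"))  # True
print(authorize("partner-analyst", "sql-primary"))    # False
```

The "micro-perimeter" is the policy boundary drawn around each service rather than around the whole network, which is what shrinks the attack surface.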
Q: You recently introduced a new software called DxOdyssey. Could you tell me more about it?
A: Sure. This fall, DH2i introduced a new network security software product, DxOdyssey, specifically designed to enable organizations to dynamically deploy highly available micro-perimeters that isolate services for fine-grained user access without using a VPN. DxOdyssey was purpose-built to give medium and large organizations the perimeter security model needed for cloud-centric network connectivity with virtually no attack surface.
Q: I believe this is something that anyone concerned with the data security of their organization should check out. Where can one go to learn more?
A: Please visit http://dh2i.com/dxodyssey/ for more information and/or to schedule a live demo.
When you hear or read “low-code,” what comes to your mind first? Is it that you have to bend low and write code? Well, jokes apart, the technology has reached a level where you can develop a low- to mid-complexity application with the help of the Zoho Creator low-code platform. And for this, you don’t need to be a hardcore developer. You don’t even need to be an expert in any development language. You need just two basic things to become proficient in Creator. One, some ground-level command knowledge. Two, training from an expert, maybe from Zoho or from someone who is already an expert in this particular technology. Another interesting part is that you can integrate the new piece with an existing application.
So, for instance, suppose you have SAP running as the core business application in your organization, and a new requirement comes from a user group asking for new functionality. There are two ways of doing it. One, call in SAP experts, pay them a hefty per-day cost, and get it done. The other is to develop the piece in the Zoho Creator low-code platform, calling all the essential data from the existing SAP database and then pushing the result back into SAP to flow further through various business processes. That calls for development in Creator plus backward and forward integration with SAP, with no new table creation at all, or perhaps a couple of interim tables to store data before pushing it to the SAP tables. That means a lot of relief in terms of money, time, and manpower.
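The round trip described above can be sketched in plain Python. The field names, the "priority" business rule, and the stand-in functions are all hypothetical; a real Zoho Creator app would use its built-in connectors or Deluge scripts for the actual API calls. The sketch only shows the shape of the flow: read from the core system, compute the new functionality in the low-code layer, stage it in an interim table, and push it back.

```python
# Illustrative flow: core system -> low-code layer -> interim table -> back.
# All names and the business rule are invented for this example.

def fetch_open_orders(source):
    """Stand-in for the read call against the core system's API."""
    return [o for o in source if o["status"] == "OPEN"]

def compute_priority(order):
    """The 'new functionality' built outside the core system."""
    return dict(order, priority="HIGH" if order["value"] > 10000 else "NORMAL")

def push_back(target, records):
    """Stand-in for the write call that returns results to the core system."""
    target.extend(records)

sap_like_source = [
    {"id": 1, "status": "OPEN", "value": 25000},
    {"id": 2, "status": "CLOSED", "value": 900},
]
interim_table = []  # the optional interim staging table mentioned above
push_back(interim_table,
          [compute_priority(o) for o in fetch_open_orders(sap_like_source)])
print(interim_table)
```

Only the open order crosses the boundary, gets enriched, and lands in the interim table ready to be pushed into the SAP tables.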
Low Code Platform Is A Reality Now
As a matter of fact, Zoho Creator can be learned by non-IT persons as well. The interested person just needs good business knowledge and a basic interest in learning a few fundamentals of Creator. Then it is merely a matter of practice and application.
Now, this is what I call real, customer-focused innovation. Otherwise, a lot of organizations keep claiming a lot in the name of innovation while in actuality it is of no use. ASUS launched the world’s smallest notebooks in the 13″, 14″, and 15″ segments, named ZenBook 13, ZenBook 14, and ZenBook 15. These are ultrathin, with a unique four-sided display technology called “NanoEdge.” The new NumberPad is not in its usual place: it is uniquely and innovatively placed on the touchpad. So the NumPad and touchpad now reside in the same location, and the laptop intelligently recognizes whether a touch is intended for the NumPad or the touchpad. Effectively, the touchpad has become a multilayer surface. That leaves scope for better and larger keys, increases the keyboard size significantly, enhances productivity, and lets the user work with better pace and concentration.
Login on these ZenBooks (the world’s smallest notebooks) is through a powerful 3D IR camera that recognizes the face and logs the user in even in low-light environments. There is an ErgoLift hinge that raises the keyboard at the rear when you open the laptop, which helps in comfortable typing; technically and design-wise, it also improves cooling and audio performance. The machines are powered by 8th Gen Intel Core CPUs along with GeForce graphics cards, and users get gigabit Wi-Fi. These are just a few of the features. The real revolution is in the design, which is definitely mindful and user-centric. As a matter of fact, the ZenBook 13 is smaller than an A4-size sheet. That is phenomenal. The numeric keypad is LED-illuminated, which gives a different kind of feel to the user.
The smallest notebooks are the latest ZenBooks from ASUS
The lightweight ZenBooks, the world’s smallest notebooks, are in reality powerhouses of unmatched quality and design.
The Chief Information Officer is the chief custodian of an organization’s information. Mostly his role is to take care of digital information, but when his role combines with that of a CISO (Chief Information Security Officer), physical information also falls within his purview. Similarly, the CISO as a separate role has to ensure the right measures are in place for the safety and security of any kind of organizational information. Scrutiny of information and of information flow is straightforward: he can create appropriate processes and ensure strict adherence to them. But when it comes to the safety of information in connection with employees or external stakeholders, is he responsible for scrutinizing those people too? I think yes. So, basically, in that case, a background check of a new recruit also becomes essential.
Recently, an online technology magazine carried news about the CIO of a large retail organization launching new mobile initiatives. It was a very basic mobile app, yet the way it was publicized did not match the initiative: a very small step was being projected as something extraordinary. That is synthesized, induced news. I commented below the article that this small thing should have been done two decades ago in such an old and large organization with such a large IT setup. It is a tragedy when technology heads make a mockery of technology. Actually, organizations have no criteria to measure the intellectual and monetary loss of not automating a business-critical process. Something that could have been done years back but stays uninitiated or under process for years denotes lethargy.
Another example is a CIO sacked for financial fraud. He was booked for taking money from vendors on a few of the big deals happening in the organization. He was not actually sacked outright but was told to put down his papers and move out immediately. Today he is CIO of another large organization, probably playing similar games. The sad thing is that organizations recruiting C-suite people sometimes never come to know about these darker sides of their personalities.
This is the concluding post of our discussion with Barry Phillips, CMO, Maxta Inc. The first post is Software Model Versus Appliance Model – Which Business Model? In the second post, he tells how Inflexible Architecture Will Create Much Bigger Issues For IT. In the third post, the previous one, he elaborates on the first two essential passing criteria of hyperconvergence software. Let us conclude the series with the remaining three important parameters.
Barry asks – “Can you add capacity within the server? The only way to add capacity with an appliance vendor is by adding another appliance. Even though some vendors offer a storage-only node, the step-up cost of another “pizza box” isn’t trivial. True hyperconvergence software enables you to add capacity to an existing server by adding drives to open slots, swapping in higher-capacity drives, or by adding servers. If you can only add capacity by adding nodes, you have a fake software model.”
True Hyperconvergence Software
Barry says, “Are you being forced into the same appliance software licensing model or do you have a choice? Hyperconverged appliances tie the software license to the appliance, so when you refresh your hardware you get the privilege of repurchasing the software. This is a “term license,” which means you get to buy the software over and over again, and it’s the only option you have in a fake software model. While many software companies are starting to offer term licenses to provide subscription-like pricing, nearly all software companies still offer a perpetual license that you own forever. You should have a choice of perpetual or term licensing. Do you like the thought of owning the software for life, but don’t want to pay for it all upfront? Just lease the software from any number of leasing companies. It gives you the best of both worlds.”
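The licensing difference Barry describes is easy to see with rough numbers. The prices and refresh count below are made-up assumptions, purely to show why repurchasing a term license at every hardware refresh adds up compared with a perpetual license bought once.

```python
# Illustrative cost comparison of term vs. perpetual licensing.
# All figures are hypothetical, chosen only for the arithmetic.

TERM_LICENSE_PER_REFRESH = 40_000   # repurchased at every hardware refresh
PERPETUAL_LICENSE_ONCE = 70_000     # bought once, owned forever
REFRESHES = 3                       # e.g. refreshing hardware every 4-5 years

term_total = TERM_LICENSE_PER_REFRESH * REFRESHES
perpetual_total = PERPETUAL_LICENSE_ONCE

print(f"term model over {REFRESHES} refreshes: ${term_total:,}")
print(f"perpetual model over the same period:  ${perpetual_total:,}")
```

Even though the perpetual license costs more upfront in this made-up example, the term model overtakes it by the second refresh, which is exactly the trap of tying the license to the appliance.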
Barry concludes – “Can you add more memory and CPU resources? Just like adding storage capacity, you should be able to add additional memory or compute whether inside an existing server or by adding a compute-only server. A true hyperconvergence software model scales storage independent of compute. A fake hyperconvergence software model operates the same way as the appliance model.”
This post is the third in the series, continuing the previous two posts in which Barry Phillips of Maxta Inc. talks about the five essential components of hyperconvergence software. You can read the first post by clicking here and the second post by clicking here. Any hyperconvergence software that doesn’t fulfill the following criteria is not true hyperconvergence software. And who can tell it better than Maxta Inc.? Here are the first two of the five important criteria:
Whenever there is new software to be put in production, do you need to buy new hardware? Every time? As Barry Phillips, CMO, Maxta Inc. says, “Can the software be installed on your existing server hardware? This is the first sniff test of whether it is a true software model or a fake software model. Of course, you need to make sure the hardware has the right specifications to run the software, but you shouldn’t need to buy new server hardware. And don’t get fooled by the old trick of being able to run “trial” software on your own hardware, but you have to buy new hardware to put the software in production. True infrastructure software vendors like Microsoft, Citrix and VMware do not make you buy new hardware to run their software.”
Barry questions, “Does your server hardware have to be from an approved list of server SKUs?” and then elaborates, saying, “If you do want to refresh your hardware when you implement hyperconvergence, does the hyperconvergence software vendor limit you to a certain set of server SKUs? If so, that isn’t really software; it’s just an appliance vendor separating out the appliance software from the confined set of appliance hardware.”
The basic question is this: there are a lot of vendors in the market offering different kinds of hyperconvergence solutions. Do they really provide a true hyperconvergence environment? Do they fulfill the above two criteria? Let us look at the other three criteria in the next post.
In continuation of my previous article on the software model taking drastic precedence over the appliance-based business model, let us try to encapsulate the five essential properties of true hyperconvergence software. Are most IT shops still creating inflexible architecture? Barry Phillips, CMO, Maxta Inc., says, “Once that appliance-based product has taken off, the company will want to change to a software business model from a profitability perspective. This can be a difficult pivot to make financially since revenue decreases before profitability improves, and it changes how the sales teams are paid. If the pivot is made successfully, then the company is much more profitable and financially stable.”
Barry adds further, “Even if a pivot to software works out for the vendor, it does not always work out well for the customer – especially if the software model is an appliance “in software clothing.” If you’re considering hyperconvergence software, make sure it’s not an appliance in disguise. Many vendors will claim to offer hyperconvergence software, but still significantly restrict how their solution can be deployed and used. Ask vendors these questions to determine how much (or how little) flexibility you’ll get with their software.” “As the hyperconvergence market shifts from appliance offerings to software, vendors that started out selling hardware platforms will need to shake both the appliance business model and the appliance mentality. As you evaluate hyperconvergence, always understand what limitations and costs will be in four or five years when you need to refresh or upgrade”, he continues.
Talking further, Barry adds, “Infrastructure platforms are evolving quickly, so the ability to scale, choose and change hardware platforms, and use different hypervisors will certainly make life easier. Getting locked into an inflexible architecture will create much bigger issues for IT down the road. By asking the right questions upfront, you’ll be able to navigate the changing landscape.”
We will continue with Barry’s ideation on hyperconvergence software in the next post. Continued »
Who can understand hyperconvergence software better than Maxta Inc.? We already covered an article last month; you can read it by clicking here. Let us dig a little deeper and try to understand the five basic requirements for hyperconvergence software. Before that, some basics. Are financial analysts clear about the concept of an organization switching from an appliance model to a software model? That switch is, in fact, making stock prices soar, and the fundamental reason is the software model itself. When you weigh the pros and cons of the two business models, the software model obviously takes a larger leap over the appliance model. But if software is the brighter business model, then why do companies still keep shipping appliances? One really needs to understand the whole gamut behind it.
Obviously, selling appliances is much easier through any channel, directly or through a distributor and reseller network. On the same note, when it comes to software, it is equally easy to build an application for a specific hardware platform or a specific set of platforms; supporting a limited number of hardware platforms is not difficult. But such a design has a lot of limitations that invite a large number of troubles, the most important being that such software can never achieve universal acceptance. That is the basic issue Maxta tries to overcome for organizations of any size launching or using any kind of software, and it is the most critical differentiator between the appliance-based business model and the software business model. Most organizations plan to change to a software business model for higher profitability.
Appliance model is becoming an obsolete business model
The appliance-based model is becoming obsolete because hyperconvergence software has higher capabilities.
We shall continue this discussion in the next few articles. Continued »
Any kind of disruption creates two reactions. One, fear, out of which players become defensive and start stepping back. Two, a very few players look at it as a new pool of opportunities and start exploring innovative ways to cater to it. Most players in the former category get into a shell and become history sooner or later. They keep sitting on laurels achieved in the past, because they declined to participate in the new game of warriors and hence have nothing to prove on the newer battlefield of business. Most players in the latter category succeed despite swimming upstream because of two latent forces coming from within: courage, and innovative ideas taking the shape of reality. Violin Systems very distinctly stands apart as a spearhead in this category. Let’s see what makes them a class apart in technology.
Violin Systems is synonymous with extreme-performance enterprise storage, provided at the price of traditional primary storage. The sole aim is to empower enterprises to get maximum leverage from their business-critical data in a manner never thought of by other technology players across the globe. The solution provides unmatched low latency and high IOPS, and includes all the seriously essential data services: data protection, data reduction, and business continuity, to name a few. Businesses can easily bank on Violin Systems to achieve a new level of application performance with extreme reliability, taking their business service levels to new heights while drastically reducing costs. Immediate access to information is an organization’s top dream, because that is the key to higher revenue and a substantial increase in customer satisfaction.
Violin Systems is synonymous with extreme-performance enterprise storage
In today’s scenario, which organization in the world would not like to be a data-driven business? Violin Systems helps enterprises drive their business-critical applications to support operations, quality, and delivery across their entire stakeholder ecosystem. It also helps enterprises easily scale and extend their competitiveness, staying ahead of the others in the fray. That is why enterprise customers rely on Violin Systems for unmatched extreme performance and drive their business without any compromise.
There is a constant shift in the role of developers globally. Actually, it is not completely a shift; it is an additional role embedding itself within their existing role of coding and development: test as you build. The new mantra is to test while you develop. Obviously, most of it is manual testing of the small pieces of code being built, almost like unit or segment testing. This doesn’t require any additional skillset in developers; they just have to test what they are building. It is, rather, a small shift in mindset only. A developer first has to convince himself that it is very much his job to perform it, because it is his own code that needs to match the business requirements 95%, if not 100%.
It is a kind of building assurance along with the coding, as an additional part of the developer’s role. And the shift is not happening only in the developer’s role; testers are facing a similar overhaul. The scope of testing is changing in a big way. In fact, the ultimate goal of any testing is to increase quality, and hence self- and customer confidence. Testing and development are getting closer than ever before, and this is proving to be a smarter, more efficient move. While developers do their own testing, testers are expected to become automation engineers. Testers doing traditional manual testing as before is a big NO now for most progressive organizations.
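The "test while you develop" habit can be as simple as writing small checks right next to the function being built. The discount function and its 50% cap below are a made-up example; a real project would typically run such checks with pytest or unittest rather than bare asserts.

```python
# A minimal example of "test as you build": the developer writes unit
# checks alongside the code instead of handing it over untested.
# The business rule here (discounts capped at 50%) is hypothetical.

def apply_discount(price: float, percent: float) -> float:
    """Business rule under development: discounts are capped at 50%."""
    percent = min(percent, 50.0)
    return round(price * (1 - percent / 100), 2)

# The developer's own unit checks, written while developing:
assert apply_discount(100.0, 10) == 90.0
assert apply_discount(100.0, 80) == 50.0   # the cap kicks in
assert apply_discount(19.99, 0) == 19.99   # no discount is a no-op
```

Three lines of checks are enough to document the rule, catch a broken cap immediately, and hand the tester something that already passes its own unit tests.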
Role of Developers Takes A New Turn
Earlier, testers could exist with mere manual testing skills, few technical skills, and complete reliance on their functional knowledge. That is no longer possible. As the role of developers changes, so does that of testers. Testing has become more demanding, and thus more penetrative and effective; the major credit for this shift goes to DevOps and Agile. Testing needs to enter the mainstream of the project lifecycle earlier and more frequently, giving fruitful results faster. Of course, no testing is complete without the human touch of discovering the unknown through intuitive exploration. Good luck to all developers and testers in adopting their changing roles for the good of the quality of the product they develop and test.
Any organization in software development that is still not using test automation tools and is banking completely on manual testing can’t stay in the mainstream software business. Even an enterprise in any other business vertical that focuses on in-house development of a key business application can’t release a healthy product merely on the basis of manual testing. The reason is not that manual testing is incapable of testing a product fully. It is that in today’s scenario, software applications don’t run on a single platform, hardware, or operating system. For any application you develop, or for that matter procure from an external vendor, the first and foremost requirement is the capability to run on multiple platforms: laptop, tablet, smartphone, and desktop.
This automatically calls for a heterogeneous spectrum of operating systems and environments, which is obviously a big challenge if you have to perform all this testing manually. One way is to employ a huge team of manual testers. Another is to use test automation tools and save a huge spend on resources, manpower, and time; these tools simulate various environments, loads, operating systems, and capacities. In my opinion, the manual-versus-automated testing ratio should be at least 30:70. Of course, it can’t be 100% automation; certain things can still be managed only manually. Yet another way is to outsource testing to a reliable company, so you can live with limited resources in the testing department.
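What automation buys you over manual effort is easy to see in miniature: the same check run across a whole matrix of platform and operating-system combinations in one go. The matrix and the checked function below are illustrative stand-ins, not any particular tool's API; real suites would do the same with something like pytest's parametrization.

```python
# Sketch of matrix testing: one check, many environment combinations.
# The platforms, operating systems, and render_page stub are examples.
import itertools

PLATFORMS = ["desktop", "laptop", "tablet", "smartphone"]
OPERATING_SYSTEMS = ["windows", "macos", "android", "ios"]

def render_page(platform: str, os_name: str) -> dict:
    """Stand-in for the application behavior being verified."""
    return {"platform": platform, "os": os_name, "status": "ok"}

failures = []
for platform, os_name in itertools.product(PLATFORMS, OPERATING_SYSTEMS):
    result = render_page(platform, os_name)
    if result["status"] != "ok":
        failures.append((platform, os_name))

# 4 platforms x 4 operating systems = 16 combinations checked in one run.
print(f"{len(PLATFORMS) * len(OPERATING_SYSTEMS)} combinations, "
      f"{len(failures)} failures")
```

Sixteen combinations is trivial here, but each new platform or OS multiplies the matrix, which is exactly why manual-only coverage stops scaling.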
Test Automation Tools Demand Has Increased Tremendously
But ensure you place a strong SLA with these outsourced testing organizations in terms of information and outcomes. I doubt there is anybody at the customer end who would accept software without proper testing reports. There is a substantial increase in demand for test automation tools in the global market for this reason, which easily shows the change in testing trends.