We are in conversation with Jerry Melnick, President & CEO, SIOS Technology Corp. Jerry is responsible for directing the overall corporate strategy for SIOS Technology Corp. and leading the company’s ongoing growth and expansion. He has more than 25 years of experience in the enterprise and high availability software markets. Before joining SIOS, he was CTO at Marathon Technologies where he led business and product strategy for the company’s fault tolerant solutions. His experience also includes executive positions at PPGx, Inc. and Belmont Research, where he was responsible for building a leading-edge software product and consulting business focused on supplying data warehouse and analytical tools.
Jerry began his career at Digital Equipment Corporation where he led an entrepreneurial business unit that delivered highly scalable, mission-critical database platforms to support enterprise-computing environments in the medical, financial and telecommunication markets. He holds a Bachelor of Science degree from Beloit College with graduate work in Computer Engineering and Computer Science at Boston University.
What is the SIOS Technology survey and what is the objective of the survey?
SIOS Technology Corp., together with ActualTech Media, conducted a survey of IT staff to understand current trends and challenges related to the general state of high-availability (HA) applications in organizations of all sizes. An organization's HA applications are generally the ones that keep the business in operation. Such systems range from order-taking systems to CRM databases to anything that keeps employees, customers, and partners working together.
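The survey itself does not define "highly available" numerically, but the standard "nines" arithmetic (not from the survey; shown here purely for orientation) gives a feel for what the label implies in allowed annual downtime:

```python
def annual_downtime_hours(availability_pct: float) -> float:
    """Hours of downtime per year permitted by an availability target."""
    return (1 - availability_pct / 100) * 365 * 24

# Each extra "nine" cuts permitted downtime by roughly a factor of ten.
for pct in (99.0, 99.9, 99.99):
    print(f"{pct}% availability -> {annual_downtime_hours(pct):.2f} hours/year of downtime")
```

For example, "three nines" (99.9%) allows under nine hours of downtime per year, which is why HA applications typically rely on clustering or failover rather than manual recovery.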
We’ve learned that the news is mixed when it comes to how well HA applications are supported.
Who responded to the survey?
For this survey, we gathered responses from 390 IT professionals and decision makers from a broad range of company sizes in the US. Respondents consisted of people who manage databases, infrastructure, architecture, systems, and software development, as well as those in IT management roles.
What were some of the key findings uncovered in the survey results?
The following are key findings based on the survey results:
Tell us about the Enterprise Application Landscape. Which applications are in use most; and which might we be surprised about?
We focused on tier 1 mission-critical applications, including Oracle, Microsoft SQL Server, and SAP/HANA. For most organizations operating these kinds of services, they are the lifeblood: they hold the data that enables the organization to achieve its goals.
56% of respondents to our survey are operating Oracle workloads, while 49% are running Microsoft SQL Server. Rounding out the survey, 28% have SAP/HANA in production. These are all clearly critical workloads in most organizations, but there are others. For this survey, we gave respondents an opportunity to tell us what, beyond these three big applications, they are operating that can be considered mission critical. Respondents who took this option indicated that they are also operating various web databases, primarily from Amazon, as well as MySQL and PostgreSQL databases. To a lesser extent, organizations are also operating some NoSQL services that are considered mission critical.
How often does an application performance issue affect end users?
Application performance issues are critical for organizations, with 98% of respondents indicating these issues impact end users in some way, ranging from daily (experienced by 18% of respondents) to just once per year (experienced by 8% of respondents) and everywhere in between. Application performance issues lead to customer dissatisfaction and can lead to lost revenue and increased expenses. But there appears to be some disagreement around such issues depending on one's perspective in the organization. Respondents holding decision-maker roles have a more positive view of the performance situation than others: only 11% of decision makers report daily performance challenges, compared to around 20% of other respondents.
Is it easier to resolve cloud-based application performance issues?
As much as most IT pros would like to fully eliminate the potential for performance issues in applications that operate in a cloud environment, the fact is that such situations can and will happen. A variety of tools is available in the market to help IT understand and address application performance issues, and IT departments have, over the years, cobbled together troubleshooting toolkits. In general, the fewer tools you need to work with to resolve a problem, the more quickly you can bring services back into full operation. That's why it's particularly disheartening to learn that only 19% of respondents turn to a single tool to identify cloud application performance issues. This leaves 81% of respondents having to use two or more tools. But it gets worse: 11% of respondents need to turn to five or more tools in order to identify performance issues with their cloud applications.
So now we know cloud-based application performance issues can’t be totally avoided, how long until we can expect a fix?
The real test of an organization's ability to handle such issues comes when measuring the time it takes to recover when something does go awry. Twenty-three percent (23%) of respondents can typically recover in less than an hour, fifty-six percent (56%) take somewhere between one and three hours, and after that, 23% take three or more hours. This isn't to say that these people are recovering from a complete failure; they are reacting to a performance fault somewhere in the application that is serious enough to warrant attention. A goal for most organizations is to reduce the time it takes to troubleshoot problems, which in turn reduces the time it takes to correct them.
Do future plans about moving HA applications to the cloud show stronger migration?
We asked respondents about their future plans for moving additional high-availability applications to the cloud. Nine percent (9%) of respondents indicate that all of their most important applications are already in the cloud. By the end of 2018, one-half of respondents expect to have more than 50% of their HA applications migrated to the cloud, while 29% say that they will have less than half of their HA applications in such locations. Finally, 12% of respondents say that they will not be moving any more HA applications to the cloud in 2018.
How would you sum up the SIOS Technology survey results?
Although this survey and report represent people’s thinking at a single point in time, there are some potentially important trends that emerge: first, it’s clear that organizations value their mission-critical applications, as they’re protecting them via clustering or other high availability technology. A second takeaway is that even with those safeguards in place, there’s more work to be done, as those apps can still suffer failures and performance issues. Companies need to look at the data and ask themselves, therefore, if they’re doing everything they can to protect their crucial assets. You can download the report here.
Technology has endless ways to help mankind, and mankind has endless reasons to leverage technology in its favor. These reasons include challenge, business, prosperity, and charity. Any one of these factors is enough to lead in this world of competition; when all four come together in a single venture, they create a big difference. Competition in today's world is about production, delivery, and quality. When tools like Kaizen, Agile, and TQM reach saturation in helping a business excel and stay ahead of competitors, only one thing can still provide an edge over others, and that is innovation. MediaTek has come up with an innovative idea by introducing 'Technology Diaries', an interactive discussion series to be held in multiple cities with the aim of demystifying next-generation technologies in the common man's life.
These next-generation technologies include Artificial Intelligence (AI), 5G, and Narrowband Internet of Things (NB-IoT). The event was lively and interactive, and all the more informative for it. Its format was entirely different from the usual sessions, which are dull, one-sided, mechanical, and non-interactive. Technology Diaries by MediaTek was launched in Delhi with two guests from the diverse fields of media and technology: Naved, one of the most popular Radio Jockeys (RJs), played the common man in conversation with two technical experts. The first was the famous TechGuru Rajiv Makhni; the other was Kuldeep Malik, Director, Corporate Sales, MediaTek India. The company has the credit of being the world's 4th largest fabless semiconductor company. 'Fabless' describes a company that designs microchips but does not manufacture them itself; rather, it contracts out or outsources production.
MediaTek is World’s 4th Largest Global Fabless Semiconductor Company
MediaTek powers around 1.5 billion devices every year. It aims to make technology more accessible, which can happen only by making it more understandable for every common person in the country. The Technology Diaries series is probably the best way to achieve that, because it is not only interactive and informative but also engages everyone equally, making it a fun and knowledge-rich series. In effect, it demystifies newer technologies that can transform our daily lives. This was the inaugural session of Technology Diaries. Naved presented a common man's perspective on newer and upcoming technologies in a very light mode, without any technical jargon, yet in a way understandable to everyone. The whole concept gels well with MediaTek's philosophy of making all relevant technology available to everyone so as to empower them to connect with it easily.
The philosophy says that if a person understands the technology behind a concept easily, it becomes easier to adopt it in everyday life. This definitely enhances and enriches life by making everyone smarter and healthier. This is reflected in MediaTek's commitment to delivering the latest technologies across a diversified range of products and solutions. Their solutions impact almost every industry, including mobile, automotive, health, entertainment, next-gen wearables, and so on. The sole purpose is to meet consumers' and businesses' expectations and deliver more than that to them. In fact, name a technology and you will find MediaTek in it. For instance, in the mobile segment, their Helio series chips are in high demand among mobile manufacturing companies, including Vivo, OPPO, Nokia, and so on. MediaTek is equally strong in the digital television, optical storage, and DVD/Blu-ray segments.
You won’t find a segment in technology without MediaTek in it
Kuldeep Malik, Director- Corporate Sales, MediaTek India, says, “Consumers may not know it, but our chips and technology are an integral part of daily lives. You will find us in 20 percent of homes globally and nearly 1 of every 3 mobile phones is powered by MediaTek. MediaTek chips power next-generation smartphones, tablets, TVs and voice assistants and all kinds of intelligent devices to transform how people interact with each other and the world around them. A new category of devices is driving features and consumer expectations forward with advancements in power, performance, AI, and connectivity. With the Technology Diaries we want to help the less savvy consumers understand and be comfortable using these technologies to their advantage.”
Some of the prominent projects of MediaTek include Narrow-Band IoT (NB-IoT), NeuroPilot, Edge-AI technology, Edge-AI hardware processing, MiraVision, Self-Driving technology, Autus, and so on.
Vizag Fintech Festival is a five-day festival (22-26 Oct), focusing majorly on the Fintech Conference (which will happen from 23-24 Oct). Major themes for the Conference are BankTech (Future Of Banking, Investments And Payments), InsurTech (Technology Enablers In Insurance), GovTech (Technological Advances Fostering Invisible Government, Visible Governance), Financial Inclusion (Increased Access To Financial Services For The Underserved), EmergeTech (Emerging Technologies Including AI, Cyber Security, Blockchain, IoT And Big Data). Fintech Valley Vizag previously conducted Blockchain Business Conference in October 2017 and Fintech Spring Conference in March 2017.
What are the objectives of this event?
To position Andhra Pradesh & Vizag as a leading Fintech hub. Through the Innovation and Startup Policy, the Government intends to create an ecosystem that produces an entrepreneur in every family. The event intends to bring together industry, academia, and investors to innovate, co-create, and build the Fintech ecosystem, with the intent of making Visakhapatnam the Fintech Epicenter of the world.
What top messages does this event want to spread?
The event will be a hub for:
Networking amongst the Hon’ble Chief Minister of AP, CXOs of the top Fintech & IT companies & other global stakeholders (CXO RoundTable)
Exposure to industry trends & hot topics (Fintech Conference)
75+ National & International Speakers (https://www.vizagfintechfestival.com/the-first-experts.php )
Million Dollar Challenge – to enable startups to showcase their solutions and get a chance to win up to $1Mn in funding and set up their office in Vizag (https://www.vizagfintechfestival.com/million-dollar-challenge.php )
Who are the target audiences?
Global Corporates in Fintech ecosystem, Global Startups, Academia (Fintech), Financial Institutions, Investors, CXOs from Top Indian & International banks
Who should attend this 5-day event and why?
Investors – To interact with Startups (the fresh talent)
Financial Institutions – For knowledge exchange, networking, collaboration, to grab business opportunities, to gauge market reaction.
Government Agencies –
Startups – To learn from the experts of the industry and showcase their solutions. They stand a chance to win up to $1Mn in funding and set up their office in Vizag (on the Demo Day of the Million Dollar Challenge – 25th Oct)
Fintech Professionals/ Experts – To lead the industry forward
For more information, feel free to browse through our website, www.vizagfintechfestival.com
Technology is an integral part of every business; in fact, it has become an integral part of everybody's life on both personal and professional fronts. When it comes to education, this vertical holds a huge amount of data, and with the help of new technologies it is important to rethink the way teaching is done. Leveraging new technologies is essential to creating new approaches. IoT is a promising enabler for educators in terms of collaboration, communication, and operations. There are many ways an education institute can adopt IoT to transform the campus and the way education is imparted. IoT in education and teaching has a number of meaningful use cases; a simple one is connecting academia all over the world with the help of IoT.
This can result in a stronger and deeper learning experience for students, letting them gain knowledge in a more meaningful format. Not only that: it also becomes a strong means of collaboration and learning for educators across the globe. When we talk about IoT in education, it becomes important to create a simple learning model in which students can easily relate theory to practical applications. There are a number of resources and aggregators from which important information can arrive in real time, and a lot can happen with the help of sensors, which are very cost-effective. Every student can receive education in a personalized manner and move at their own pace. Student-teacher communication can become highly structured and result-oriented, and things can move beyond textbooks. There is huge scope for IoT in education.
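As a purely illustrative sketch of how cheap classroom sensors could feed a simple real-time model (the room names, fields, and threshold below are hypothetical, not drawn from any specific platform or the article), a campus could aggregate readings and flag rooms needing attention:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Reading:
    """One hypothetical classroom sensor sample (illustrative units)."""
    room: str
    temperature_c: float
    co2_ppm: float

def flag_stuffy_rooms(readings, co2_limit=1000):
    """Return rooms whose average CO2 exceeds a comfort threshold."""
    by_room = {}
    for r in readings:
        by_room.setdefault(r.room, []).append(r.co2_ppm)
    return sorted(room for room, vals in by_room.items() if mean(vals) > co2_limit)

readings = [
    Reading("Lab A", 22.5, 850),
    Reading("Lab A", 23.0, 900),
    Reading("Room 101", 24.0, 1250),
    Reading("Room 101", 24.5, 1300),
]
print(flag_stuffy_rooms(readings))  # ['Room 101']
```

The same pattern of collecting, aggregating, and acting on sensor data underlies most campus IoT use cases, from attendance to lab-equipment monitoring.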
IoT in Education Can Create A New Benchmark
Obviously, if IoT in education is used in the right manner, it can result in better communication, stronger development, and a tremendous increase in creativity at both ends.
Quality control and enterprise should be synonymous with each other. No enterprise can survive without quality control. In fact, quality control should be a habit that happens automatically; it should not be practiced for others' sake, but should be in the blood of the enterprise.
“Quality control is applicable to any kind of enterprise; in fact, it must be applied to every enterprise.”
“The biggest challenge in maintaining a quality management system is ensuring consistency in established practice, in a consistently changing business environment.”
Sng Yang Seng, senior manager, quality assurance, LSG ST Electronics
“Unless the problem is prevented from happening again, the root cause is just a theory.”
Govind Ramu, Program Manager, Google, Mountain View, Calif.
“Quality is when customer satisfaction goes far north and when scrap and rework rates go way south.”
Paul Tang, quality and project manager
“Preaching quality won’t help.”
Hammer and Champy
“Our reputation for quality is only as good as our last machine or our last customer call.”
“Organizational excellence is not about the management of quality. It is about the quality of management.”
“The price for ignoring the impact of design on service can be staggering.”
Davidow and Uttal
“The team concept conveys the message that PQI (Productivity and Quality Improvement) is everybody’s business.”
“Despite what the textbooks say, most important decisions in corporate life are made by individuals, not by committees.”
“High-quality companies treat their human resource as a resource, not a commodity.”
“Compliance is an intended consequence of quality.”
Paula Burdick Parsons, CMQ/OE
“Quality is when the customer returns and the product does not.”
“We only get one chance to do it right the first time.”
Jeffrey S. Schiopota, Aspire Brands
Quality Control and Enterprise go hand in hand
“Develop a healthy disrespect for the impossible.”
Gene Hoffman, SuperValu Inc.
Any business that sets its customer as the reference point for its priorities never fails; any other priorities will lead to disaster. The best way is to assess each activity done in the organization: if it adds value for the customer, it makes sense; otherwise, it is a waste of effort and energy.
“With the customer as a reference point, priorities become easier to set.”
That means that if every business priority has the customer as its reference point, it leads to more customers, more business, and more profits.
“‘What is our business?’ is not determined by the producer but by the customer.”
Making a product that nobody in the market wants makes no sense. The real task is to create value in that product so that it attracts customers. The other way is to understand what your customer wants and then deliver exactly that. Apply your mind only where you are sure to add value that the customer acknowledges.
“Learn from the mistakes of others. You can’t live long enough to make them all yourself.”
In fact, in real life, people often stop learning even from their own mistakes. If you are at that stage, move a step ahead and start learning from others' mistakes.
“Customer needs and satisfaction are constantly changing targets.”
Juran Institute, Inc.
Most important for any business is to understand customer needs. In fact, if you are able to understand customer needs, and you have the capabilities to develop according to those needs, you will definitely be able to satisfy the customer. Another important point is to keep an eye on the customer and set your targets accordingly.
Setting the customer as a reference point is important for a business
Above all, it is the customer that matters most. That is why setting the customer as a reference point is very important for any business.
In my previous post, I listed a few quotes on quality. These quotes carry a lot of weight in today's business and professional life; in fact, they are even more relevant in today's more competitive and stressful environment. I will continue with a few more in the same context in this post. Here we go.
“The first job in decision making is to find the real problem and define it.”
Most of the time, we are not aware of the real problem we are trying to address. That means the solution we are heading toward will not resolve the actual problem. As a result, all the effort is wasted, while the actual problem keeps widening, which in turn leads to chaos.
“Quality improvement is a fragile process. All major processes are.”
What it means is that quality improvement deserves the utmost care. Logically, it should be at the top of every organization's agenda.
“We are what we repeatedly do. Excellence, then, is not an act, but a habit.”
It takes a lot of practice and a lot of effort to follow good practices. Once they become a habit, you don’t need to remember them. They automatically start happening.
“Quality isn’t expensive; it’s priceless.”
Of course. You can't buy quality; you need to inculcate it. It has to be in the blood of every individual in an organization. Only then does the whole organization excel; otherwise, quality remains limited to a few individuals and departments in the organization.
“The market is never saturated with a good product but it is very quickly saturated with a bad one.”
Quotes on Quality
How true this is for an organization, a product, or an individual. As long as you are delivering the best, you are in high demand; the moment you degrade, there is a steep fall in demand. I hope you like these quotes on quality.
The source of these beautiful quality quotes is ASQ. In fact, there are plenty of quotes there; I am picking those I find relevant to today's work life.
“Creativity is thinking up new things. Innovation is doing new things.”
“Quality improvement at a revolutionary pace is now becoming simply good management.”
A. Blanton Godfrey
“Error is always in haste.”
“We can be creative in our approach but never compromising in our quality.”
Jim Nelson (Lonza)
“Failure is simply a reason to strengthen resolve.”
John W. Gardner
“The greatest of faults is to be conscious of none.”
“Great things are not done by impulse…but by a series of small things brought together.”
Vincent Van Gogh
“The best way to understand your customer is to become your customer and walk a mile in his shoes.”
Ian D. Littman
“Effective statistical process control is 10% statistics and 90% management action.”
“If there is one consistent lesson from those who have led this effort [toward continuous improvement], it is that there is no universal strategy for success.”
V. Daniel Hunt
“Changing the culture of an institution is a slow process, and one that is best not rushed. If the effects of TQM are to be lasting, people have to want to be on board.”
“The key is to get into the stores and listen.”
Sam Walton, Founder of Walmart
“All progress is precarious, and the solution of one problem brings us face to face with another problem.”
Dr. Martin Luther King, Jr.
“A ‘problem’ is the distance between where you are now and where you could be, no matter how good you are now.”
Townsend and Gebhardt
“If you cannot solve the real problem, change it into one you can solve.”
Mark Kac, mathematician
Hope you agree that these quotes make a lot of sense in today’s world of high competition.
IoT and IIoT are yet to see the light at the end of the tunnel, especially when we talk about adoption among the masses or availability for them. A lot has been happening in IoT over the last few years, but most of it is limited to trials, pilots, proofs of concept, or conceptualization. Many solution companies claim to have multiple solutions in this regard, but most of these are far from acceptable to industry. They are based on just a few use cases, without any assessment of whether there is any return value. In fact, solution providers create solutions according to their own visibility and thought process, without even understanding the real needs of industry and the relevant pain areas to address. Such efforts are limited to presentations and discussions in forums, nothing beyond that. If these solutions were really meaningful, they would see a quick adoption rate.
Any solution has to be feasible, acceptable, and commercially viable. Many IoT and IIoT projects talk a great deal, but measured against these three scales, they vanish. On top of that, security is a big concern, rising at an exponential rate worldwide: the more sockets you open, the greater the risk and the vulnerabilities. IoT solution providers are either not aware of this or lack the capabilities to handle it. Again, this leads to the involvement of multiple patch providers, each strong only in its respective area and lacking business or practical knowledge of the others. None of them has a dedicated research division to carry out viability, security, vulnerability, and feasibility assessments. Without these, a big investment makes no sense.
IoT and IIoT projects are becoming a rat race
Actually, there are many investors ready to back genuinely worthy solutions in the field of IoT and IIoT. It all depends on which parameters these solutions can satisfy in order to achieve mass acceptance.
About Zoho Backstage, Sridhar Vembu, CEO of Zoho Corp. says, “Planning and running an event is a complex operation that has traditionally required different tools for each stage of the process—a website builder, a mass emailing solution, scanners for checking in attendees, and a stand-alone app for audience engagement. Backstage embraces event management from start to finish by offering a unified online platform that allows organizers to market the event, fill seats, and engage audience all from within one product. Rather than getting bogged down in the complexities of the tools being used, this holistic event management system lets organizers focus on what matters: putting on a great event.”
Ramesh C Pathak – PMP, VP Technology. PMI Bangalore, India says, “With Backstage, we were able to save time and money planning our annual conference which was attended by 700+ delegates. All it took was a few clicks to get our event website up and running. We were also able to provide our attendees with an engaging and memorable event experience through a custom branded mobile app. Backstage will surely be part of our future events.”
Jay Krishnan, CEO, T-Hub says, “For an ecosystem player like us, managing tens of projects for hundreds of business requirements to meet the expectations of our stakeholders runs into thousands of requirements that could potentially take as many hours of operating bandwidth and spend. An integrated product suite like Zoho One has truly helped ease our ever-changing requirement list. It has been an enterprise-class product from a world-class player. A big thank you on behalf of the 300+ startups that make up our community.”
Shayak Mazumder, CEO, Eunimart says, “When we started Eunimart, just like any other early stage business, we searched the entire market for any software that could help tie all my different customer-facing and internal functions into one seamless flow. Unfortunately, all the different solutions we came across either did not have the breadth of solutions needed for a complex business model like Eunimart or were built for large businesses and were too expensive for us. Zoho was a lifesaver as our business processes were becoming unmanageable with the growing number of customers. The easily manageable workflows, the ability to build custom models and create custom solutions was exactly what we needed. Our internal team, with the help of a Zoho partner, has managed to automate almost all of our business, enabling us to achieve a 30% increase in productivity and largely increased customer satisfaction.”
“I am grateful to Zoho One for the large suite of solutions that are built just for startups such as ours. I have recommended a lot of different startups on how to implement Zoho and scale their businesses in a more efficient manner,” concludes Shayak.
Zoho data centers in Mumbai and Chennai will be operational from August 2018. All Indian users running their apps on zoho.in will be served through these data centers; customers in the rest of the world will continue to be served from www.zoho.com/
By the first anniversary of Zoho One, the organization has more than 12,000 customers worldwide, of which around 36% are from India. There has been a tremendous increase in the average number of Zoho One applications a customer uses for its business: it has increased from 9 to 16. In fact, with so many features Zoho One offers a business, they don't need to buy different products from different vendors; the whole business can depend on Zoho One from a single cloud platform. In the last year, as a matter of fact, five new apps have been added to Zoho One.
The new apps in Zoho One are Zoho Sprints, Cliq, Zoho PageSense, Zoho Backstage, and Zoho Flow. Sridhar Vembu, CEO of Zoho Corp. says, “The adoption of Zoho One has exceeded our expectations, and we hope to continue this momentum into its second year. The stats clearly show that customers will use multiple apps from a single suite if those apps work together more deeply than a patchwork of products from different vendors. We see customers combining data from different Zoho and third-party apps—like email campaigns, CRM, customer support, and accounting—to generate new insights and make better decisions. Zoho One has already replaced more than 650 different products, and that number will only grow as we release even more new apps.” As a matter of fact, all new features and applications come to existing customers as a bonus, without their spending a single additional penny.
Zoho Adds Around 5 New Apps To Zoho One Every Year
Vembu adds, “Zia for Zoho One, the new Dashboard, and Zia Search are organization-wide tools designed to give our customers deeper understanding of their businesses. This kind of unified data ecosystem is only possible because all of Zoho’s 40+ applications have been built from scratch, over the last two decades, on the same technology base.”
Generally, organizations have to buy additional third-party products on top of their mainstream business apps to cater to such needs, which makes Zoho One one of the most economical and robust products in this regard. In addition, Zoho is about to launch its own data centers. Zoho Backstage is another boon: an event management application for the event management industry. The beauty of this product is its multi-lingual capability, which makes it one of the most versatile products in this category. There is also a mobile app to enhance audience engagement. All these features are embedded in Zoho One, Zoho's flagship suite of applications, which runs completely in the cloud.
In fact, with the addition of these new features to Zoho One (including Zoho Backstage), the product brings complete synchronization between different departments and roles within an organization, leaving no void or gap. For instance, Zia, Zoho's new AI companion, communicates internally with all Zoho applications to gather enough information to respond quickly to user queries with the right combination of information and intelligence. That empowers key users to get information without writing any queries, and thus much faster and more reliably. Similarly, Zoho One's Analytics gets a perfect blend of business-operations information, resulting in more contextual outcomes.
In addition, the new unified Search capability can quickly access data from multiple applications and bring back useful information in less time and with less effort or technical skill. The suite becomes the right tool for business users who lack technical know-how or support.
Zoho Backstage Empowers Event Organizers, Enterprises
Zoho Backstage is a complete end-to-end solution for event management. It empowers organizers, enterprises, and non-profit organizations to plan, promote, and execute enterprise events of any kind, such as large-scale meetings, conferences, and trade shows.
Zoho One has become more powerful with the induction of AI, Analytics, and Search. In fact, it is now a complete enterprise solution, with the added beauty that it fits any business vertical. On top of that, it does not carry a hefty pricing model; the pricing is as simple as the product, and it can deliver more than a business expects. In addition, there is regular and substantial value addition in the form of updates and upgrades, which come to customers free of cost. Best of all, Zoho One has become a 360-degree product, capable of catering to every business need: in an integrated manner if the business needs it, or in a modular form if the business wants to integrate it with any third-party application already in use.
To repeat, Zoho One is an international-level product with a very competitive cost model. In fact, it is a big disruption for the large business/ERP brands because of its flat, low-cost model for any number of users. On top of that, it fits all business sizes. Whether you are a startup, a one-person company, or a multi-trillion-turnover organization, it is strong and capable enough to cater to all segments with the same perfection, precision, and speed. Otherwise, why would world-class organizations like Hyatt, KPMG, L’Oreal, Mahindra, Tata Projects, Sodexo, Bata, Renault, Facebook, Edureka, Times Internet, HDFC Life, Royal Sundaram, and Apollo Hospitals run their business on Zoho One? That, in fact, is alarming for organizations using hefty, white-elephant business applications and spending huge amounts on their purchase, deployment, upkeep, and training. It is high time for them all to reconsider.
Zoho One becomes an enterprise operating system
The already powerful Zoho One becomes more powerful with added features like AI, Analytics, and Search.
Category Leaders Partner to Offer End-to-End Data Management, High Availability and Disaster Recovery Solutions
You need not be in tech to recognize that the answer for achieving all of an organization’s goals will likely not be found in one box. What is required is a comprehensive plan, supported by individual solutions that when united, meet each user’s unique IT, business and budgetary requirements. Today, I speak with Connor Cox, Director of Business Development for DH2i on this important topic, as well as about DH2i’s existing and newly announced partnerships, and the value they deliver to end users and the channel.
Q: What types of challenges are you most commonly seeing around data management, high availability (HA) and disaster recovery (DR) in Windows, Linux, Oracle and Docker environments? Do these differ from environment to environment, or are the challenges consistent regardless of the underlying platform?
A: While some of the specifics and idiosyncrasies vary, I keep seeing many of the same general challenges regarding data management, high availability and DR across the different OS, database and container environments that organizations manage. Although there are many data management and HA/DR approaches available in the marketplace, IT professionals I speak with still find it challenging to maintain reliable high availability and disaster recovery for their systems. Traditional HA/DR solutions are typically cumbersome and brittle to configure and maintain, and almost always cost more than the value they provide. These HA/DR challenges grow exponentially with each different platform under management, as each platform requires different solutions. I have also seen a growing trend of IT departments moving more workloads to the cloud, further complicating the HA/DR story. Across all different platforms, it all really comes down to cost and complexity as the biggest hurdles I see customers face for data management and HA/DR.
Q: DH2i recently announced it had extended its Partnership Program to include additional technology collaborations. Could you talk about the benefits of DH2i’s Partnership Program for its vendor partners, and how the partnerships help end users to address the above-mentioned challenges?
A: Absolutely. DH2i’s Technology Partner Program is all about creating alliances with the industry’s key players for product certifications and resources. By partnering with DH2i, our Technology Partners are able to offer their customers a more straightforward and cost-effective solution for the HA and DR needs that they have. Rather than running into the issues we just spoke about—whereby customers have disparate, expensive and complex HA/DR solutions for each environment—they can use DH2i solutions to unify these environments under a single optimized framework. Even though our software is infrastructure agnostic, these relationships also help end-users have the confidence to use the infrastructure, cloud, and/or software vendor of their choice in conjunction with DH2i, without having to worry about compatibility.
Q: In the announcement, you advise you are working with Amazon Web Services (AWS), Docker, Microsoft, Red Hat and VMware – could you describe how you are working with each partner – and the benefits to end users of the partnerships?
A: That’s right, our latest round of Technology Partnerships includes AWS, Docker, Microsoft, Red Hat, and VMware. We just joined with AWS to become an APN Technology Partner, which means we are now collaborating with AWS because our solution runs on their cloud servers (or any on-premises or cloud server, for that matter). This gives us access to training, technical enablement content, and the ability to work towards the next tiers of partnership, which will bring even further collaboration such as technical validations and case studies.
Our Docker alliance sets us up as a member of the Docker Technology Partner Program. This gives us access to the Docker technical alliances team to ensure we have support identifying opportunities and Docker best practices as well as certification support. DH2i software can manage Docker containers for stateful HA, making this a critical relationship.
With Microsoft, DH2i has certified our software on a variety of their platforms and software including Windows Server, SQL Server, Azure, and Hyper-V. This gives customers the confidence that we have Microsoft-certified solutions for their datacenter. Additionally, we are listed among the SQL Server high availability and disaster recovery partners on Microsoft Docs. This gives visibility of DH2i to Microsoft customers who need industry-leading solutions for their SQL Server high availability challenges.
We also recently joined the Red Hat Connect for Technology Partners Program, which gives DH2i access to NFR licenses to ensure compatibility, as well as other technical and marketing resources from Red Hat to ensure end users get the most comprehensive HA/DR solutions for their Red Hat systems. We have also certified that DH2i solutions run on Red Hat Enterprise Linux 7.
Finally, with VMware we have validated our solutions as being VMware Ready, which ensures reliable operation under fully loaded conditions and confirms the solutions are optimized for VMware vSphere. Additionally, DH2i is a VMware Technology Alliance Partner (TAP). The VMware TAP program works with technology partners to deliver enhanced value to joint customers.
Q: Where can readers go to learn more about partnering with DH2i?
A: Information about all of the DH2i partnership programs and how to join can be found here.
NAKIVO v7.5 is released with many new features in the virtualization and cloud backup space. With the launch of NAKIVO Backup & Replication v7.5, NAKIVO creates a new landmark in this domain. The new version includes support for vSphere 6.7, NETGEAR ReadyNAS support, EMC Data Domain Boost support, and other features. The product becomes a necessary tool for heterogeneous as well as homogeneous environments of any kind. The new version comes with several notable features, including Cross-Platform Recovery, support for the latest version of vSphere, Advanced Bandwidth Throttling, and a lot more. NAKIVO Inc. has been gaining tremendous momentum in the virtualization and cloud backup software field. It moves fast with any new technology, hardware, or software that emerges in the market, bringing out solutions that aim to make enterprise technology usage more meaningful, safe, secure, and useful.
NAKIVO Backup & Replication v7.5 or as we call it NAKIVO v7.5 not only supports vSphere 6.7 but also integrates with key storage vendors like DELL EMC Data Domain and NETGEAR ReadyNAS. Let us look at it in a little more detail below.
VMware vSphere 6.7 Support: NAKIVO v7.5 Backup & Replication now officially supports the latest version of VMware that is vSphere 6.7. This enables users to use NAKIVO Backup & Replication to seamlessly manage their latest VMware setups and get the full leverage of new features and thus enhance their backup and recovery mechanisms.
Advanced Bandwidth Throttling: This allows presetting speed limits for data protection processes in order to avoid choking the network during peak hours. Manageability becomes easier with NAKIVO v7.5, since throttling is completely customizable and can be adjusted to your needs and business hours.
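Bandwidth throttling of this kind is commonly built on a token-bucket rate limiter: the bucket refills at the configured rate, and a transfer is allowed only if enough tokens are available. The sketch below is a generic illustration of the idea, not NAKIVO's actual implementation; the class name and interface are hypothetical.

```python
import time

class TokenBucket:
    """Toy rate limiter: sustained throughput of `rate` units/second,
    with bursts of up to `capacity` units allowed."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate            # refill rate, units per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity      # start with a full bucket
        self.last = time.monotonic()

    def allow(self, units: float) -> bool:
        """Return True if `units` may be sent now, consuming tokens."""
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if units <= self.tokens:
            self.tokens -= units
            return True
        return False

bucket = TokenBucket(rate=10, capacity=100)  # 10 units/s, bursts up to 100
first = bucket.allow(100)   # initial burst fits the full bucket
second = bucket.allow(50)   # bucket is drained, so this is denied
```

A real backup product would apply such a check per data chunk on each transfer stream, and could swap in different `rate` values on a schedule to match business hours.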
Cross-Platform Recovery: NAKIVO Backup & Replication v7.5 comes with new features to manage VM data recovery across various platforms. The Cross-Platform Recovery feature enables users to export VM data from VMware and Hyper-V backups into various formats like VMDK, VHDX, and VHD. These exported files can be used for recovery across hypervisors, long-term archival, sandbox testing, or cross-platform VM data migration.
Support for EMC Data Domain Boost: NAKIVO v7.5 can now officially integrate with the BoostFS plugin. This combines the features of the VM backup software with the source-side deduplication of Dell EMC Data Domain Boost (DD Boost). This integration reduces VM backup size by up to 17X while increasing backup speed by as much as 50%.
NETGEAR ReadyNAS: NAKIVO v7.5 can directly integrate with NETGEAR ReadyNAS devices thus creating a NETGEAR ReadyNAS-based VM Backup appliance environment. This, in fact, results in a cost-effective VM backup appliance. As a matter of fact, the ReadyNAS-based appliance is a 5-in-1 solution that includes backup hardware, storage, software, data deduplication, and backup-to-cloud feature.
Launching NAKIVO Backup & Replication v7.5, Bruce Talley, CEO of NAKIVO Inc., said, “Here at NAKIVO, we are looking to meet and exceed our customers’ expectations while staying on top of developments in the industry. We are continuously working on new versions of the product, adding new features and enhancing the existing ones according to customer feedback and the ever-evolving technological landscape.”
A fully-functional free trial of NAKIVO Backup & Replication v7.5 is available at www.nakivo.com.
Trial Download: /resources/download/trial-
Success Stories: /customers/success-stories/
Project management and quality assurance go hand in hand. Without quality, a project has no sanity; without sanity, a product or service has no value for the customer. Here are a few interesting project management and quality assurance quotes from the masters of the respective fields.
“As test documentation goes, test plans have the briefest actual lifespan of any test artifact. Early in a project, there is a push to write a test plan […]. Indeed, there is often an insistence among project managers that a test plan must exist and that writing it is a milestone of some importance. But, once such a plan is written, it is often hard to get any of those same managers to take reviewing and updating it seriously. The test plan becomes a beloved stuffed animal in the hands of a distracted child. We want it to be there at all times. We drag it around from place to place without ever giving it any real attention. We only scream when it gets taken away.”
― James A. Whittaker, How Google Tests Software
“Recommended Reading The Definitive Guide to Getting Your Budget Approved by Johannes Ritter and Frank Röttgers provides a systematic guide for creating a financial business case. The book includes examples as well as the methods for using Monte Carlo simulation and sensitivity analysis to create the business case. The methods described in the book can also be used for quantifying risks and project costs. Mary and Tom Poppendieck in their book Lean Software Development: describe the lean principles and the types of waste in software projects.”
― Gloria J. Miller, Going Agile Project Management Practices
Liking Project Management and Quality Assurance Quotes?
“There was no escape: The entire Elliott 503 Mark II software project had to be abandoned, and with it, over thirty man-years of programming effort, equivalent to nearly one man’s active working life, and I was responsible, both as a designer and as manager, for wasting it. …
How did we recover from the catastrophe? First, we classified our 503 customers into groups, according to the nature and size of the hardware configurations which they had bought … We assigned to each group of customers a small team of programmers and told the team leader to visit the customers to find out what they wanted; to select the easiest request to fulfill, and to make plans (but no promises) to implement it. In no case would we consider a request for a feature that would take more than three months to implement and deliver. The project leader would then have to convince me that the customers’ request was reasonable, that the design of the new feature was appropriate, and that the plans and schedules for implementation were realistic.
Above all, I did not allow anything to be done which I did not myself understand. It worked! The software requested began to be delivered on the promised dates. With an increase in our confidence and that of our customers, we were able to undertake to fulfill slightly more ambitious requests. Within a year we had recovered from the disaster. Within two years, we even had some moderately satisfied customers.”
― C.A.R. Hoare
Following is a very old and well-known quote on the project management lifecycle. We sometimes forget it and try to deliver in so little time that it calls for a lot of compromises. Any product that compromises its quality, for whatever reason, is no less than a crime. In fact, it is a forgery against the customer. Someone who trusts you and places an order for a delivery doesn’t expect a breach of trust. Keeping that in mind, here are some fabulous project management quotes.
“The bearing of a child takes nine months, no matter how many women are assigned.”
― Frederick P. Brooks Jr.
“The conclusion is simple: if a 200-man project has 25 managers who are the most competent and experienced programmers, fire the 175 troops and put the managers back to programming.”
― Frederick P. Brooks Jr.
There is a lot to learn from these project management quotes
“The key is to take a larger project or goal and break it down into smaller problems to be solved, constraining the scope of work to solving a key problem, and then another key problem.
This strategy, of breaking a project down into discrete, relatively small problems to be resolved, is what Bing Gordon, a cofounder and the former chief creative officer of the video game company Electronic Arts, calls smallifying. Now a partner at the venture capital firm Kleiner Perkins, Gordon has deep experience leading and working with software development teams. He’s also currently on the board of directors of Amazon and Zynga. At Electronic Arts, Gordon found that when software teams worked on longer-term projects, they were inefficient and took unnecessary paths. However, when job tasks were broken down into particular problems to be solved, which were manageable and could be tackled within one or two weeks, developers were more creative and effective.”
― Peter Sims, Little Bets: How Breakthrough Ideas Emerge from Small Discoveries
Life without quality has no meaning. The same applies to a product or service. Enterprises that keep quality on top strive to deliver their best to their customers. That always helps them grow fast and build a sustainable environment. Here are some excellent quality control quotes.
“No marshmallows. “I don’t believe this! I’m going to write the president of General Mills! Don’t they have any quality control?”
“I’m sure it’s just a fluke.”
“Doesn’t make any difference whether it’s a fluke or not. It shouldn’t have happened. When a person buys a box of Lucky Charms he’s got expectations.”
― Susan Elizabeth Phillips, Nobody’s Baby But Mine
“Measurement is the first step that leads to control and eventually to improvement. If you can’t measure something, you can’t understand it. If you can’t understand it, you can’t control it. If you can’t control it, you can’t improve it.”
― H. James Harrington
“If you don’t set a baseline standard for what you’ll accept in your life, you’ll find it’s easy to slip into behaviors and attitudes or a quality of life that’s far below what you deserve. You need to set and live by these standards no matter what happens in your life.”
― Anthony Robbins, Awaken the Giant Within: How to Take Immediate Control of Your Mental, Emotional, Physical and Financial Destiny!
I hope you are enjoying reading these quality control quotes.
“I worry whoever thought up the term ‘quality control’ thought if we didn’t control it, it would get out of hand.”
― Jane Wagner, The Search for Signs of Intelligent Life in the Universe
A collection of some fabulous software quality control quotes
“I’m a Marketer, I don’t believe in Brands, but I believe in Quality and quality has different definitions” ― ياسمين يوسف
“The idea that schedules can be shortened in order to reduce cost or speed up delivery is a very common misconception. You’ll commonly see attempts to require overtime or sacrifice “less important” scheduled tasks (like unit-testing) as a way to reduce delivery dates or increase functionality while keeping the delivery dates as is. Avoid this scenario at all costs. Remind those requesting the changes of the following facts:
– A rushed design schedule leads to poor design, bad documentation and probable Quality Assurance or User Acceptance problems.
– A rushed coding or delivery schedule has a direct relationship to the number of bugs delivered to the users.
– A rushed test schedule leads to poorly tested code and has a direct relationship to the number of testing issues encountered.
– All of the above lead to Production issues which are much more expensive to fix.”
― Richard Monson-Haefel, 97 Things Every Software Architect Should Know: Collective Wisdom from the Experts
When I searched for quality assurance quotes on Google, I found some excellent ones that I thought I should share with my readers here. In a way, I am compiling these quotes for later reference. You can do the same. Comments and remarks are welcome. Let’s start:
“Judge us on the quality of our products, and not the quantity of our paperwork.”
― Michel Van Mellaerts
This makes a lot of sense. It is foolish to judge a company by the amount of paperwork it produces. Rather, the focus should be on the quality of the product, because at the end of the day, the customer needs quality in the product. The customer has nothing to do with whatever bulky paperwork you perform at the cost of that quality.
“I write them to improve my productivity as a programmer. Making the quality assurance department happy is just a side effect. Unit tests are highly localized. Each test class works within a single package. It tests the interfaces to other packages, but beyond that, it assumes the rest just works.
Functional tests are a different animal. They are written to ensure the software, as a whole, works. They provide quality assurance to the customer and don’t care about programmer productivity. Rather, they should be developed by a different team, one who delights in finding bugs.”
― Martin Fowler, Refactoring: Improving the Design of Existing Code
I think this is the only way to improve the design and quality of existing code. In fact, the focus has to shift completely to a holistic approach rather than a piecemeal one.
Some Thoughtful and Mindful Quality Assurance Quotes
“Quality takes time and reduces quantity, so it makes you, in a sense, less efficient. The efficiency-optimized organization recognizes quality as its enemy. That’s why many corporate Quality Programs are really Quality Reduction Programs in disguise.”
― Tom DeMarco, Slack: Getting Past Burnout, Busywork, and the Myth of Total Efficiency
That is a hard thing for corporations to digest.
Another set of quality assurance quotes:
“To have a man whose name is on the label showing such interest, commitment, and determination for the best is a wonderful thing. This is someone who will throw money at the quality, who believes in being the best. Never knock it. Would you prefer to have a bean counter in corporate headquarters, someone who never comes near the brewery, making decisions solely on the basis of the bottom line and profit margins?”
― Charles W. Bamforth, Beer Is Proof God Loves Us: The Craft, Culture, and Ethos of Brewing, Portable Documents
“Software testing is a sport like hunting. It’s bughunting.”
― Amit Kalantri
And finally the best of Quality Assurance Quotes:
“What’s measured improves”
― Peter F. Drucker
Hope you enjoyed these quotes.
If the cloud is not an important part of your enterprise IT strategy this year, is something wrong? Not necessarily. It might be a decision taken by the organization’s think tank after a thorough assessment and evaluation. Cloud or no cloud, the technology has already raised a big question mark over large on-premise datacenters, the large number of experts managing them, and the very high-cost infrastructure in place to cater to those needs. Enterprises are asking whether investing so much to keep a white elephant on-premise is a wise decision or going public is better, and whether data is more secure on-premise or in the cloud.
Slightly less than 50% of enterprises participating in a recent survey acknowledged using public cloud IaaS (Infrastructure as a Service). While 35% are using hosted private clouds, slightly fewer (32%) are using PaaS (Platform as a Service). Less than 10% of IT heads say the cloud is not critical. More than 90% of IT managers acknowledging that the cloud is an important part of their IT strategy for 2018 means something significant. The key benefits everybody expects the cloud to bring are smooth migration, speedy deployment, lower running and one-time expenses, increased IT agility, enhanced security, and a significant rise in workload performance. To a large extent, this is right. Cloud and Infrastructure as a Service (IaaS) are largely a hardware game covering storage, networking, and compute, all deployed in an automated manner.
More Than 90% of IT Managers Agree Cloud Is Part of Their Enterprise IT Strategy
In fact, in most of the cases, Enterprise IT Strategy includes Cloud, Security, Mobility, Automation, AI, etc. It is interesting to note the key differentiators between IaaS and PaaS. PaaS, in fact, has become a little fluid in nature these days.
Global digital infrastructure trends mainly include blockchain technology, with a key focus on energy consumption and performance. On the other hand, enterprise networking is taking a new shift with the evolution of intent-based networking, a step ahead of software-defined networking. The cloud is impacting enterprises in a big way. Whether to go for it, adopt hyperconvergence, or stay with what you have as-is are the mind-boggling questions on which enterprises must take a firm decision. Obviously, any decision taken in this regard will have a long-term impact on the overall growth of the organization. Blockchain is currently in its nascent state, but it has already become a worry point for datacenter operators and vendors. IT professionals and datacenter operators can’t ignore the impact of blockchain technology on datacenters. It is interesting to see how this technology addresses the two important factors: energy and performance.
In fact, it is high time to leverage blockchain technology by deploying it and extracting value from it. Important issues to analyze are its sustainability and, of course, its scalability. Power consumption is directly proportional to the size of the network, but to control the waste there is the consensus protocol: a set of distributed protocols that intelligently decide which transactions to execute and which not to, without compromising the consistency and integrity of the blockchain regardless of the number of distributed nodes. A related protocol is proof of work, or PoW, which many blockchains like Ethereum and Bitcoin already use. This protocol is intensely compute- and energy-centric. Because of these properties of PoW, cryptocurrency mining is often preferred as part of datacenter operations.
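Why PoW is so compute- and energy-hungry can be illustrated with a toy miner: keep hashing the block data with an incrementing nonce until the hash meets a difficulty target. This is a simplified sketch of the general idea, not the actual Bitcoin or Ethereum mining algorithm; the function and inputs are hypothetical.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce such that sha256(block_data + nonce) starts with
    `difficulty` leading zero hex digits. Each extra digit of difficulty
    multiplies the expected number of hash attempts by 16."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# A low difficulty finishes in a fraction of a second; real networks tune
# the difficulty so the whole network needs minutes per block.
nonce = mine("example-block", 4)
```

Anyone can verify the result with a single hash, while finding it required thousands of attempts; that asymmetry is what makes the work a usable proof, and what concentrates mining in datacenters with cheap power.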
Blockchain technology will take some time to acquire maturity
A recent study about blockchain technology states that more than 55% of Bitcoin nodes are in datacenters. The study was conducted by IC3, Cornell University, and Technion. It also reveals that around 30% of Ethereum nodes are part of datacenters.
IT in enterprises always oscillates between keeping the lights on and finding new sources and ways of growth. Where the focus stays mostly on keeping the lights on, not many innovations are happening; the environment is more about firefighting and handholding than introducing and deploying new technologies. Amid all this, the digital economy and digital transformation are taking shape in every organization. After all, growth and development are more important than merely managing day-to-day operations. Keeping the lights on is important to ensure smooth operations, but it should not happen at the cost of high-level resources being wasted on mundane jobs. The IT budget of any organization can clearly indicate the overall health of technology in the organization. No focus on R&D, new technologies, and innovation clearly indicates a lack of technology growth in the organization.
Transformation should always be a leap ahead of maintenance. If there is a need to automate key business processes, it should not stop because of an unavailable budget. Mobility is another area that needs attention. There is no point in letting your technical debt increase on a regular basis; rather, the focus should be on decreasing it as much as possible. If legacy systems are taking more and delivering less in comparison, it is high time to discard those systems. It is not wise to spend on 10 legacy applications performing different business tasks piecemeal, consuming in turn a large chunk of resources, time, effort, and money. The overall equation remains negative in that case, raising an alarm for the health of the organization. The same is true for legacy infrastructure; after all, its upkeep means recurring investment.
Top IT priorities for 2018 have security at the top
Other top IT priorities include hyperconvergence, big data, analytics, artificial intelligence, machine learning, etc. Obviously, one size doesn’t fit all. The priorities will shuffle according to the nature of the business and other key parameters.
On-premise datacenters differ from the public cloud in a big way. While the on-premise infrastructure environment is usually complex, with slow hardware provisioning, the public cloud is simple, with rapid hardware provisioning. That is where Hyperconverged Infrastructure (HCI) comes into the picture, because the efficiency that one model brings is difficult to attain with the other. Many enterprises that understand this gap between the two models are trying to speed up the IT transformation of their on-premise datacenters with the help of automation. Implementing these automated management platforms requires a different kind of infrastructure setup that supports them well. This, in fact, creates an environment similar to a cloud model.
The purpose of Hyperconverged Infrastructure is to create an on-premise IT environment that is as good as a public cloud in terms of freedom, agility, and speed. The overall IT transformation process needs not only a different set of tools but also a large number of automation initiatives. An important factor to take care of is policy-driven automation, which should be the ultimate goal of the whole exercise. The best way is to set HCI as the base that drives environment-wide automation. If you look closely, automation has already become a regular phenomenon in many organizations’ IT environments. Discovery and orchestration tools are becoming an integral part of datacenter management systems. These tools work on policy-based resource allocation and management. That is why automated management tools like Puppet, Chef, SaltStack, and Ansible are becoming popular.
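The policy-based management these tools popularized boils down to a reconciliation loop: compare the declared (desired) state with the actual state and derive the actions needed to close the gap, so re-running the same policy is safe and idempotent. The sketch below is a generic illustration of that pattern under stated assumptions; the function name and data shapes are hypothetical, not any tool's real API.

```python
def reconcile(desired: dict, actual: dict) -> list:
    """Compare declared policy with observed state and return the
    ordered actions needed to converge `actual` toward `desired`."""
    actions = []
    # Resources the policy declares: create missing ones, update drifted ones.
    for name, spec in desired.items():
        if name not in actual:
            actions.append(("create", name, spec))
        elif actual[name] != spec:
            actions.append(("update", name, spec))
    # Resources present in reality but absent from policy get removed.
    for name in actual:
        if name not in desired:
            actions.append(("delete", name))
    return actions

plan = reconcile(
    desired={"web": {"replicas": 3}, "db": {"replicas": 1}},
    actual={"web": {"replicas": 1}, "old": {"replicas": 2}},
)
```

Running `reconcile` against a state that already matches the policy yields an empty plan, which is exactly the property that lets these tools be scheduled repeatedly without side effects.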
Hyperconverged Infrastructure needs a strong IT transformation strategy
While automation tools are the prime requirement for Hyperconverged Infrastructure, there has to be a suitable IT transformation strategy in place before anything else, because the overall goal is to create a simple and accelerated environment.
Two industry segments will witness maximum utilization of Industrial IoT (Internet of Things), as per the latest IAMAI-Deloitte report released today during the ‘IoT for Smart India’ summit held in New Delhi. The title of the report is ‘Demystifying IoT for Digital Transformation’. Another key finding of the report is that Industrial IoT will surpass the consumer IoT spectrum in India by 2020. A significant target set by the Department of Electronics and Information Technology (DeitY) in its draft IoT policy is to take the IoT industry in India to $15 bn by the year 2020. If India achieves it, it would have around a 6% share of the global IoT industry. IAMAI stands for Internet and Mobile Association of India. There is significant adoption of IoT in India because of new initiatives like Digital India and the Smart Cities Mission.
Industries across all segments are doing their best to leverage technology for growth. That, in turn, shows a steep rise in the adoption of Industrial IoT projects. Most of these projects are taking place in the Energy and Utilities, Transport, Logistics, Industrial Manufacturing, and Agriculture verticals. One major drawback is that these projects are taking place in isolation, thereby increasing the risk of duplicated effort and wasted money. Universal collaboration among the vendors currently putting in staggered efforts on these projects would not only reduce these risks tremendously but also help achieve greater success and faster results.
Industrial IoT needs a high-level of collaboration among vendors and projects
Bikram Bedi, Head of India and SAARC, Amazon Internet Services, says:
“Internet of Things (IoT) as a technology is receiving tremendous attention for the transformative potential it presents. By connecting the physical and digital worlds, IoT vastly expands the reach of information technology and throws up a myriad of possibilities given the ability to monitor and control things in the physical world electronically, and the availability of previously inaccessible data. IoT applications are being deployed across a wide range of use cases including utilities, transportation, agriculture, healthcare, manufacturing, retail, connected vehicles, connected homes and many more. Given the transformative potential and the significant economic impact IoT can drive for a country like India, IAMAI together with the industry, has launched a concerted effort towards catalyzing the IoT ecosystem in the country.”
Harmeen Mehta, Co-Chair, IoT Committee, IAMAI and Global CIO, Bharti Airtel, adds:
“IoT is all set to truly transform and enrich our lives with its digital solutions and innumerous possibilities. India is well positioned to leverage the power of IoT to create massive growth opportunities in the country. At IAMAI, we are fully committed to contributing to this journey and are working closely with the Govt and relevant industry stakeholders to build a vibrant ecosystem that demystifies IoT and works towards developing policy, standards & best practices for IoT connectivity, device protocols, security, mass scale production and cost-effectiveness.”
Digital transformation and skill shortages go hand in hand. Digital transformation involves deep engagement with digital technology, and IT needs to be an innovative partner in adopting and streamlining these technologies for the organization. Innovations must impact all business verticals with the overall goal of enhancing the customer experience. To improve customer experience, you need to improve overall operational efficiency, and for that it is important to learn how to find the inefficiencies in the complete ecosystem. Once you achieve that, business agility improves, and the overall impact is better management of business risk. That is the core purpose of digital transformation in any organization: first and foremost, to keep it ahead of the competition. This demands a number of mindful strategic alliances, both internal and external; it is not wise to keep all expertise in-house.
Progressive organizations face more pressure to get staffing right, so for them digital transformation and skill shortages are a common phenomenon. On the other hand, progressive businesses have the ability to learn and unlearn faster than others. For instance, cloud adoption across an enterprise brings a number of disruptions, and staffing is one of the biggest challenges: with changing scenarios, the organization needs new skillsets. Retraining is one option, but it is time-consuming, and these projects usually run on a fast track. Organizations want to see immediate results but fail to understand that any transformation comes with a certain set of pain points; the speed of deployment is directly proportional to the appetite to bear the pain. Conversely, organizations not serious about digital transformation don't face this kind of IT skill crunch.
Digital transformation and skill shortages create a lot of opportunities
Organizations with a strategic mindset know the pain of digital transformation and skill shortages. They try to find innovative ways to manage the situation before the pain turns severe.
Even today, the majority of an organization's IT department is busy handling day-to-day operations and upkeep tasks. At the same time, enterprises are craving digitalization and thus seeking more strategic alliances with their IT cells. Basically, the demand on IT is shifting from tactical to strategic, and this is becoming a necessity for the IT function of any enterprise. Enterprises are also changing their strategy on the kind of people to hire in IT. Earlier it was more generalists and fewer specialists; the new paradigm is to hire more specialists and outsource the generic jobs that generalists handle. This, in turn, is helping organizations shift from a capex model to an opex model, creating a more flexible and adaptable environment and reducing the risk of getting left behind.
Strategic Alliances Are the Key to Digital Transformation
Usually, larger and older organizations carry bigger technical debt, not only in terms of hardware and software but manpower too. A huge pool of legacy systems, which I discussed in detail in my previous post, doesn't let them integrate easily with their modern systems. Organizations are realizing the need for greater strategic alliances, not only with IT but with all the other departments that are key stakeholders in carrying out digital transformation. Happily or painfully, organizations have to adopt this methodology. As I often say, the first and foremost task is to transform from a tactical to a strategic alliance model. Scrutiny of legacy software and infrastructure is very important to work out how much of a burden they place on the overall mechanism. If businesses have to survive in the current scenario, digital transformation is a must.
No digital transformation is possible without forming strategic alliances with IT. For that, IT needs to be an equal strategic partner in the business.
Working in an enterprise as a CIO/CTO, you inevitably end up buying software applications for one purpose or another. At the same time, a lot of in-house coding keeps happening through the coders and developers on board. It is quite interesting to see how unknowingly and swiftly many of these become technical debt for an enterprise. Let us see how. Over a period of time, these applications become a pool of legacy for the organization. Many of them fall out of use or are left only partially used. Despite that, the upkeep of this whole bouquet of code remains the responsibility of the organization's IT department. What this means is that an invisible elephant starts eating a big chunk of energy and effort without returning anything significant. There are many reasons for this technical debt piling up silently and effortlessly.
Let me give you an example. Around 23 years back, I created a payroll application at one of my early organizations. It had a lot of complications, and it obviously took a lot of time to develop and establish. After working there for 10 years, I joined another organization, and then another. While working at this third organization, I got a request from my first organization: there was some issue in the code of the application I had developed, and they needed my help to fix it. By this time the application was almost a decade old but still in use. In fact, during this period, the home-grown ERP we had during my tenure had been replaced with an international-brand ERP, but for payroll they were still banking on the same application. That was quite surprising even to me.
Technical Debt Appears Silently and Without Any Alarm
On enquiring, the CFO told me that the mainstream international-brand ERP was not able to cater to their payroll requirements. Developing the whole thing again in-house was not feasible for them, and getting a new piece of code from a vendor would incur a huge cost. That is why they decided to carry on with the same age-old legacy payroll application. In my opinion, it was now a sleeping volcano that could erupt at any moment, creating a huge technical debt for them.
If enterprises could find a suitable, stable, and reliable monitoring tool for their production applications, they would not mind shifting to serverless architecture at a faster pace; adoption would become easier and quicker. Monitoring is the biggest visibility challenge in serverless environments. A number of vendors offer serverless monitoring services and capabilities, including SignalFX, Datadog, New Relic, and others. It was AWS Lambda that created the new concept of serverless architecture. The concept was new but quite interesting, offering function as a service (FaaS). Serverless means the organization doesn't need to provision servers. That doesn't mean servers are out of the picture; they are very much there, but the organization doesn't need to manage them. This is quite interesting, isn't it? Then who handles server management? Who ensures scaling at the right juncture?
Serverless architecture involves a metering mechanism that charges users on the basis of certain parameters, such as code execution time and the number of times a function is triggered. That makes serverless monitoring quite interesting. Is it costly? Let's see. Many organizations are already moving from on-site data centers to serverless architecture, which saves them from bothering about containers or even virtual machines. While AWS was the pioneer of serverless technology, there are other players now, like Google Cloud Platform and Microsoft Azure. The serverless model comes with certain benefits, including improved code quality, improved developer productivity, cost savings, and scalability, to name a few. One of the biggest complaints from this technology's users is the lack of visibility into their servers. A serverless environment demands a different monitoring mechanism; normal APM (application performance monitoring) and IM (infrastructure monitoring) systems don't serve the purpose.
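The metering model above can be made concrete with a back-of-the-envelope calculation: FaaS providers typically bill per invocation plus per GB-second of execution time. The rates below are illustrative placeholders, not any vendor's actual price list.

```python
# Hypothetical FaaS billing rates -- placeholders, not real prices.
PRICE_PER_INVOCATION = 0.0000002   # per request
PRICE_PER_GB_SECOND  = 0.0000167   # per GB-second of compute

def monthly_cost(invocations, avg_duration_ms, memory_mb):
    """Estimate a month's FaaS bill from invocation count,
    average execution time, and allocated memory."""
    gb_seconds = invocations * (avg_duration_ms / 1000.0) * (memory_mb / 1024.0)
    return invocations * PRICE_PER_INVOCATION + gb_seconds * PRICE_PER_GB_SECOND

# 10 million requests a month, 120 ms average, 512 MB allocated
print(round(monthly_cost(10_000_000, 120, 512), 2))
```

This is also why serverless monitoring matters: shaving average duration or memory allocation directly reduces the bill, but you can only tune what you can see.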
Serverless Monitoring Doesn’t Gel With Traditional Systems
In a nutshell, we can say that serverless computing is currently at a nascent stage, with vendors experimenting in various ways to attain a substantial reduction in overhead. Some more startups in this field are IOpipe in Seattle, Dashbird in Estonia/San Francisco, OpenGenie (Thundra) in Falls Church, Epsagon in Tel Aviv, and Stackery in Portland. It will be interesting to watch the next moves of biggies like Amazon in the field of serverless monitoring.
What are the enterprise datacenter preferences worldwide? Is the public cloud gaining momentum among enterprises? A number of studies and statistics say the shift is happening, but at a slower pace. While centralized datacenters and core business apps still remain within organizational boundaries, local datacenters are shrinking. One of the key reasons could be a shift towards hosted solutions. Obviously, embracing the public cloud cuts down your investments, especially capital investments, so organizations are moving to public cloud platforms rather than investing in IT infrastructure. Any addition to existing infrastructure not only eats a major chunk of your annual budget but also increases recurring expenses for upkeep and maintenance. Organizations prefer to reduce these costs and enhance their operational performance. Of course, the amount of effort largely depends on certain factors.
For larger organizations, migrating to the public cloud is more challenging; smaller organizations can migrate their workloads more easily. Basically, it depends on the volume of data and the complexity of the databases and applications: simple applications and databases are easier to move. Despite such hurdles, it is interesting to see the public cloud gaining momentum among larger enterprises, and it is worth studying how this shift impacts their IT environment usage and workloads. Existing IT infrastructure and assets become a worry point for organizations deciding to move to the public cloud. The increase in cloud service providers clearly indicates the mood and the trend. In many small organizations, server rooms and local datacenters have vanished altogether. Noticing this trend and its success, even larger organizations are now thinking of moving more workloads to the public cloud.
Public Cloud Gaining Momentum But Slowly
Traditional on-premises deployments are decreasing as the public cloud gains momentum. Colocation is also becoming a favorite choice, especially for mid-level organizations, as a way to consolidate infrastructure and datacenters. Overall, in-house IT footprints are shrinking across organizations of all sizes and geographies.
This post continues my previous two posts. The agenda of these posts is to highlight how enterprises can leverage machine learning in various segments to enhance their business decisions. In my first post, we discussed How Machine Learning Transforms Customer Experience in CRM. In the next post, we talked about How To Use Machine Learning In Supply Chain Analytics. In this post, we will discuss a few more important use cases applicable to most enterprises. So, let us start with a few more machine learning use cases for enterprises. The next use case that comes to my mind is data analytics; in fact, I think it is the first use case that originated as soon as machine learning came into existence. The good point here is that it can easily handle unstructured data, making analytics more meaningful with wider coverage of relevant data.
When we talk about machine learning use cases for enterprises, analytics becomes the foremost priority. The reasons are the coverage of wider datasets and the capability to build predictive models while embracing unstructured data; it can even result in prescriptive analytics. The real beauty is letting it be used by those who are not data scientists, so the real power comes into the hands of business people who need to take business-critical decisions in time. The next use case we can discuss here is HCM, i.e., Human Capital Management. Machine learning is already impacting, or rather empowering, HR specialists in the recruitment, development, training, growth, measurement, and retention of employees. There has been a radical shift in recruitment, both in the way job-finding sites function and in the way recruiters and organizations identify the most suitable candidates.
Machine Learning Use Cases for Enterprises Are Helping Them In A Big Way
The next class of machine learning use cases for enterprises comprises information security. Through the application of analytics, machine learning is enhancing information security across detection, alerting, correction, and so on. With the increasing number of end users, especially in large organizations, it is impossible for the IT department to check even security logs manually; that is where this technology comes in handy. Machine learning helps a lot in understanding user behavior, identifying risks and vulnerabilities, mitigating risks without manual intervention, and proactively taking appropriate action against external threats.
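As a toy version of the "understanding user behavior" idea, here is a sketch that flags users whose daily login count deviates sharply from their own history, using a simple z-score. Real security products use far richer behavioral models; the data and threshold here are invented for illustration.

```python
from statistics import mean, stdev

def anomalies(history, today, threshold=3.0):
    """history: {user: [daily login counts]}, today: {user: count}.
    Flag users whose count today is more than `threshold` sample
    standard deviations away from their own historical mean."""
    flagged = []
    for user, counts in history.items():
        mu, sigma = mean(counts), stdev(counts)
        if sigma and abs(today[user] - mu) / sigma > threshold:
            flagged.append(user)
    return flagged

history = {"alice": [5, 6, 5, 7, 6], "bob": [3, 4, 3, 4, 3]}
today = {"alice": 6, "bob": 40}   # bob's spike looks suspicious
print(anomalies(history, today))  # flags bob, not alice
```

The key property is that each user is compared against their own baseline, which is what lets this scale past what an IT team could ever eyeball in raw logs.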
When we think about using machine learning in supply chain analytics, a lot of ideas come to mind. There is an important thing to keep in mind before heading towards any conclusion: as we all know, some components of supply chain management still talk in analog terms, and there are certain things you would still perform with pens and clipboards. But there is a brighter side. The other parts, like autonomous trucks, drones, analytics, and driverless cars, use the latest technology. Let us keep our focus on the analytics part for now. Some of the prominent issues in supply chain management are common to the B2C and B2B segments, like same-day delivery, be it a service or a product. The world is becoming more and more demanding day by day, and the boundaries between B2B and B2C are fading.
Every business and customer expects 24×7 availability, which means businesses need to be 'always on'. Customers prefer personalized information, and the expectation is already on the verge of on-demand and real-time. Amazing outcomes are possible using machine learning in supply chain analytics. It works especially well for organizations having trouble handling scale, for example when managing a huge number of stocking locations. Forecasting is another area where businesses require a high level of accuracy, with information coming in real time. Gone are the times of weekly or monthly forecasts; looking at the changing scenarios of businesses, daily or intra-day forecasting is now in demand. To keep getting data in real time, point-of-sale (POS) systems need to be in place and well integrated with the centralized system. Security is a big concern.
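The daily-forecasting idea above can be sketched very simply: a weighted moving average over the most recent POS readings, weighting recent days more heavily. Real supply-chain systems would use seasonal or learned models; the sales figures and window size here are made up.

```python
def forecast_next_day(daily_sales, window=3):
    """Forecast tomorrow's demand as a weighted moving average of the
    last `window` days, with weights 1, 2, ..., window (recent days
    count more)."""
    recent = daily_sales[-window:]
    weights = range(1, len(recent) + 1)
    return sum(w * s for w, s in zip(weights, recent)) / sum(weights)

pos_feed = [120, 135, 128, 150, 160]   # invented units-sold-per-day from POS
print(forecast_next_day(pos_feed))
```

Even a sketch like this shows why POS integration matters: the forecast is only as fresh as the last element of the feed, so a daily (or intra-day) pipeline directly improves accuracy.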
Machine Learning in Supply Chain Analytics Can Create Wonders
There needs to be a mechanism for identifying and alerting against fraud and theft. Another important area is anticipating unusual events in advance, which requires integrating with big data sources like weather systems. None of this is possible without harnessing the benefits of the latest technologies: you need machine learning in supply chain analytics to achieve these goals. One of the best business examples in this regard is UPS's ORION (On-Road Integrated Optimization and Navigation) system, which has been working well for more than a decade, helping UPS drivers find the best possible route with the aid of GPS. There are many other things that only machine learning can handle, like dynamic pricing, online customer handling on social media, fraud detection, and defect detection. And that is not all; these are just a few pointers.
In today's environment, to take complete advantage of machine learning in supply chain analytics, it is important to use technologies and tools like image recognition, social media analytics, and video analytics, and to integrate with relevant information aggregators. Ultimately, the goal is to use technology to take your business to the next level of competition.
Every success in any business has one factor behind it: customer experience. Let us see how it plays out in CRM. The foremost goal of any CRM is to provide a 360-degree view of a customer and thus create a great customer experience, and that is the one thing the whole CRM vertical, on both the customer side and the vendor side, is striving for. The 360-degree customer view was lacking until the early 2000s. It was not possible, or even thought of, because the primary focus at that time was on transactional data residing in various formats in databases.
At that time, analysis touched only structured data. All unstructured data was ignored or treated as useless, even though this unstructured data held most of the valuable customer interactions, like communications, phone calls, emails, and social media posts (though social media posts were far fewer then). The result was, more or less, a partial analysis of customer experience in CRM.
Discarding all this data could only result in a partial customer experience in CRM, because such data was not analyzed at all, given the limitations of the technology in use at that time. Machine learning is now able to give a major thrust to the 360-degree customer view because of its ability to analyze huge volumes of disparate data coming from various sources, whether structured or not. With the evolution of experience, experts have defined four prominent stages in customer analytics: Acquire, Serve, Nurture, and Grow. Let's see how machine learning plays a major role in each of these stages. In the 'Acquire' stage, machine-learning-based use cases include micro-segmentation of prospects, improving the level of accuracy.
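Micro-segmentation in the 'Acquire' stage can be illustrated with the simplest possible clustering: a one-dimensional k-means over a hypothetical "engagement score", splitting prospects into low- and high-engagement segments. Real CRM models would cluster over many features; the scores and the tiny k-means below are purely illustrative.

```python
def kmeans_1d(values, k=2, iters=20):
    """Minimal 1-D k-means: seed centroids from evenly spaced sorted
    values, then alternate assignment and centroid update."""
    centroids = sorted(values)[::max(1, len(values) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[idx].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

scores = [0.1, 0.2, 0.15, 0.8, 0.9, 0.85]   # invented engagement scores
centroids, segments = kmeans_1d(scores)
print(sorted(round(c, 2) for c in centroids))
```

Each resulting segment can then get its own acquisition treatment, which is exactly the accuracy gain micro-segmentation is after.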
Customer Experience in CRM Is A Constant Evolving Journey
Similarly, in the 'Serve' stage, machine learning can create an intelligent chatbot or virtual assistant for customer self-service. That by itself simplifies many complex processes and makes things simpler for customers, who find a lot of value in it. Optimizing average hold time, handling standard requests without involving a customer service representative, and delivering faster are some of the gains for the customer. Machine learning helps a lot in the 'Nurture' stage, transforming customer experience in CRM in a big way: it manages customer interactions in such a way that the annoyance factor goes away and the satisfaction level goes up, removing the customer friction points. Finally, in the 'Grow' stage, machine learning optimizes and customizes by providing the most suitable offers, enhancing conversion rates and profitability.
Nearly 50% of businesses currently use IaaS (Infrastructure as a Service), and around 12-15% growth is estimated for the next 12 months. Doesn't that clearly indicate that a majority of businesses will have moved to public clouds within the next year? Do enterprises fear the public cloud, or do they have a valid reason and a high degree of clarity in resisting it? Are they really resisting it? Cloud services include SaaS, IaaS, and the private cloud, and most organizations are using them in one form or another, at least for one business application catering to one critical function. A study also says the majority of investment in hosted infrastructure is going to public clouds (IaaS). Still, the public cloud is far from wide acceptance. What could be the reason? There is a hidden war going on between vendors and enterprises.
There are, in fact, different kinds of scenarios. More than 40% of organizations are not using the public cloud, and they don't intend to in the near future. On the other hand, there are organizations that intend to adopt it but are slow to adapt. The third segment consists of organizations that rely completely on the private cloud, whether hosted or in-house. Cost and security are the two key concerns businesses have in mind when it comes to public cloud adoption. It also demands a huge transformation for which they don't seem mentally ready, perhaps because of the attributes of their organizations or their IT setups. Still, some prominent patterns have emerged: for instance, there is a direct connection between the size of an organization and its IaaS adoption.
Public Cloud adoption is far from expectations
On the other hand, this trait takes a reverse sweep when it comes to the age of the business: companies under five years old have the highest rates of IaaS adoption, while older businesses are late and slow adopters. All of this contributes to public cloud resistance from various perspectives.
Smart visualization is not the only way AI is transforming BI in a big way; there are other ways too. Before coming to those, let us discuss smart visualization a little more. As we saw in my previous post, it helps eliminate the gap between experts and non-experts. That means it helps achieve better business results by actually involving and engaging business experts who are not too tech-savvy and don't use any query languages. They therefore don't need tech assistants in boardrooms and other top-level meetings to run the critical analytics that help them make crucial business decisions in time. Machine learning can even suggest the right graphic for the right query, making AI much easier to use. Another important tool is embedding AI into data storytelling, which happens with the help of NLG technology.
AI is able to make data storytelling a powerful tool through the integration of NLG technology. NLG makes storytelling more narrative-driven: it tells narratives by employing the data behind visualizations and business dashboards. NLG generates words and sentences from data using NLP, and it is quite interesting to understand how this happens. There are a number of recent business case studies where NLG is integrated into dashboards to enhance data storytelling. This provides critical business insights that are not easily grasped from numbers and graphics alone. It uses natural-language sentences to add context and understanding, which gives a different meaning to the visualizations, reports, and metrics in dashboards. The whole purpose is to make them easier to comprehend.
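A bare-bones sketch of the idea: turn dashboard metrics into a narrative sentence with a template. Commercial NLG engines generate far richer, learned narratives; the metric name and figures below are invented.

```python
def narrate(metric, current, previous):
    """Generate a one-sentence narrative from two metric readings."""
    change = (current - previous) / previous * 100
    direction = "up" if change > 0 else "down"
    return (f"{metric} is {direction} {abs(change):.1f}% versus the "
            f"previous period, at {current:,}.")

print(narrate("Quarterly revenue", 1_250_000, 1_000_000))
```

Even this trivial version shows the value proposition: a sentence like the one produced here carries the comparison and the direction of change that a reader would otherwise have to compute from two numbers on a chart.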
Data Storytelling Is Evolving At A Faster Pace
That is one of the reasons for the fast growth of NLG integration into dashboards. This integration not only makes data storytelling easier but also makes it easier to query the data, with the help of NLP algorithms, so the story can be told in a comprehensive and impressive manner.
AI is transforming business intelligence in a big way. The era of pilots and POCs is over; it is action time now, and things are happening in production. As we all know, Artificial Intelligence (AI) is a combination of a number of technologies like Machine Learning (ML), Natural Language Processing (NLP), and Natural Language Generation (NLG). This integration of AI with BI is creating wonders: it makes analytics crisper and more user-friendly. This will make BI accessible to the masses (the non-experts), speeding up its adoption. Organizations are becoming data-driven, which helps greatly in decision-making. The real catch is to make BI so friendly for business users who are not data analytics or BI professionals that they get real benefit out of it.
There is something called smart visualization: visualizing the data smartly. It works with the help of visual query technology, which supports visual analysis by graphically answering BI queries in charts, graphs, and other visual forms. That makes analytics faster to perform and easier to operate, and it removes a hurdle to adoption by removing the requirement to write queries in code. That was one reason BI, despite being a powerful tool, was out of reach for users without SQL or other programming skills. Introducing machine learning into smart visualization is emerging as a smart way to apply AI to BI, removing the wide knowledge gap between experts and non-experts. This is how AI is transforming business intelligence in a big way.
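A toy version of "the right graphic for the right query" is a heuristic that maps the shape of a BI query to a chart type. A real smart-visualization engine would learn this mapping from usage data; the rules and categories below are invented for illustration.

```python
def suggest_chart(dimension_type, n_measures):
    """Suggest a chart type from the query's dimension kind and the
    number of measures being plotted -- a hand-written stand-in for a
    learned chart-recommendation model."""
    if dimension_type == "time":
        return "line chart"            # trends over time
    if dimension_type == "category" and n_measures == 1:
        return "bar chart"             # one measure across categories
    if dimension_type == "category" and n_measures >= 2:
        return "grouped bar chart"     # compare measures per category
    return "table"                     # safe fallback

print(suggest_chart("time", 1))
print(suggest_chart("category", 2))
```

The non-expert never writes a query: they pick a dimension and some measures, and the system answers with an appropriate visual, which is exactly the adoption hurdle smart visualization removes.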
AI is Transforming Business Intelligence through Smart Visualization
Smart visualization is enabling users to create powerful dashboards with impressive infographics, without necessarily having deep data-analytics skills. That is a wonderful way for AI to transform business intelligence and make it accessible easily and effectively.
With the changing trends in converged infrastructure and servers, it is interesting to see the shuffle taking place across the globe between on-premises and off-premises. IT infrastructure is transforming in a big way, and so are the perceptions of today's IT managers. For many organizations, the decision between on-prem and off-prem is far from clear. On the other hand, many are changing their IT strategies to make more room for off-prem and reaping the benefits. Organizations still face trouble moving towards complete orchestration and automation, whether due to lack of in-house knowledge, lack of business direction, or difficulty finding the right vendor, and this is happening even at organizations with the most skilled IT teams and no shortage of funds.
One thing is true about the on-premises vs off-premises decision: even small progress in favor of the latter promises tangible infrastructure-provisioning gains. The role of servers and converged infrastructure is changing drastically, with hyper-convergence, workload balancing, and containerization as the key factors. Basically, it is all about right-sizing for the future. Many organizations are sure that their existing infrastructure is more than enough for their future needs, requiring only a little expansion and tweaking but no major replacements. On the other hand, some very large enterprises worry about a wide gap between what they currently have and their future needs. Hyperconverged infrastructure, which earlier played a minor role, is becoming a core need for organizations, and this is driving major changes in data centers.
Things are Clearer for On-Premise vs Off-Premise
Similarly, a large number of organizations are using or talking about containers. They believe container technology has an inherent ability to speed up application provisioning times. These are my thoughts on the current trends in on-premises vs off-premises.
This is my concluding post in the three-post series on Top Security Concerns for 2018-2019. As we have seen, encryption, SOC, and mobility remain the top concerns. In my last post, we talked about the increasing mobility of an organization's employees, which automatically creates a new demand: tackling staff mobility between various network environments. Users need to access a number of applications and services in a heterogeneous environment, some on-premises and some on the cloud. With a tremendous increase in encrypted tunnels, managing the whole ecosystem is becoming difficult, so the new equation is to have complete visibility and control of the endpoints. Strategies and priorities are changing shape at a faster pace. Endpoint security functionality is, in fact, a vast area of work.
As a matter of fact, endpoint security begins with telemetry collection for analysis and extends to complete lifecycle security, inspection, detection, and real-time response. The toughest task is to integrate all of this on a single platform, which brings in a new layer of vendors focused entirely on endpoint security, including CrowdStrike, SentinelOne, Carbon Black, ESET, Endgame, Cybereason, and Cylance. IoT security and connected devices are the next big thing among major security concerns: more sockets, more endpoints, more devices, and more code automatically bring more scope for vulnerabilities and threats to any enterprise. In this context, at the recently concluded RSA Conference in San Francisco, Microsoft launched Azure Sphere, a new security platform focused on protecting any kind of embedded device in a smart manner. This is the need of the hour.
The Scope of Endpoint Security Has Increased Tremendously
Azure Sphere is a combination of hardware and software. It consists of secure microcontrollers providing a hardware-based root of trust to ensure a secure boot, along with cryptographic authentication and complete protection of device communications. Surprisingly, this is the first non-Windows OS from Microsoft: the OS is a custom Linux kernel, chosen for two reasons, speed and security. Another surprising move by Microsoft is enabling Azure Sphere to run on any cloud, not limiting it to Azure. Azure Sphere aims to secure the whole IoT ecosystem, right from the component level to the cloud. There will be more to see from other vendors soon, but all vendors realize the need for endpoint security and the other security concerns we discussed in these three posts.
This post continues my previous post on Top Security Concerns for 2018-2019. The key security concerns are encryption, SOC, and mobility. In the new concept of 'encryption-in-use', vendors use a number of techniques to tackle the encryption problem: homomorphic encryption, secure multi-party computation, and secure enclaves. These techniques let vendors grant access to data for various purposes without decrypting it. The same technologies are now being used in cryptographic key management, delivering the security of hardware-based key management in software and thus providing greater flexibility and adaptability at lower cost. A similar technological transformation is taking place in the SOC. There is a pressing need for the SOC of the future, and there are distinct guidelines for it.
The traditional SOC works on the SIEM model: it stores event logs and alerts, which feed analytics engines, guide investigation teams, drive SAO processes, satisfy search requests, and interface with custom scripts. But that is no longer enough to tackle current security risks. Newer methods use traces in network communications to identify attacks as they happen in real time; the focus now is on incident detection, exception reporting, and response activity. This new array of vendors includes FireEye, Awake Security, Palo Alto Networks, ExtraHop, Gigamon, Darktrace, Corelight, and Vectra Networks, and these newer technologies are becoming an integral part of SIEM deployments. Next comes endpoint security, which was one of the most discussed topics at the RSA Conference.
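To make this concrete, here is a tiny, hypothetical Python sketch of the kind of detection logic these newer platforms layer on top of raw event logs: counting failed logins per source and flagging sources that cross a threshold. The event format, field values, and threshold are illustrative assumptions, not any vendor's actual interface.

```python
from collections import Counter

# Hypothetical telemetry: (source_ip, event_type) pairs as they might
# stream out of a SIEM's event log.
events = [
    ("10.0.0.5", "login_failed"),
    ("10.0.0.5", "login_failed"),
    ("10.0.0.5", "login_failed"),
    ("10.0.0.5", "login_failed"),
    ("10.0.0.9", "login_failed"),
    ("10.0.0.9", "login_ok"),
]

def flag_suspicious(events, threshold=3):
    """Flag sources whose failed-login count meets the threshold."""
    failures = Counter(ip for ip, kind in events if kind == "login_failed")
    return sorted(ip for ip, n in failures.items() if n >= threshold)

print(flag_suspicious(events))  # ['10.0.0.5']
```

Real products do far more (correlation across protocols, behavioral baselining), but the principle is the same: detection and response logic runs continuously over the telemetry rather than waiting for an after-the-fact search.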
Encryption, SOC, and Mobility Remain the Top Security Concerns
As we all know, employee mobility is an increasing trend worldwide, and it is a point of concern from a security standpoint. Almost 60% of employees in a typical organization demand mobility between network environments, because they need to access a number of on-premises and cloud services through various secured or encrypted tunnels. All of this is becoming difficult to manage and inspect. As we see, encryption, SOC, and mobility are changing the whole concept of security.
Finally, we shall conclude Top Security Concerns for 2018-2019 in the next post.
There were more than 600 exhibitors at this year's RSA Conference in San Francisco in April, and attendance nearly touched 50,000. Some very prominent cybersecurity concerns will keep IT managers on their toes for at least the next couple of years. Those in Asia Pacific and Japan can register for the upcoming RSA Conference 2018 Asia Pacific & Japan, at Marina Bay Sands, Singapore, from 25 to 27 July 2018, where more than 65 speakers will enlighten attendees on different aspects of cybersecurity, including the top security concerns for 2018-2019. The zero-trust philosophy is strengthening its roots in this field. It is about discarding the orthodox, legacy security model, which says trust anybody inside the premises and distrust everybody outside your perimeter. That whole architecture is faulty and risk-prone.
Hence, the new concept is to trust no one, and that is probably the right approach to tackling the top security concerns for 2018-2019. Zero trust is based on what we call a reference framework or reference architecture, which is independent of the technology in place. There is no logic in granting everyone inside the perimeter access to resources such as servers, applications, networks, and devices; rather, enterprises need to change how they think about security policies. The zero-trust concept spans all kinds of vendors, including MFA, IDaaS, network security, SD-WAN, and CDN service providers. Encryption, too, is becoming a challenge for security experts. You can keep the whole path carrying the encrypted data secure, but what about security at the point of decryption? That point is highly vulnerable to attack from inside as well as outside.
Top Security Concerns for 2018-2019 Call to Trust No One
The points where data moves from encryption to decryption and back are open to attackers holding compromised credentials, as well as to malicious insiders. To counter this risk there are new concepts: 'encryption-in-use' and virtual HSMs (vHSMs). A vHSM is a software suite that stores secret data outside the virtualized application environment. Key vendors in this new technology include Baffle, Enveil, Fortanix, Unbound, Inpher, and PreVeil.
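To illustrate one of the ideas behind 'encryption-in-use', here is a toy Python sketch of additive secret sharing, a building block of secure multi-party computation: a value is split into random shares that are individually meaningless, yet shares of two values can be added locally and recombined, so the sum is computed without any single party ever holding both plaintexts. This is a didactic sketch under simplified assumptions, not any vendor's implementation.

```python
import random

PRIME = 2**61 - 1  # field modulus; all arithmetic is done mod this prime

def share(secret, n=3):
    """Split a secret into n additive shares that sum to it mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    return sum(shares) % PRIME

random.seed(42)
a, b = 1000, 234
sa, sb = share(a), share(b)
# Each party adds its own two shares locally; nobody sees a or b in full.
sum_shares = [(x + y) % PRIME for x, y in zip(sa, sb)]
print(reconstruct(sum_shares))  # 1234
```

Production systems (homomorphic encryption, secure enclaves) are far more sophisticated, but the goal is the same: useful computation on data that is never exposed in the clear at a single vulnerable point.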
We shall continue with Top Security Concerns for 2018-2019 in my next post.
Machine learning has been in practice for quite some time now, and enterprises are adopting it fast to leverage its power. It definitely helps a business excel and stay ahead of the competition. Machine learning, as we all know, has tremendous power to automate and optimize any simple or complex business process. There are a number of machine learning use cases to pick from in the enterprise world; enterprises either are already working on them or have immediate plans to deploy them, and those still staying away will feel the brunt sooner or later. It is always better to identify a critical business issue and then work toward addressing it through machine learning. Machine learning deployment can enhance a business in many ways: it can create an artificial intelligence spectrum to help with critical business decisions, and it helps automate business processes, analytics, and operations.
As a matter of fact, machine learning use cases can be derived from many areas of the business. Any business task that is repetitive and/or mundane in nature is one prominent area; another is activities that involve a high degree of risk or danger. It also helps a great deal with quality improvement and operational issues. Obviously, machine-learning software is meant to help humans, not replace them, which is why it is wise to automate most of your low-level tasks so that the human mind can concentrate on more complex ones. Ultimately, it is a man-machine combination that will manage any kind of business. Three components of the business play a major role in machine learning development: data (both input and output), the model, and the algorithm.
Machine Learning Use Cases Rely On Data, Algorithm, and Model
Data, in fact, is the most valuable asset of any business, and it is the backbone of all machine learning use cases. Data is the real driver of business and business decisions.
A Generative Adversarial Network (GAN) is never a single network. It is a set of at least two networks operating in the same place but working against each other, each producing its own results. In the GAN approach, the first network creates realistic images, while the second one judges whether they are real or not. The first network synthesizes something; the second monitors its operation and critiques what it creates. As time passes, the second network trains the first to create synthetic images so perfect that nobody can tell they are fake.
That means the fake images the first network now produces are as good as real ones; you won't be able to distinguish them, and it is practically impossible to differentiate between the two. That is the purpose of Generative Adversarial Networks. Now, think about its applicability: which business or industry would need such technology, and for what? Let us consider a few use cases; in fact, there are plenty, and many are already in production. The first could be creating fake but realistic healthcare data. The purpose of such records is to train various machine-learning models using different algorithms, and since you are not using real data, there is no infringement of patient privacy.
There Are Various Use Cases of Generative Adversarial Networks
In fact, Generative Adversarial Networks are now a classic approach in machine learning. Another use case is creating fake malware in order to test an anti-malware application. There are also plenty of operational projects in the field of fake news videos and fake images of celebrities and famous personalities. Looking at the approach a GAN follows, it matches unsupervised machine learning, yet from another point of view there is adult supervision in it. So, would you call it an advanced form of unsupervised machine learning, or a mix of supervised and unsupervised machine learning?
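The adversarial loop described above can be sketched in miniature. The following Python example, a toy and not a production GAN, pits a one-parameter 'generator' against a logistic 'discriminator' over plain one-dimensional Gaussian data; real GANs use deep networks and image data, and the distributions, learning rate, and update rules here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))

# Generator: x = w*z + b, trying to mimic real data ~ N(4, 0.5).
# Discriminator: D(x) = sigmoid(a*x + c), trying to tell real from fake.
w, b = 1.0, 0.0          # generator parameters
a, c = 0.0, 0.0          # discriminator parameters
lr, batch = 0.05, 64

for step in range(3000):
    real = rng.normal(4.0, 0.5, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = w * z + b

    # Discriminator step: ascend log D(real) + log(1 - D(fake)).
    d_real, d_fake = sigmoid(a * real + c), sigmoid(a * fake + c)
    a += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    c += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator step: ascend log D(fake) (non-saturating loss), so the
    # generator learns to fool the current discriminator.
    d_fake = sigmoid(a * fake + c)
    grad_x = (1 - d_fake) * a
    w += lr * np.mean(grad_x * z)
    b += lr * np.mean(grad_x)

print(round(b, 2))  # generator offset should drift toward the real mean, 4
```

The second network never shows the first what real data looks like; it only scores the fakes, and that score alone pulls the generator's output toward the real distribution.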
Reinforcement learning is an important category of machine learning algorithms, with a classical connection to theories of behavioral psychology. Here we speak of a reinforcement learning environment: the whole game is this environment training algorithms while keeping a close eye on their performance and, on that basis, rewarding or punishing them. Before going further, let us go a few posts back to understand the background of machine learning better. In one of my previous posts, we saw how important it is in the current cutting-edge environment for enterprises to adopt machine learning. On the one hand, businesses face tougher conditions; on the other, technology offers plenty of scope to enhance and excel against tough competition. This is an era in which every disruption is an opportunity.
We also learned in a previous post about the straight-line connection between deep learning, machine learning, and AI (artificial intelligence), along with supervised and unsupervised machine learning types, the difference between training and inference, and the relationship between datasets, algorithms, inputs, and outputs. It is worth reading those posts, and the previous one on unsupervised machine learning and its examples, before going further. Now, coming back to reinforcement learning: it works on a philosophy of rewards and punishments. During each step of the training process, the learning algorithm selects an action from a pool of possible actions and receives an observation and a suitable reward. The model then keeps accumulating positive or negative feedback by running the same process repeatedly, all in a very dynamic environment.
Reinforcement Learning Works On Rewards and Punishments
Since a reinforcement learning algorithm aims to collect the maximum possible reward, it uses that goal to guide its next decision. It works quite intelligently: it can sacrifice short-term gains if it perceives greater long-term gains in exchange. This technology works best in gaming, robotics, and telecommunications.
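A minimal sketch of this reward-driven loop is tabular Q-learning on a made-up corridor world of five states with a single rewarding goal (an illustrative toy, not a production setup): at each step the algorithm picks an action, receives a reward or nothing (the 'punishment'), and updates its estimate of long-term value.

```python
import numpy as np

# Toy 5-state corridor: start in state 0; reaching state 4 pays 1.0,
# every other move pays nothing. Actions: 0 = left, 1 = right.
N_STATES, GOAL = 5, 4
rng = np.random.default_rng(1)
Q = np.zeros((N_STATES, 2))
alpha, gamma, eps = 0.5, 0.9, 0.2  # learning rate, discount, exploration

def step(s, action):
    s2 = max(0, s - 1) if action == 0 else min(GOAL, s + 1)
    return s2, (1.0 if s2 == GOAL else 0.0)

for episode in range(200):
    s = 0
    while s != GOAL:
        act = rng.integers(2) if rng.random() < eps else int(np.argmax(Q[s]))
        s2, r = step(s, act)
        # Reward (or its absence) nudges the value estimate for (s, act).
        Q[s, act] += alpha * (r + gamma * np.max(Q[s2]) - Q[s, act])
        s = s2

print([int(np.argmax(Q[s])) for s in range(GOAL)])  # learned greedy policy
```

The discount factor `gamma` is exactly what lets the learner trade short-term reward for long-term gain: moving right earns nothing immediately, yet the discounted value of the distant goal still makes "right" the winning action in every state.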
In my previous post, we learned about machine learning and supervised machine learning. Carrying that forward, in this post we will learn about unsupervised machine learning and its uses. Machine learning is a subset of artificial intelligence. Supervised machine learning has two important steps, training and inference, with inference happening after training completes. The whole mechanism works on an algorithm and datasets, and supervised machine learning has its own distinct algorithms and datasets. Its main use is making future predictions on the basis of new data. Although it works on a situation that has yet to take real shape, the prediction process works in a near-to-perfect state. That is the beauty of machine learning and its various applications.
Unsupervised machine learning applies when training sets are absent or scarce. In such a situation you have no idea what shape the output will take. Unlike supervised machine learning, here all the input data is unlabeled or unstructured; the goal lies somewhere inherent in the data itself. Whereas in the previous category of machine learning we try to predict the future with future data, here we try to understand the present, but without any labeled or structured data. The algorithm, again, plays the key role in drawing out a meaningful result.
Unsupervised Machine Learning Works With Unlabeled Input Data
Unsupervised machine learning algorithms fall into two main categories. The first is clustering, used to find hidden patterns or groupings in data. The second is association, used to find rules that explain parts of the data, such as "people who visit this place also visit that place." An example of an unsupervised machine learning algorithm is K-means clustering; another is Apriori. The former is a clustering algorithm, the latter an association algorithm. The best use of these two categories of machine learning is together: when you combine unsupervised and supervised techniques, you can easily and effectively use the output of unsupervised machine learning as the training set for supervised machine learning.
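The clustering idea fits in a few lines. This sketch implements K-means from scratch on made-up, unlabeled two-dimensional data; the data, the simple deterministic initialization, and the iteration count are all illustrative assumptions rather than a production recipe.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two hidden groupings in unlabeled 2-D data: no output labels are given.
data = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])

def kmeans(X, k, iters=20):
    # Simple deterministic init: spread starting centers across the dataset.
    centers = X[:: len(X) // k][:k].copy()
    for _ in range(iters):
        # Assign each point to its nearest center, then recompute centers.
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

centers, labels = kmeans(data, 2)
print(np.round(sorted(centers[:, 0]), 1))  # one center near 0, one near 5
```

Note that the algorithm recovers the two groupings without ever seeing a label; the cluster assignments it produces could then serve as the training set for a supervised model, as described above.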
While many treat AI, machine learning, and deep learning as synonyms, they are not. Each is distinct, and so are its purpose and functionality. In fact, there is an interesting straight-line relationship between the three: machine learning is a subset of AI (artificial intelligence), and deep learning, in turn, is a subset of machine learning. Let us look at the algorithms and models that underpin this. A machine learning algorithm is a series of actions or computations. The best way to understand this is to think of random forest: apply random forest to a dataset, and the output produced is the model. The model will change if the data changes; another way to look at it is to use the same data with a different algorithm, which also yields a different model.
Basically, two important steps to understand in machine learning are training and inference. Treat training as a process of optimization: we use an algorithm for the specific purpose of deriving a mathematical function that minimizes error on the training data. Once training is over, inference comes into the picture, making predictions on the basis of new incoming data. Now, what is supervised machine learning? In supervised machine learning, we use algorithms trained with labeled datasets, which are highly structured and organized. You use independent variables as input and get numeric or categorical results as output. As a result, we use this technology to predict future results on the basis of future inputs.
Supervised Machine Learning Is For Future Predictions
Supervised machine learning is of two types. The first is classification, where the output is a category: this or that, good or not good, relevant or not relevant, and so on. The second is regression, where the output is a value: dollars, temperature, and the like. Support Vector Machines (SVMs) fall into the first category.
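A minimal sketch of both steps, training on labeled data and then inference on new inputs, is a tiny logistic-regression classifier; the hours-studied dataset and the hyperparameters below are invented purely for illustration.

```python
import numpy as np

# Labeled training set: hours studied (input) -> passed exam (category).
X = np.array([1.0, 2.0, 3.0, 4.0, 6.0, 7.0, 8.0, 9.0])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])  # 0 = fail, 1 = pass

# Training: fit a logistic model by gradient descent, steadily reducing
# the error on the labeled training data.
w, b = 0.0, 0.0
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-(w * X + b)))  # predicted probabilities
    w -= 0.1 * np.mean((p - y) * X)         # gradient of the log loss
    b -= 0.1 * np.mean(p - y)

# Inference: predict the category for new, unseen inputs.
def predict(hours):
    return int(1.0 / (1.0 + np.exp(-(w * hours + b))) > 0.5)

print(predict(2.5), predict(8.5))  # classify two new students
```

Swapping the binary labels for continuous values (and the sigmoid for a plain linear output) would turn this same loop into the regression case.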
Any enterprise not working with any of these three technologies today will certainly be in trouble tomorrow: artificial intelligence (AI), machine learning, and deep learning. All three are deeply connected, and together they are the most powerful transformational technologies of the current period. They are set to touch most aspects of our lives, whether we know it or not; that is the penetration they will have on mankind. As a matter of fact, the combination of AI and ML is taking practical shape in production rather than merely demonstrating its possibilities in academia and R&D; it has become one of the most sought-after enablers in today's technology. Industries are working on it with real-life use cases and ROIs, and adoption is increasing at an exponential rate across the globe.
Many startups working in this area have integrated machine learning well with business requirements and drawn out appropriate results to enhance and automate business processes, and the results are phenomenal. It brings a tremendous increase in the power, availability, applicability, and flexibility of resources. Without these technologies it would probably be impossible to handle the volume and accessibility of digitized data flowing in from various sources. Efficiency keeps improving despite the growing complexity and volume of data only because machine and deep learning algorithms are the driving factors of AI. As a result, the future appears more promising thanks to the fast adoption of machine learning in enterprises; companies not adopting it even now will soon be out of the race.
Machine Learning Is Changing the World Faster
In all these circumstances, control, access, and ownership of data are the keys to driving your business.
Information security is the utmost priority for any business these days. While there are a number of projects a CIO/CTO/CISO can initiate in an organization, a few are important enough to keep at the top of the agenda. These projects are not one-time activities; they are continuous in nature, basically working on the PDCA pattern: plan, do, check, act. Deployment, in other words, is not the end of the project lifecycle; rather, the real project begins there. Once you deploy any information security project, regular audits and enhancements are needed. Technology is changing and progressing very fast, and the same applies to its negative side: the more you secure a system, the more new vulnerabilities emerge. And threats to an organization come not only from the external world; the threat from inside is just as serious.
To cope with all these threats and vulnerabilities, an organization must have an assessment mechanism in place. Following is a list of information security projects for an organization, all critical irrespective of the size and volume of the business. If any are not yet in place, make sure they are put in place.
- Vulnerability Assessment
- Data Loss Prevention (DLP)
- Mobile Device Management (MDM)/Enterprise Mobility Management (EMM)
- Artificial Intelligence/Machine Learning for security
- Security Automation
- Security Operations Changes
- Security Awareness Initiatives
- Cloud Infrastructure Security
- Cloud Access Security Broker (CASB)
- Monitoring Improvements
- Patch Management
- Multi-factor Authentication
- Security Information and Event Management (SIEM)/Security Analytics
- Application Security
- Firewall Deployment/Management
- Regulatory Compliance (e.g. PCI Compliance, GDPR, PSD2, NIST)
- Privileged User Management
- Incident Response
- Intrusion Management
- Identity As A Service (IDAAS)/Single Sign-On
- Endpoint Security
Information Security Projects Not Started in Time Can Lead to a Big Loss
Another point to note: for the top information security projects currently being implemented within your organization, how do you ensure the key determinants are in place to get approval in time? Otherwise, your information security projects will remain only on paper and never see the light of day.
Zoho is probably among those unique businesses where the business model, business benefits, customer value, and product value remain the same irrespective of the customer's size. Whether the customer is a lone professional, a one-person company, or a multinational, the pricing and all business and support propositions remain the same. There is no disparity, no confusion, and hence no differentiation. That is the beauty of this international Indian company, which has more than 6,000 employees working in more than 19 offices across the globe. The latest news is that its financial suite, Zoho Finance Plus, is 100% GST compliant. It also has much more to offer than a number of other popular and tremendously costlier products; basically, it is all about awareness of the beauty of the product, which offers minimal investment and great benefits to the business.
Businesses struggling with their existing applications to cope with GST regulations and requirements should look at the smooth operations and outcomes Zoho Finance Plus has to offer, and at a very nominal cost. GST is a mandatory regulatory financial requirement for all businesses. The product offers complete integration with legacy business applications. What businesses lack while working with other, more expensive applications is 360-degree visibility into their order and fulfillment cycle, something missing even in most world-class business and ERP suites. That is where this product takes the front seat, ensuring zero accounting errors and a trouble-free tax period. By launching this GST-compliant financial suite in April 2017, well before the official roll-out of GST by the Ministry of Finance, Government of India, Zoho demonstrated phenomenal strength, depth, and dedication to its customers.
Zoho Finance Plus Is a Complete Finance Suite++
Zoho Finance Plus, like Zoho's other products, is cloud-based, with zero capital investment and nominal operational cost. Sivaramakrishnan Iswaran, Director of Product Management, Zoho, puts it well: “The proliferation of smartphones, broadband connectivity, and upcoming GST regimen is a great opportunity for businesses to move their accounting and other operations online. With Zoho Finance Plus, businesses get a beautiful interface to manage their transactions day to day and file their GST returns, all from a single platform. Zoho Finance Plus simplifies returns filing for businesses and increases compliance.”
GST is a business reform rather than merely a tax reform. Being mandatory for every business, the success of its implementation depends largely on the technology infrastructure and the right strategies a business has in place. Zoho Finance Plus is becoming the first choice of millions of SMEs, empowering them with the right application for invoicing, filing tax returns, and other critical transactions, all while being completely GST-enabled and compliant. No business thrives on a single application; a number of financial apps need comprehensive integration to communicate with each other and exchange data seamlessly. The Zoho finance suite does exactly that, ensuring management and key users have real-time information for fast, sound business decisions. Filing a return on the GST portal becomes a matter of a single click.
Zoho Finance Plus Includes Different Modules on a Single Database
Zoho Finance Plus includes modules such as Zoho Books, Zoho Invoice, Zoho Expense, Zoho Subscriptions, and Zoho Inventory. When filing GST returns from these modules, there is no data duplication across apps and no need to add transactions manually; everything flows smoothly and flawlessly. In fact, Zoho Books creates monthly returns automatically, so filing a return is just a matter of clicking a button. In addition, transactions are matched and reconciled automatically. So, along with One Nation, One Tax, it becomes One Vendor. The product has many other key features, such as greater visibility into orders and payments, faster reimbursements, and accurate accounting.
The pricing model of Zoho Finance Plus (https://www.zoho.com/in/financeplus/) is quite simple: INR 2,999 per organization per month, which includes 10 users with access to multiple Zoho Finance apps. These capabilities gave GSTN ample confidence to select Zoho as a GST Suvidha Provider (GSP) (https://www.gstn.org/ecosystem/); in total, there are no more than 70 GSPs in India (https://www.gstn.org/gsp-list/). Iswaran adds, “Being a GSP ourselves helps in cost optimization and providing a great experience for our business users. Furthermore, we will leverage our in-depth expertise in developing platforms and ecosystems to support a thriving community of Application Service Providers (ASP) connecting to GSTN through us.”
In fact, Zoho (https://www.zoho.com/) has made accounting and reconciliation simpler and hassle-free by partnering with banks such as ICICI and Standard Chartered to give customers using Zoho Finance Plus an entirely different kind of experience.
Archive360 was recently included on Microsoft's list of partners helping customers on their GDPR journey. The list is part of Microsoft's latest blog post, titled “Leverage the Channel Ecosystem to Create GDPR Offers”. GDPR, the General Data Protection Regulation, impacts all organizations across the globe that do any kind of business with EU firms. In this context, I had a discussion with Dan Langille, Global Director, Microsoft P-AE, at Archive360 (www.archive360.com). Here are the key points of our discussion:
What were the qualifying criteria for becoming a Microsoft partner in tackling customers' GDPR issues?
Dan: Partners, such as Archive360, that have services and/or solutions which assist customers with their journeys toward GDPR compliance must be nominated for consideration by a Microsoft employee (usually the Partner Development Manager). Those nominations go to a team Microsoft has within its overall One Commercial Partner (OCP) organization which reviews and approves (or declines) the services and solutions.
What value does becoming a Microsoft partner add to an organization?
Dan: Microsoft’s investment in curating this list of partners is yet another great example of Microsoft’s commitment to being a partner-led organization focused on collaborating with partners to drive high-value business outcomes (in this case GDPR compliance) through the Co-Sell motion between partner sellers and Microsoft sellers.
What additional responsibilities does it bring with this partnership?
Dan: Partners get recognized for inclusion in programs such as this through a company-wide understanding of (and alignment to) Microsoft's go-to-market priorities. As one such partner, Archive360 incurs no additional responsibilities here, other than having brought to market a qualifying solution that is also listed in Microsoft's internal OCP Catalog.
With so many partners on board, does Microsoft apply any performance measurement approach for each of their partners?
Dan: Many Microsoft programs do have performance criteria, but this is not one of them due to the complex nature of GDPR compliance and the myriad ways and means for customers to meet their obligations. (As an aside on the number of partners in this program: The list is actually quite small and exclusive relative to the sheer size of Microsoft’s global partner network and relative to the number of companies around the world that are or might be affected by GDPR.)
Recently I had a discussion with Sejal R. Dattani, Marketing Analyst, Zoho, about custom apps and their impact on business. Are custom apps a costly affair for organizations? According to her, building custom apps for your business is a one-time investment. Below are her views and comments on the subject.
Is your organization using the right tools? For years, companies have provided their employees with packaged software that is proven and widely used. But business has begun to change. Today, custom apps are affordable and easy to build, and more businesses are making their own apps to run their daily activities. Here is why we expect more businesses to adopt custom applications in the next few years.
When you depend on the same packaged software as your competitor, it becomes difficult to outrank them. To get an edge, you need to update your processes and implement changes frequently to offer better services.
Since custom apps have become easier to build, even people without a technical background can build software to manage data and automate their processes. And when you have applications that work exactly the way you want, your teams can react faster to customers' changing demands.
The time required to develop custom apps has dropped drastically, from months to weeks. For example, with a cloud-based DIY platform like Zoho Creator, you can launch your apps without installing new software or configuring servers. And if you need expert advice, you can always get in touch with certified developers to help you out. What's more, when you create an application on Zoho Creator, you don't have to waste time and money rebuilding it for various operating systems: your app automatically works on mobile devices, allowing your team to access vital information and follow tasks at any time of day.
Custom Apps Are a One-Time Investment
Businesses nowadays realize that packaged software is rigid: it makes you change your business to fit it. To make things worse, packaged apps are often incompatible with your existing services, too. Custom apps, on the other hand, let you change them to fit your business, and they even integrate with your internal applications and other third-party services. Think of running a retail store, for example: you can integrate with a logistics service like FedEx and keep your customers informed of their order status.
The number of businesses switching to custom apps will accelerate in the coming years, and judging from the benefits, it's no surprise. Building a tailored solution focused on scalability and efficiency is an investment for life.
One of the hottest trends to emerge in the world of enterprise cloud computing is “multi-cloud data management,” which, in a nutshell, is keeping track of data assets that reside across multiple data centers and cloud services. As enterprises increasingly move IT operations to the cloud, ensuring the security, availability, and performance of their applications and data becomes increasingly challenging. Today, I speak with Tom Critser, co-founder and CEO of JetStream Software, about his company's launch and its cross-cloud data management platform.
Q. Please tell me about JetStream Software.
JetStream Software is a new company, but we have a software engineering team that has been together since 2010, and we’ve invested more than 200 developer-years in our core technology. This April, we announced the JetStream Cross-Cloud Data Management Platform. Our mission is to give cloud service providers (CSPs) and Fortune 500 enterprise cloud architects a better way to support workload migration, resource elasticity, and business continuity across multi-cloud and multi-data center infrastructures. Currently, our platform is designed to complement VMware cloud infrastructures including VMware Cloud Provider Partners (VCPPs) and VMware Cloud on AWS. We are headquartered in San Jose, California, with a second development center in Bangalore, India.
Q. How did the company get started?
Our three co-founders and much of the engineering team have been working together for a long time. Our first startup was FlashSoft Software, which developed software for storage IO acceleration. Our objective was to enable enterprise flash memory in a host server to handle IO operations for “hot data” and to deliver the performance of enterprise flash storage, but without replacing the existing storage of the enterprise. FlashSoft was acquired by SanDisk in 2012, and then SanDisk itself was acquired by Western Digital in 2016. At SanDisk, the team grew in size, and we collaborated closely with VMware to design the vSphere APIs for IO Filters framework, which is a key technology for our new company’s cross-cloud data management platform. After Western Digital acquired SanDisk, we worked with Western Digital to establish JetStream Software as an independent company.
Q. Who is your ideal customer, and what problems are you solving for them?
Our ideal customers are cloud service providers (CSPs) serving enterprise customers and, in much the same way, enterprise cloud architects. We address two key problems for these customers:
- The first is to take friction out of the enterprise’s migration of its on-premises virtual machines (VMs) and applications to the cloud. Enterprise cloud migration today is an expensive, hands-on operation, typically requiring a lot of professional services. There are new tools that help organizations plan their data migration and prepare configurations at the cloud destination, but getting huge volumes of enterprise data to the cloud with minimal disruption remains a challenge, and that’s the problem we target.
- The second problem we address is to help CSPs and private cloud operators deliver enterprise-grade resilience, availability, scalability, performance, and manageability, even across multiple data centers and services.
Q. You say the JetStream Cross-Cloud Data Management Platform provides “built for the cloud” data management capability. What exactly does this mean?
A lot of the technologies used in today’s cloud data centers were originally designed for a single-owner, on-premises operation. But CSP operations are different, so legacy enterprise data management tools aren’t always a perfect fit for the dynamics of cloud operations, such as efficiently managing resources across multi-tenant services, supporting dynamically changing workload demands, and providing mobility, agility, and recoverability across multi-site operations. Rather than trying to adapt legacy on-premises data management tools and methods to this strange new world, we built our platform from the ground up with these dynamics in mind.
Q. Tell us more about the newest product on the platform, JetStream Migrate. What makes it unique?
JetStream Migrate is a software product that enables the live migration of virtual machines to a cloud destination. That means that the VMs and their applications continue to run on-premises while their data is being moved to the cloud destination. JetStream Migrate is the first data replication solution for cloud migration to run as an IO filter in VMware vSphere. This design gives the solution some unique capabilities:
- It supports live migration of applications, even when their data is moved to the cloud via a physical data transport device.
- It enables live migration without snapshots, which is much better for application performance.
- It’s fault tolerant, so if interrupted, the data replication process resumes from the point of interruption.
- Because of the IO filter-based design, it’s a lightweight application that runs seamlessly within a VMware-based data center.
- It gives the administrator powerful capabilities, including the ability to accurately estimate the time required for data replication and the ability to automate many tasks.
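The interview doesn't describe JetStream Migrate's internals, but the fault-tolerance bullet above (replication resumes from the point of interruption rather than restarting) is a well-known pattern worth illustrating. The following is a minimal, hypothetical Python sketch of checkpointed chunk replication, not JetStream's actual implementation: progress is committed to a checkpoint file only after each chunk is durably written, so a restarted transfer picks up at the last committed offset.

```python
import os

CHUNK_SIZE = 4096  # bytes copied per iteration (illustrative value)

def replicate(src, dst, ckpt):
    """Copy src to dst in chunks, recording progress in a checkpoint file.

    If the process is interrupted, a later call resumes from the last
    committed offset instead of restarting the whole transfer.
    """
    # Recover the last committed offset, if a checkpoint exists.
    offset = 0
    if os.path.exists(ckpt):
        with open(ckpt) as f:
            offset = int(f.read() or 0)

    # Ensure the destination exists, then open it for in-place writes.
    open(dst, "ab").close()
    with open(src, "rb") as s, open(dst, "r+b") as d:
        s.seek(offset)
        d.seek(offset)
        d.truncate(offset)  # discard any partially written, uncommitted tail
        while True:
            chunk = s.read(CHUNK_SIZE)
            if not chunk:
                break
            d.write(chunk)
            d.flush()
            os.fsync(d.fileno())  # make the chunk durable before committing
            offset += len(chunk)
            # Commit progress only after the chunk is safely on disk.
            with open(ckpt, "w") as f:
                f.write(str(offset))
    return offset
```

A real cross-cloud implementation would of course replicate over a network and track changed blocks rather than file offsets, but the commit-after-write ordering is the essence of resuming safely from an interruption.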
Q. There are many solutions for cloud migration, so when would an organization choose JetStream Migrate over other options?
It’s important to note that JetStream Migrate is specifically focused on ensuring reliable data replication for live migration. It will typically be used in conjunction with other cross-cloud tools, such as VMware vRealize, vCenter and NSX. The technologies are complementary, and they each play an important role in a cloud migration project.
With respect to data replication specifically, the unique design of JetStream Migrate makes it especially useful when:
- Live migration is required, but data will be transported to the cloud on a physical device.
- The data migration network has insufficient bandwidth or inconsistent latency.
- The cloud destination is based on vSphere, but the CSP is not running the entire VMware Cloud stack.
- A lightweight deployment is preferred at the source, on the network, or at the destination.
- Network reliability concerns and high data ingest fees make a fault-tolerant replication process preferable.
Q. You’re launching with an impressive list of partners. Can you tell us how you’re working with these partners and what about JetStream Software caught their attention?
Our team has been engaged with VMware for a long time. We previously worked as VMware’s design partner for the development of the APIs for IO Filters framework, so we have been working with these APIs to integrate our products with VMware vSphere for years. Through our partnership with VMware, we’re now also collaborating with Amazon to support the migration of VMs to the VMware Cloud on AWS.
Because of our history partnering with Dell, EMC, Cisco, IBM and HPE, we’ve also resumed and further developed our partnerships with those vendors, starting with the JetStream Accelerate product, which was familiar to our partners. And because all of these vendors are rapidly developing cloud solution portfolios, we’re discussing our new solutions with them as well.
Q. JetStream Software appears to have deep technology credentials. What is your history working with enterprises and cloud service providers?
One of the unique advantages of our particular “startup” is that we’re launching with a full development operation and a technology foundation representing over 200 developer-years of effort invested. We developed and supported a software solution that was deployed at thousands of data centers, both large and small. In doing so, we had a front-row seat to the transition from enterprises operating all-on-premises to the cloud, hybrid cloud, and cross-cloud operations.
Q. You’ve just officially launched the company and the platform with the newest product on the platform. What can we expect later this year?
Our JetStream Cross-Cloud Data Management Platform is maturing rapidly. The focus of our first product has been to remove the friction from cloud migration. Our releases in the second half of the year will bring similar advantages to cloud disaster recovery (DR) and cloud-based disaster recovery as a service (DRaaS).
About Tom Critser, Co-Founder and CEO, JetStream Software
Tom Critser has more than 20 years of experience growing and launching software companies. Previously, Tom was GM of Cloud Data Center Solutions at SanDisk. Tom was a member of the founding team of FlashSoft Software, which was acquired by SanDisk in 2012. Prior to FlashSoft, Tom was VP of Worldwide Sales and Business Development at RNA Networks, the memory virtualization software company, which was acquired by Dell. Prior to RNA Networks, Tom was VP of Worldwide Sales and Business Development at Infravio, the SOA management software company, which was acquired by Software AG. Tom graduated from Oregon State University with a BS in International Business, where he was a Pac-10 All-Academic Team member. For more information, please visit www.jetstreamsoft.com, @JetStreamSoft and www.linkedin.com/company/jetstream-software-inc/.