Quality Assurance and Project Management


June 30, 2018  10:55 PM

Serverless Monitoring Startups In Experimenting Mode

Profile: Jaideep Khanduja
Serverless computing, Startups

If enterprises could find a suitable, stable, and reliable monitoring tool for their production applications, they would not mind shifting to serverless architecture at a faster pace; adoption would become easier and quicker. The biggest challenge in serverless environments is visibility, and that makes monitoring critical. A number of vendors offer serverless monitoring services and capabilities, including SignalFx, Datadog, and New Relic. It was AWS Lambda that popularized the concept of serverless architecture, a new but interesting idea offering function as a service (FaaS). Serverless means the organization doesn’t need to provision servers. That doesn’t mean servers are out of the picture; they are very much there, but the organization doesn’t have to manage them. This is quite interesting, isn’t it? Then who handles server management? Who ensures scaling at the right juncture?

Serverless architecture involves a metering mechanism that charges users on the basis of certain parameters, such as code execution time and the number of times a function is triggered. That makes serverless monitoring quite interesting. Is it costly? Let’s see. Many organizations are already moving from on-site data centers to serverless architecture, which spares them from worrying about containers or even virtual machines. While AWS was the pioneer of serverless technology, there are other players now, such as Google Cloud Platform and Microsoft Azure. The serverless model comes with certain benefits, including better code quality, higher developer productivity, cost savings, and scalability, to name a few. One of the biggest complaints from this technology’s users is a lack of visibility into the underlying environment. A serverless environment demands a different monitoring mechanism; conventional APM (application performance monitoring) and IM (infrastructure monitoring) systems don’t serve the purpose.
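
Because billing in a FaaS model is driven by parameters like invocation count and execution time, a rough cost model is easy to sketch. The snippet below is a minimal, illustrative estimate in Python; the per-invocation and per-GB-second rates are assumed placeholders, not any vendor's actual price list.

```python
# Illustrative sketch of a FaaS metering model.
# The rates below are assumed placeholders, not an authoritative price list.

PRICE_PER_MILLION_INVOCATIONS = 0.20   # assumed rate, USD
PRICE_PER_GB_SECOND = 0.0000166667     # assumed rate, USD

def estimate_monthly_cost(invocations, avg_duration_ms, memory_mb):
    """Estimate a month's FaaS bill from invocation count,
    average execution time, and allocated memory."""
    gb_seconds = invocations * (avg_duration_ms / 1000.0) * (memory_mb / 1024.0)
    request_cost = (invocations / 1_000_000) * PRICE_PER_MILLION_INVOCATIONS
    compute_cost = gb_seconds * PRICE_PER_GB_SECOND
    return request_cost + compute_cost

# Example: 5M invocations per month, 300 ms average, 512 MB functions
print(f"${estimate_monthly_cost(5_000_000, 300, 512):.2f}")
```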

Serverless Monitoring Doesn’t Gel With Traditional Systems

In a nutshell, serverless computing is still at a nascent stage, and vendors are experimenting with ways to reduce monitoring overhead substantially. Some more startups in this field are IOpipe in Seattle, Dashbird in Estonia/San Francisco, OpenGenie (Thundra) in Falls Church, Epsagon in Tel Aviv, and Stackery in Portland. It will be interesting to watch the next moves of larger players like Amazon in the field of serverless monitoring.

June 30, 2018  9:17 PM

Is Public Cloud Gaining Momentum Among Enterprises?

Profile: Jaideep Khanduja
Cloud migration, DataCenter, Public Cloud

What are enterprise datacenter preferences worldwide? Is public cloud gaining momentum among enterprises? A number of studies and statistics say the shift is happening, but at a slower pace. While centralized datacenters and core business apps still remain within organizational boundaries, local datacenters are shrinking in number. One of the key reasons could be a shift toward hosted solutions. Obviously, embracing public cloud cuts down your investments, especially capital investments. Organizations are moving to public cloud platforms rather than investing in IT infrastructure. Any addition to existing infrastructure not only eats a major chunk of the annual budget but also increases recurring expenses for upkeep and maintenance. Organizations prefer to reduce these costs and enhance their operational performance. Of course, the amount of effort largely depends on certain factors.

Public Cloud Gaining Momentum

Photo on Visualhunt

For larger organizations, migrating to the public cloud is more challenging; smaller organizations can migrate their workloads more easily. Basically, it depends on the volume of data and the complexity of databases and applications. Simple applications and databases are easier to move. Despite such hurdles, it is interesting to see public cloud gaining momentum among larger enterprises, and interesting for organizations to study how this shift impacts their IT environment usage and workloads. Existing IT infrastructure and assets become a worry point for organizations when deciding to move to the public cloud. The growing number of cloud service providers clearly reflects the mood and the trend. In many small organizations, server rooms and local datacenters have vanished altogether. Noticing this trend and its success, even larger organizations are now thinking of moving more workloads to the public cloud.

Public Cloud Gaining Momentum But Slowly

Traditional on-premise deployments are decreasing as public cloud gains momentum. Colocation is also becoming a favorite choice, especially for mid-sized organizations, as it lets them consolidate their infrastructure and datacenters. Overall, in-house IT footprints are shrinking across organizations of all sizes and geographies.


June 28, 2018  5:11 PM

Machine Learning Use Cases for Enterprises

Profile: Jaideep Khanduja
CRM, Data Analytics, Deep learning, Enterprise, HCM, Information security, iot, Machine learning, Marketing automation, Scm, Supply Chain analysis, use cases

This post continues my previous two posts. The agenda of this series is to highlight how enterprises can leverage machine learning in various segments and thus enhance their business decisions. In the first post, we discussed How Machine Learning Transforms Customer Experience in CRM. In the next post, we talked about How To Use Machine Learning In Supply Chain Analytics. In this post, we will discuss a few more important use cases that apply to most enterprises. So let us start with a few more machine learning use cases for enterprises. The next use case that comes to my mind is data analytics; in fact, I think it is the first use case that originated as soon as machine learning came into existence. The good point here is that machine learning can easily handle unstructured data, making analytics more meaningful with wider coverage of relevant data.

Machine Learning Use Cases For Enterprises

Photo credit: b_d_solis on Visualhunt / CC BY

When we talk about machine learning use cases for enterprises, analytics becomes the foremost priority. The reason is coverage of wider datasets and the capability to build predictive models while embracing unstructured data. It can, in fact, lead to prescriptive analytics. The real beauty is letting it be used by people who are not data scientists. That puts the real power into the hands of business people who need to take timely, business-critical decisions. The next use case we can discuss here is HCM, i.e. Human Capital Management. Machine learning is already impacting, or rather empowering, HR specialists with recruitment, development, training, growth, measurement, and retention of employees. There has been a radical shift in recruitment in terms of the way job-finding sites function and the way recruiters and organizations identify the most suitable candidates.

Machine Learning Use Cases for Enterprises Are Helping Them In A Big Way

The next class of machine learning use cases for enterprises is information security. Through the application of analytics, machine learning is enhancing information security across detection, alerting, correction, and so on. With the increasing number of end users, especially in large organizations, it is impossible for the IT department to review even security logs manually. That is where this technology comes in handy. Machine learning helps a great deal in understanding user behavior, identifying risks and vulnerabilities, mitigating risks without manual intervention, and proactively acting against external threats.
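
As a minimal illustration of the user-behavior angle, the sketch below flags anomalous activity derived from logs with an isolation forest. The feature set, numbers, and thresholds are hypothetical; a real deployment would rely on far richer telemetry.

```python
# Minimal sketch: flagging anomalous user behavior from log-derived features.
# Feature names and values are hypothetical and purely illustrative.
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row: [logins_per_day, after_hours_ratio, failed_logins, MB_downloaded]
normal_activity = np.random.RandomState(42).normal(
    loc=[20, 0.05, 1, 200], scale=[5, 0.02, 1, 50], size=(500, 4))

model = IsolationForest(contamination=0.01, random_state=42)
model.fit(normal_activity)

suspicious = np.array([[90, 0.8, 15, 5000]])   # an unusual access pattern
print(model.predict(suspicious))               # -1 means anomaly
```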


June 27, 2018  9:03 PM

How To Use Machine Learning In Supply Chain Analytics

Profile: Jaideep Khanduja
Machine learning, Supply chain, Supply Chain analysis, Supply Chain Management

When we think about using machine learning in supply chain analytics, a lot of ideas come to mind. But there is an important thing to keep in mind before heading toward any conclusion. As we all know, some components of supply chain management still operate in analog terms; there are certain things you would still perform with pens and clipboards. But there is a brighter side to it: other parts, like autonomous trucks, drones, analytics, and driverless cars, use the latest technology. Let us keep our focus on the analytics part for now. Some of the prominent demands in supply chain management are common to the B2C and B2B segments, such as same-day delivery, whether of a service or a product. The world is becoming more demanding day by day, and the boundaries between B2B and B2C are fading.

Supply Chain Analytics

Photo credit: CIMMYT on VisualHunt / CC BY-NC-SA

Every business and customer expects 24×7 availability, which means businesses need to be ‘always on’. Customers prefer personalized information, and it is already on the verge of becoming on-demand and real-time. Amazing outcomes are possible when using machine learning in supply chain analytics. It works especially well for organizations having trouble handling scale, for example when managing a huge number of stocking locations. Forecasting is another area where businesses require a high level of accuracy and where the information should come in real time. Gone are the days of weekly or monthly forecasts. It is now daily or intra-day forecasting that is in demand, looking at the changing business scenarios. To keep getting data in real time, point-of-sale (POS) systems need to be in place and well integrated with the centralized system. Security is a big concern.
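
To make the forecasting point concrete, here is a hedged sketch that builds a daily demand forecast from point-of-sale history using simple lag features. The data, column names, and choice of model are purely illustrative.

```python
# Hedged sketch: daily demand forecasting from point-of-sale history.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

sales = pd.DataFrame({"units_sold": [120, 135, 128, 150, 160, 155, 170,
                                     165, 180, 175, 190, 200, 195, 210]})
# Simple lag features: what sold over the previous three days
for lag in (1, 2, 3):
    sales[f"lag_{lag}"] = sales["units_sold"].shift(lag)
sales = sales.dropna()

X, y = sales[["lag_1", "lag_2", "lag_3"]], sales["units_sold"]
model = GradientBoostingRegressor().fit(X, y)

# Forecast the next day from the three most recent observations
latest = pd.DataFrame([[210, 195, 200]], columns=["lag_1", "lag_2", "lag_3"])
print(round(float(model.predict(latest)[0])))
```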

Machine Learning in Supply Chain Analytics Can Create Wonders

There needs to be a mechanism for identifying and alerting against fraud and theft. Another important area is anticipating unusual events in advance, which requires integrating with big data sources like weather systems. None of this is possible without cashing in on the benefits of the latest technologies. You need to use machine learning in supply chain analytics to achieve all these goals. One of the best business examples in this regard is UPS’s ORION (On-Road Integrated Optimization and Navigation) system, which has been working well for more than a decade now. It helps UPS drivers find the best possible route with the aid of GPS systems. There are many other things that only machine learning can handle, such as dynamic pricing, online customer handling on social media, fraud detection, and defect detection. And that is not all; these are just a few of the pointers.

In today’s environment, to take full advantage of machine learning in supply chain analytics, it is important to use technologies and tools like image recognition, social media analytics, and video analytics, and to integrate with relevant information aggregators. Ultimately the goal is to gain an advantage with the help of technology and take your business to the next level of competition.


June 26, 2018  5:52 PM

How Machine Learning Transforms Customer Experience in CRM?

Profile: Jaideep Khanduja
CRM, Customer experience management, Machine learning

Every success in any business has one factor behind it: customer experience. Let us see how it plays out in CRM. The foremost goal of any CRM is to provide a 360-degree view of the customer and thus create a great customer experience. That is the one thing the whole CRM vertical is striving for, on the customer side as well as the vendor side. The 360-degree customer view was lacking until the early 2000s. It was not possible, or even thought of, because the primary focus at that time was on transactional data residing in various formats in databases.

At that time CRM touched only structured data. All unstructured data was being ignored or treated as useless, even though this unstructured data held most of the valuable customer interactions, such as communications, phone calls, emails, and social media posts (though social media posts were far fewer back then). The result was, more or less, a partial analysis of customer experience in CRM.

Customer Experience in CRM

Photo credit: Marc_Smith on VisualHunt.com / CC BY

Discarding all this data could result in only a partial picture of customer experience in CRM, because such data was not analyzed at all owing to the limitations of the technology in use at the time. Machine learning is now able to give a major thrust to the 360-degree customer view because of its ability to analyze huge volumes of disparate data arriving from various sources; it does not matter whether the data is structured or not. With this evolution in experience, experts have been able to define four prominent stages in customer analytics: Acquire, Serve, Nurture, and Grow. Let’s see how machine learning plays a major role in each of these stages. In the ‘Acquire’ stage, machine learning based use cases include micro-segmenting prospects, thereby improving the level of accuracy.
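
As a small illustration of the ‘Acquire’ stage, the sketch below micro-segments prospects with k-means so that campaigns can be tuned per segment. The features and values are hypothetical.

```python
# Illustrative sketch: micro-segmenting prospects with k-means clustering.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Each row: [annual_spend_potential, web_visits_per_month, email_open_rate]
prospects = np.array([
    [5000, 2, 0.10], [52000, 18, 0.45], [4800, 3, 0.12],
    [61000, 25, 0.50], [7000, 1, 0.08], [58000, 20, 0.42],
])

X = StandardScaler().fit_transform(prospects)
segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(segments)   # micro-segment label for each prospect
```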

Customer Experience in CRM Is A Constant Evolving Journey

Similarly, during the ‘Serve’ stage, machine learning can create an intelligent chatbot or virtual assistant for customer self-service. That alone simplifies many complex processes and makes things simpler for customers, who find a lot of value in it. Optimizing average hold time, handling standard requests without the involvement of a customer service representative, and delivering faster are some of the gains for the customer. Machine learning also helps a lot in the ‘Nurture’ stage, transforming customer experience in CRM in a big way. It manages customer interactions in such a way that the annoyance factor goes away and the satisfaction level goes up; as a matter of fact, it removes customer friction points. Finally, in the ‘Grow’ stage, machine learning optimizes and customizes by providing the best suitable offers, enhancing conversion rate and profitability.


June 25, 2018  11:31 PM

Do Enterprises Fear The Public Cloud?

Profile: Jaideep Khanduja
IaaS, Private Cloud, Public Cloud, SaaS

Nearly 50% of businesses currently use IaaS (Infrastructure as a Service), and around 12-15% growth is estimated for the next 12 months. Doesn’t that indicate a majority of businesses will have moved to public clouds within a year? Do enterprises fear the public cloud? Or do they have valid reasons and a high degree of clarity behind resisting it? Are they really resisting it? Cloud services include SaaS, IaaS, and private cloud, and most organizations are using them in one form or another, at least for one business application catering to at least one critical function. A study also says the majority of investment in hosted infrastructure is going to public clouds (IaaS). Still, the public cloud is far from universal acceptance. What could be the reason? There is a hidden tug-of-war going on between vendors and enterprises.

There are, in fact, different kinds of scenarios. More than 40% of organizations are not using the public cloud, nor do they intend to in the near future. On the other hand, there are organizations that intend to adopt it but are slow to adapt to the cloud. The third segment consists of organizations that rely completely on the private cloud, whether hosted or in-house. Cost and security are the two key concerns businesses have in mind when it comes to adoption of public clouds. It also demands a huge transformation for which they do not seem mentally ready, perhaps because of the attributes of their organizations or their IT setups. Still, some prominent patterns have emerged. For instance, there is a straight connection between the size of an organization and its IaaS adoption.

Public Cloud adoption is far from expectations

On the other hand, this trait reverses when it comes to the age of the business. Companies less than five years old have the highest rates of IaaS adoption, while older businesses are late and slow adopters. All of this could explain public cloud resistance from various perspectives.


June 24, 2018  11:13 PM

Are You Using Data Storytelling In Your Organization?

Profile: Jaideep Khanduja
ai, Artificial intelligence, Data storytelling, data visualization, Visualization

Smart visualization is not the only way AI is transforming BI in a big way; there are other ways too. Before coming to those, let us discuss smart visualization a little more. As we understood from my previous post, it helps eliminate the gap between experts and non-experts. That means it helps in getting better business results by actually involving and engaging business experts who are not too tech savvy and don’t use any query languages. They therefore no longer need tech assistants in boardrooms and other top-level meetings to run the critical analytics that helps them make crucial business decisions in time. In effect, machine learning suggests the right graphic for the right query, making AI much easier to use. Another important tool is embedding AI into data storytelling, which happens with the help of NLG technology.

AI is able to make data storytelling a powerful tool through the integration of NLG technology. NLG makes storytelling more narrative-driven: it tells narratives using the same data that goes into visualizations and business dashboards. NLG, as a matter of fact, generates words and sentences from data using NLP. It is quite interesting to understand how this happens. There are a number of recent business case studies where NLG has been integrated into dashboards to enhance data storytelling. This provides critical business insights that are not easily grasped from numbers and graphics alone. It uses sentences in natural language, making the output more meaningful by providing additional context and understanding. That altogether gives a different meaning to visualizations, reports, and metrics in dashboards. The whole purpose is to make them easier to comprehend.
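
A toy, template-based example gives a feel for what NLG over dashboard data means in practice. Real NLG engines are far more sophisticated; the metric names below are invented.

```python
# Very small sketch of template-based narrative generation for a dashboard.
# Field names are made up; production NLG is far richer than this.
metrics = {"region": "APAC", "revenue": 1.42e6, "revenue_prev": 1.18e6}

change = (metrics["revenue"] - metrics["revenue_prev"]) / metrics["revenue_prev"]
direction = "grew" if change >= 0 else "declined"

narrative = (f"Revenue in {metrics['region']} {direction} "
             f"{abs(change):.1%} over the previous period, "
             f"reaching ${metrics['revenue']:,.0f}.")
print(narrative)
# Revenue in APAC grew 20.3% over the previous period, reaching $1,420,000.
```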

Data Storytelling Is Evolving At A Faster Pace

That is one of the reasons for the fast growth of NLG integration into dashboards. This integration not only makes data storytelling easier but also makes it easier to query the data, with the help of NLP algorithms, in order to tell the story in a comprehensive and impressive manner.


June 24, 2018  10:39 PM

AI Transforming Business Intelligence In A Big Way

Profile: Jaideep Khanduja
Artificial intelligence, Business Intelligence, data visualization, Machine learning, NLP

AI is transforming business intelligence in a big way. The era of pilots and POCs is over; it is action time now, and things are happening in production. As we all know, Artificial Intelligence, i.e. AI, is a combination of a number of technologies like Machine Learning (ML), Natural Language Processing (NLP), and Natural Language Generation (NLG). This integration of AI with BI is creating wonders. It makes analytics crisper and more user-friendly, which will lead to BI becoming accessible to the masses (or the non-experts), thus speeding up its adoption. Organizations are becoming data-driven, which helps greatly in decision-making. The real catch is to make BI so friendly for business users who are not data analytics or BI professionals that they get real benefit out of it.

There is something called smart visualization: visualizing data smartly. It works with the help of visual query technology, which supports visual analysis by graphically answering BI queries in charts, graphs, and other visual forms. That, in turn, makes analytics faster to perform and easier to operate. It definitely removes a hurdle to adoption by removing the requirement to write queries in code. That was one of the reasons BI, despite being a powerful tool, was out of reach of users with no SQL or other programming skills. Introducing machine learning into smart visualization is emerging as a smart way to apply AI to BI, removing the wide knowledge gap between experts and non-experts. This is how AI is transforming business intelligence in a big way.
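
A minimal sketch of the visual-query idea: given a result set, pick a sensible chart automatically instead of asking the user to write code. The selection rule here is deliberately simplistic and only illustrative.

```python
# Hedged sketch of "visual query": choose a chart type from the data itself.
import pandas as pd
import matplotlib.pyplot as plt

def smart_plot(df, x, y):
    """Pick a chart type from the x column's characteristics and draw it."""
    if pd.api.types.is_datetime64_any_dtype(df[x]):
        df.plot(x=x, y=y, kind="line")     # time series -> line chart
    elif df[x].nunique() <= 12:
        df.plot(x=x, y=y, kind="bar")      # few categories -> bar chart
    else:
        df.plot(x=x, y=y, kind="scatter")  # otherwise -> scatter plot
    plt.title(f"{y} by {x}")
    plt.show()

sales = pd.DataFrame({"region": ["North", "South", "East", "West"],
                      "revenue": [120, 95, 140, 80]})
smart_plot(sales, "region", "revenue")     # renders a bar chart
```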

AI is Transforming Business Intelligence through Smart Visualization

Smart visualization is enabling users to create powerful dashboards with impressive infographics, and those users do not necessarily need deep data analytics skills. That is a wonderful way for AI to transform business intelligence and make it accessible easily and effectively.


June 24, 2018  9:56 PM

On-Premise vs Off-Premise – Where is Your Enterprise?

Profile: Jaideep Khanduja
On-premise to Cloud, On-premises

With the changing trends in converged infrastructure and servers, it is interesting to see the shuffle taking place across the globe in terms of on-premise vs off-premise. IT infrastructure is transforming in a big way, and so are the perceptions of today’s IT managers. For many organizations, the decision between on-prem and off-prem is far from clear. On the other hand, many are changing their IT strategies to make more room for off-prem and reap the benefits of it. Organizations still face trouble moving toward complete orchestration and automation, whether due to a lack of in-house knowledge, a lack of business direction, or difficulty finding the right vendor to achieve it. And this is happening even at organizations with the most skilled IT teams and no shortage of funds.

One thing is true about the on-premise vs off-premise decision: even a small move in favor of the latter promises tangible infrastructure-provisioning gains. The role of servers and converged infrastructure is changing drastically, driven mainly by hyper-convergence, workload balancing, and containerization. Basically, it is all about right-sizing for the future. Many organizations are sure that their existing infrastructure is more than enough to cater to their future needs and will only require a little expansion and tweaking, with no major replacements. On the other hand, there are very large enterprises worried about a wide gap between what they have currently and what they will need. Hyperconverged infrastructure, which earlier had only a minor role to play, is becoming a core need for organizations, and this is creating major changes in data centers.

Things are Clearer for On-Premise vs Off-Premise

Similarly, a large number of organizations are using or at least talking about containers. They believe that container technology has an inherent ability to speed up application provisioning. These are my thoughts on the current trends of on-premise vs off-premise.


June 16, 2018  5:59 PM

Endpoint Security Means Complete Lifecycle Security On A Single Platform

Profile: Jaideep Khanduja
Azure, Endpoint security, Microsoft

This is my concluding post on Top Security Concerns For 2018-2019, a three-post series. As we have seen, encryption, SOC, and mobility remain the top concerns in this regard. In my last post, we talked about the increasing mobility of an organization’s employees. That automatically creates a new demand: tackling staff mobility between various network environments. Users need to access a number of applications and services in a heterogeneous environment, some on-premise and others residing in the cloud. With a tremendous increase in encrypted tunnels, it is becoming difficult to manage the whole ecosystem. Thus, the new equation is to have complete visibility and control of the endpoints. Strategies and priorities are changing shape at a fast pace. Endpoint security functionality is, in fact, a vast area of work.

As a matter of fact, endpoint security begins with telemetry collection for the purpose of analysis and goes all the way up to complete lifecycle security, inspection, detection, and real-time response. The toughest task is to integrate all of this on a single platform. This brings in a new layer of vendors with a complete focus on endpoint security, including CrowdStrike, SentinelOne, Carbon Black, ESET, Endgame, Cybereason, and Cylance. IoT security and connected devices are the next big thing when we talk about major security concerns. More sockets, more endpoints, more devices, and more code automatically bring more scope for vulnerabilities and threats to any enterprise. In this context, at the recently concluded RSA Conference in San Francisco, Microsoft launched Azure Sphere, a new security platform that focuses on protecting all kinds of embedded devices in a smart manner. This is the need of the hour.

The scope of Endpoint Security Has Increased Tremendously

Azure Sphere is a combination of hardware and software. It consists of secure microcontrollers providing a hardware-based root of trust to ensure a secure boot, and it also includes cryptographic authentication and complete protection of device communications. Surprisingly, this is the first non-Windows OS from Microsoft; the OS is built on a custom Linux kernel, for two reasons: speed and security. Another surprising move by Microsoft is enabling Azure Sphere to run on any cloud, not limiting it to just Azure. Azure Sphere aims to secure the whole IoT ecosystem, right from the component level to the cloud. There will be more to see from other vendors soon, but all vendors realize the need for endpoint security and the other security concerns we discussed in these three posts.


June 16, 2018  5:19 PM

Top Security Concerns Include Encryption, SOC, and Mobility

Profile: Jaideep Khanduja
cybersecurity, Encryption, Mobility, SoC

This post continues my previous post on Top Security Concerns for 2018-2019. The key security concerns include encryption, SOC, and mobility. In the new concept of ‘encryption-in-use’, vendors use a number of techniques to tackle the encryption issue, including homomorphic encryption, secure multi-party computation, and secure enclaves. With the help of these techniques, vendors allow access to data for various purposes without the need for decryption. These same technologies are now used in cryptographic key management, enabling the security of hardware-based key management with the help of software, which provides a higher level of flexibility and adaptability at a lower cost. The same kind of transformation is taking place in the SOC. There is a severe need for the SOC of the future, and there are distinct guidelines for it.
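
As a toy illustration of ‘encryption-in-use’, the sketch below sums values under additively homomorphic encryption using the open-source python-paillier package (assumed installed as `phe`). It shows the concept only and does not represent how any of the vendors mentioned here implement it.

```python
# Toy illustration of 'encryption-in-use' with additively homomorphic
# encryption (python-paillier). A concept sketch, not a vendor implementation.
from phe import paillier  # pip install phe

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

salaries = [52000, 61000, 47000]
encrypted = [public_key.encrypt(s) for s in salaries]

# An untrusted party can add the ciphertexts without ever decrypting them
encrypted_total = encrypted[0] + encrypted[1] + encrypted[2]

# Only the key holder can see the result
print(private_key.decrypt(encrypted_total))   # 160000
```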

Encryption, SOC, and Mobility

Photo credit: brewbooks on Visual Hunt / CC BY-SA

The traditional SOC works on the SIEM model: it stores event logs and alerts. These event logs and alerts from the traditional SOC feed analytics engines, guide investigation teams, drive SAO processes, satisfy search requests, and interface with custom scripts. But that is not enough to tackle current situations and security risks. The new methods include using traces in network communications to identify attacks as they happen in real time. The focus now is more on incident detection, exception reporting, and response activity. This new array of vendors includes FireEye, Awake Security, Palo Alto Networks, ExtraHop, Gigamon, Darktrace, Corelight, and Vectra Networks. These newer technology vendors are becoming an integral part of SIEM deployments. Next comes endpoint security, which was one of the most discussed topics at the RSA Conference.

Encryption, SOC, and Mobility remain on top of the security concerns

As we all know, employee mobility is an increasing trend worldwide, and it is a point of concern from a security point of view. Almost 60% of employees in any organization demand mobility between network environments, because they need to access a number of on-premise and cloud services through one secured or encrypted tunnel or another. As a matter of fact, all of this is becoming difficult to manage and inspect. As we see, encryption, SOC, and mobility are changing the whole concept of security.

Finally, we shall be concluding Top Security Concerns For 2018-2019 in the next post.



June 16, 2018  10:48 AM

Top Security Concerns For 2018-2019

Profile: Jaideep Khanduja
Cyber security, Encryption, enterprise cyber security, HSM, RSA, RSA Conference

There were more than 600 exhibitors at this year’s RSA Conference in San Francisco in April, with attendance touching almost 50,000. Some very prominent cybersecurity concerns emerged that will keep IT managers on their toes for the next couple of years, at least. Those in Asia Pacific and Japan can register for the upcoming RSA Conference 2018 Asia Pacific & Japan at Marina Bay Sands, Singapore, from 25 to 27 July 2018, where more than 65 speakers will enlighten attendees on different aspects of cybersecurity, including the top security concerns for 2018-2019. The zero-trust philosophy is strengthening its roots in this field. It is about discarding orthodox concepts like the legacy security model, which says trust anybody inside the premises and distrust everybody outside your perimeter. That whole architecture is faulty and risk-prone.

Top Security Concerns for 2018

Photo credit: U.S. GAO on VisualHunt.com

Hence, the new concept is to trust no one. That is probably the right approach to tackle the top security concerns for 2018-2019. Zero trust is based on a new framework that we call a reference framework or reference architecture, and it is independent of the technology in place. There is no logic in granting access to resources like servers, applications, networks, and devices to everyone inside the perimeter; rather, enterprises need to change their perception of security policies. The zero-trust concept covers all kinds of vendors, including MFA, IDaaS, network security, SD-WAN, and CDN service providers. Encryption is also becoming a challenge for security experts. You can keep the whole path carrying the encrypted data secure, but what about security at the point of decryption? That itself is highly vulnerable to attack from inside as well as outside.

Top Security Concerns For 2018 Calls To Trust No One

The points where encryption becomes decryption and decryption becomes encryption are open to attackers possessing compromised credentials, as well as to attackers with malicious intentions sitting inside. To counter this risk there is a new concept, ‘encryption-in-use’, along with virtual HSMs (VHSMs). A VHSM is a software suite that stores secret data outside the virtualized application environment. The key vendors for this new technology include Baffle, Enveil, Fortanix, Unbound, Inpher, and PreVeil.

We shall continue with Top Security Concerns For 2018-2019 in my next post.



May 31, 2018  4:22 PM

Machine Learning Use Cases in Enterprise World

Profile: Jaideep Khanduja
Enterprise, Machine learning, use cases

Machine learning has been in practice for quite some time now, and enterprises are adopting it fast to leverage its power. It definitely helps businesses excel and stay ahead of the competition. Machine learning, as we all know, has a tremendous power to automate and optimize any simple or complex business process. There are a number of machine learning use cases that we can pick from the enterprise world. Enterprises either are already working on them or have immediate plans to deploy them, and those who stay away will feel the brunt sooner or later. It is always better to identify a critical business issue and then work toward addressing it through machine learning. Machine learning deployment can enhance a business in many ways: it can create an artificial intelligence spectrum to help in critical business decisions, and it helps in the automation of business processes, analytics, and operations.

As a matter of fact, machine learning use cases can be derived from many areas of the business. Any business task that is repetitive and/or mundane in nature is one such prominent area; another is activities that involve a high amount of risk or danger. It also helps a lot in quality improvements and in tackling operational issues. Obviously, machine-learning software is there to help humans, not replace them in the job. That is why it is wise to automate most of your low-level tasks so that the human mind can concentrate on more complex ones. Finally, it is going to be a man-machine combination that manages any kind of business. There are three valuable components of the business that play a major role in machine learning developments: data (both input and output), the model, and the algorithm.

Machine Learning Use Cases Rely On Data, Algorithm, and Model

Data, in fact, is the most valuable asset of any business. And that is the backbone of all machine learning use cases. Data is the real driver of business and business decisions.


May 31, 2018  12:14 PM

What Is Generative Adversarial Networks (GAN)?

Profile: Jaideep Khanduja
adversarial, Network

A Generative Adversarial Network (GAN) is, in fact, never a single network. It is a set of networks, at least two, operating in the same place but working against each other, each bringing its own results. In the GAN approach, for instance, the first network creates realistic images while the second one judges whether they are real or not. It is as if the first network is synthesizing something and the second one is monitoring its operation and critiquing what it creates. As time passes, the second network’s feedback trains the first one to create synthetic images so convincingly that nobody can make out that they are fake.
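
A compact PyTorch sketch of this two-network setup is shown below: a generator learns to produce samples resembling ‘real’ data while a discriminator learns to tell real from fake. The toy 1-D data and network sizes are arbitrary choices for illustration.

```python
# Compact sketch of a GAN: generator vs discriminator on toy 1-D data.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))                # generator
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())  # discriminator
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 3.0          # "real" data: samples near 3.0
    noise = torch.randn(64, 8)
    fake = G(noise)

    # Train the discriminator: real -> 1, fake -> 0
    d_loss = loss_fn(D(real), torch.ones(64, 1)) + \
             loss_fn(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Train the generator: try to make the discriminator output 1 for fakes
    g_loss = loss_fn(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(G(torch.randn(5, 8)).detach().squeeze())  # samples should drift toward ~3
```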

Generative Adversarial Networks

Photo on Visual Hunt

That means the fake images the first network now produces are as good as the real ones; you won’t be able to distinguish the fakes. It becomes practically impossible to differentiate between the two, and that is the purpose of a Generative Adversarial Network. Now think about its applicability and usage. Which business or industry would need this kind of technology, and for what purpose? Let us think about a few use cases. As a matter of fact, there are plenty of use cases, and many are already in production and operation. The first could be creating fake but realistic healthcare data. The purpose of such records is to train various machine-learning models using different algorithms, and since you are not using real data, there is no infringement of patient privacy.

There are various use cases of Generative Adversarial Networks

In fact, the Generative Adversarial Network has become a classic approach in machine learning technology. Another use case you can think of is creating fake malware in order to test an anti-malware application. There are also plenty of operational projects in the field of fake news videos and fake images of celebrities and famous personalities. If you look at the approach a GAN follows, it matches unsupervised machine learning; but from another point of view, there is also a form of supervision in it. So would you call it an advanced version of unsupervised machine learning, or a mix of supervised and unsupervised machine learning?


May 30, 2018  10:55 PM

What is Reinforcement Learning And Its Relation With Machine Learning

Profile: Jaideep Khanduja
algorithmic, Machine learning, Reinforcement

Reinforcement learning is an important category of machine learning algorithms with a classic connection to theories of behavioral psychology. In reinforcement learning there is an environment that trains the algorithm while keeping a close watch on its performance, and on the basis of that performance it rewards or punishes. Before going further, let us go a few posts back and understand the background of machine learning better. In one of my previous posts, we saw how important it is in the current cutting-edge environment for enterprises to adopt machine learning. On one hand, businesses face tougher conditions; on the other, technology offers a lot of scope to enhance and excel against tough competition. This is an era where every disruption is an opportunity.

We also learned in a previous post about the straight-line connection between deep learning, machine learning, and AI (artificial intelligence). Along with that, we learned about supervised and unsupervised machine learning, the difference between training and inference, and the relationship between datasets, algorithms, inputs, and outputs. Hence, before going further, it is worth reading those two posts and the previous one on unsupervised machine learning and its examples. Now, coming back to reinforcement learning: as we saw, it works on the philosophy of rewards and punishments. During each step of the training process, the learning algorithm takes an observation and selects an action from a pool of possible actions, receiving a reward (or punishment) in return. The model then keeps accumulating positive or negative feedback by running the same process repeatedly, and all of this happens in a very dynamic environment.
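
A hedged sketch of this reward-and-punishment loop is tabular Q-learning on a tiny, invented ‘corridor’ environment where only the rightmost state pays off. The hyperparameters are arbitrary.

```python
# Hedged sketch: tabular Q-learning on a 5-state corridor; reach the right end.
import random

N_STATES, ACTIONS = 5, (0, 1)          # action 0 = step left, 1 = step right
Q = [[0.0, 0.0] for _ in range(N_STATES)]
alpha, gamma, epsilon = 0.5, 0.9, 0.1

for episode in range(500):
    state = 0
    while state != N_STATES - 1:
        # Explore occasionally, otherwise act greedily on current estimates
        action = random.choice(ACTIONS) if random.random() < epsilon \
                 else max(ACTIONS, key=lambda a: Q[state][a])
        next_state = max(0, state - 1) if action == 0 else state + 1
        reward = 1.0 if next_state == N_STATES - 1 else -0.01   # punish wandering
        # Update the estimate from the reward plus discounted future value
        Q[state][action] += alpha * (reward + gamma * max(Q[next_state]) - Q[state][action])
        state = next_state

print([round(max(q), 2) for q in Q])   # values increase toward the goal state
```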

Reinforcement Learning Works On Rewards and Punishments

The reinforcement learning algorithm aims to collect the maximum possible reward to improve its next decision, and it works quite intelligently: it can sacrifice short-term gains if it perceives greater long-term gains in exchange. This technology works best in gaming, robotics, and telecommunications.


May 30, 2018  9:32 PM

What Is Unsupervised Machine Learning And Its Examples

Profile: Jaideep Khanduja
Machine learning

In my previous post, we learned about machine learning and supervised machine learning. Carrying it further, in this post we will learn about unsupervised machine learning and its uses. Machine learning is a subset of artificial intelligence. Supervised machine learning has two important steps, training and inference, with inference happening after the completion of training. The whole mechanism works on an algorithm and datasets, and supervised machine learning has its own algorithms and its own kind of datasets. Its main use is making predictions about the future on the basis of new data. Even though it works on a situation that is yet to take real shape, the prediction process works in a near-to-perfect state. That is the beauty of machine learning and its various applications.

Unsupervised machine learning is applicable in the absence, or lack, of training sets. In such a situation you don’t have any idea what shape the output will take. Unlike supervised machine learning, here all the input data is unlabeled or unstructured; the goal lies somewhere inherent in the data itself. While in the previous category of machine learning we are trying to predict the future with new data, in this case we are trying to understand the present, but without any labeled or structured data. The algorithm, again, has to play an important role in drawing out a meaningful result and output.

Unsupervised Machine Learning Works With Unlabeled Input Data

Unsupervised machine learning algorithms fall into two main categories. The first is clustering, used to find hidden patterns or groupings in data. The second is association, used to find rules that explain parts of the data; for example, people who go to this place also tend to go to that place. An example of an unsupervised machine learning algorithm is K-means clustering; another is Apriori. The former is of the clustering type, the latter of the association type. The best use of these two different categories of machine learning is to use them together: when you combine unsupervised and supervised machine learning techniques, you can easily and effectively use the output of unsupervised machine learning as the training set for supervised machine learning.
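
The combination mentioned above can be sketched in a few lines: cluster unlabeled data with K-means, then use the cluster assignments as pseudo-labels to train a supervised classifier. The data here is synthetic and purely illustrative.

```python
# Sketch: unsupervised clustering feeding a supervised classifier.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
unlabeled = np.vstack([rng.normal(0, 1, (100, 2)),     # one natural group
                       rng.normal(5, 1, (100, 2))])    # another natural group

# Unsupervised step: discover the hidden grouping
pseudo_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(unlabeled)

# Supervised step: train on the pseudo-labels, then predict for new points
clf = LogisticRegression().fit(unlabeled, pseudo_labels)
print(clf.predict([[0.2, -0.5], [4.8, 5.3]]))
```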


May 30, 2018  8:14 PM

Deep Learning Is A Subset of Machine Learning Is A Subset of AI

Profile: Jaideep Khanduja
algorithm, Artificial intelligence, Deep learning, Machine learning

While many think of AI, machine learning, and deep learning as synonyms, they are not. Each is distinct, and so are their purposes and functionality. In fact, there is an interesting relationship between the three in a straight line: you can call machine learning a subset of AI (artificial intelligence), and, similarly, deep learning a subset of machine learning. Let us look at algorithms and models to see the difference. A machine learning algorithm is a series of actions or computations. The best way to understand this is to think of random forest: apply the random forest algorithm to a dataset, and the result produced as output is a model. The model will change if the data changes; another way to change it is to use the same data with a different algorithm.

Basically, two important steps to understand in machine learning are training and inference. Treat training as a process of optimizing the whole mechanism: here, we use an algorithm to derive a mathematical function that minimizes error on the training data. Once training is over, inference comes into the picture; it makes predictions on the basis of new data coming in. Now let us try to understand supervised machine learning. In supervised machine learning, we use algorithms trained with labeled datasets that are highly structured or organized. You use independent variables as input and get numeric or binary results as output. As a result, we use this technology to predict future results on the basis of new inputs.
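
A minimal sketch of this training-then-inference split, using a random forest on a toy labeled dataset, looks like the following; the feature and label values are invented.

```python
# Minimal sketch of training vs inference with a random forest classifier.
from sklearn.ensemble import RandomForestClassifier

# Training: labeled examples -> a fitted model (the "mathematical function")
X_train = [[25, 40000], [47, 95000], [35, 60000], [52, 120000], [29, 42000]]
y_train = [0, 1, 0, 1, 0]          # hypothetical binary label, e.g. "will churn"
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

# Inference: predictions on data the model has never seen before
print(model.predict([[40, 70000]]))
```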

Supervised Machine Learning Is For Future Predictions

Supervised machine learning is of two types. The first is classification, where the output is a category: this or that, good or not good, relevant or not relevant, and so on. The second is regression, where the output is a value, such as dollars or a temperature. Support Vector Machines (SVMs) fall into the first category.


May 25, 2018  10:01 PM

Why Machine Learning Is Important For Enterprise To Adopt Faster

Profile: Jaideep Khanduja
Machine learning

Any enterprise not working with any of these three technologies now will certainly be in trouble tomorrow. The three technologies are artificial intelligence (AI), machine learning, and deep learning, and all three have a deep connection with each other. They are right now the most powerful transformational technologies available, and all three are set to touch most aspects of our lives, with or without us knowing about it. That is the penetration they will have on mankind. As a matter of fact, the combination of AI and ML is taking practical shape in production rather than just demonstrating possibilities in academia and R&D; it has become one of the most sought-after enablers in today’s technology. Industries are working on it with real-life use cases and ROIs, and adoption is increasing at an exponential rate across the globe.

Many startups working in this area are able to integrate machine learning functionality well with business requirements and draw out appropriate results to enhance and automate business processes, and the results are phenomenal. It brings a tremendous increase in the power, availability, applicability, and flexibility of resources. Without adopting these technologies it would probably be impossible to exploit the amount and accessibility of digitized data flowing from various sources. Efficiency can improve despite the increase in complexity and volume of data only because of the machine learning and deep learning algorithms that drive AI. As a result, the future appears more promising with the fast adoption of machine learning in enterprises; companies not adopting it even now will automatically be out of the race soon.

Machine Learning Is Changing the World Faster

In all these circumstances, control, access, and ownership of data are the keys to driving your business.


May 25, 2018  9:11 PM

22 Information Security Projects For An Organization

Profile: Jaideep Khanduja
Artificial intelligence, Data Loss Prevention, DLP, EMM, Encryption, Endpoint security, Enterprise mobility, IDaaS, Incident response, Information security, Intrusion management, Machine learning, MDM, mobile device management, Network vulnerability, Tokenization, vulnerability management

Information security is the utmost priority for any business these days. While there are a number of projects a CIO/CTO/CISO can initiate in the organization, a few are important enough to keep at the top of the agenda. These projects are not one-time activities; they are continuous in nature and basically work on the PDCA pattern: plan, do, check, and act. That means deployment is not the end of the project lifecycle; rather, the real project begins there. Once you deploy any information security project, there is a need for regular audits and enhancement. Technology is changing and progressing fast, and the same applies to its negative side: the more you secure, the more new vulnerabilities keep appearing. As a matter of fact, threats to an organization come not only from the external world; they are equally present inside.

To cope with all these threats and vulnerabilities, there has to be an assessment mechanism in place in the organization. Following is a list of 22 information security projects for an organization. All are critical irrespective of the size and volume of the business; if any are missing, ensure they are put in place.

Information Security Projects

Photo credit: Ardonik on Visualhunt.com / CC BY-SA

  1. Vulnerability Assessment
  2. Data Loss Prevention (DLP)
  3. Mobile Device Management (MDM)/Enterprise Mobility Management (EMM)
  4. Artificial Intelligence/Machine Learning for security
  5. Security Automation
  6. Security Operations Changes
  7. Security Awareness Initiatives
  8. Cloud Infrastructure Security
  9. Cloud Access Security Broker (CASB)
  10. Monitoring Improvements
  11. Patch Management
  12. Multi-factor Authentication
  13. Security Information and Event Management (SIEM)/Security Analytics
  14. Application Security
  15. Firewall Deployment/Management
  16. Regulatory Compliance (e.g. PCI Compliance, GDPR, PSD2, NIST)
  17. Privileged User Management
  18. Incident Response
  19. Intrusion Management
  20. Identity As A Service (IDAAS)/Single Sign-On
  21. Encryption/Tokenization
  22. Endpoint Security

Information Security Projects If Not Started In Time Can Lead to A Big Loss

Another point to note: for the top information security projects currently being implemented within your organization, how do you ensure the key determinants are in place to get approvals in time? Otherwise, your information security projects will remain only on paper and never see the light of day.


May 24, 2018  10:17 PM

Zoho Finance Plus Manages Finances & Operations Including GST returns

Profile: Jaideep Khanduja
Financial

Zoho is probably among those unique businesses where the business model, business benefits, customer value, and product value remain the same irrespective of the size of the customer’s business. Whether it is a single professional, a one-person company, or a multinational, the pricing and all business and support propositions remain the same. There is no disparity, no confusion, and hence no differentiation. That is the beauty of this international Indian company, which has more than 6,000 employees working in more than 19 offices across the globe. The latest news is that its financial suite, Zoho Finance Plus, is 100% GST compliant. In fact, it has a lot more to offer than a number of other popular and tremendously costlier products; basically, it is all about awareness of the beauty of the product. It offers the least investment and great benefits to the business.

Businesses struggling with their existing applications to cope with GST regulations and requirements should have a look at the smooth operations and outcomes Zoho Finance Plus has to offer, and at a very nominal cost. GST is a mandatory regulatory financial requirement for all businesses. The product offers complete integration with legacy business applications. What businesses lack while working on other expensive applications is 360-degree visibility into their order and fulfillment cycle; even most world-class business and ERP suites lack that. That is where this product takes the front seat, ensuring zero accounting errors and a trouble-free tax period. By launching this GST-compliant financial suite in April 2017, well before the official roll-out of GST by the Ministry of Finance, Government of India, Zoho demonstrated its phenomenal strength, depth, and dedication toward its customers.

Zoho Finance Plus Is A Complete Finance Suite++

Zoho Finance Plus, like Zoho’s other products, is cloud-based, so it requires zero capital investment and only nominal operational cost. As Sivaramakrishnan Iswaran, Director of Product Management, Zoho, explains, “The proliferation of smartphones, broadband connectivity, and upcoming GST regimen is a great opportunity for businesses to move their accounting and other operations online. With Zoho Finance Plus, businesses get a beautiful interface to manage their transactions day to day and file their GST returns, all from a single platform. Zoho Finance Plus simplifies returns filing for businesses and increases compliance.”

GST is a business reform rather than merely a tax reform. Being mandatory for any business, the success of its implementation depends largely on the technology infrastructure and the right strategies being in place. Zoho Finance Plus is becoming the first choice of millions of SMEs, empowering them with the right application for invoicing, filing tax returns, and other critical transactions, while being completely GST-enabled and compliant. Basically, no business thrives and survives on a single application; there are a number of financial apps that require comprehensive integration to communicate with each other and exchange data seamlessly. The Zoho financial suite does exactly that, ensuring management and key users have real-time information for taking fast and correct business decisions. Filing a return on the GST portal becomes a matter of a single click.

Zoho Finance Plus Includes Different Modules On Single Database

Zoho Finance Plus includes different modules such as Zoho Books, Zoho Invoice, Zoho Expense, Zoho Subscriptions, and Zoho Inventory. For filing GST returns while using these modules, there is no data duplication across apps and no requirement to add transactions manually. Everything works in a smooth flow, flawlessly. In fact, Zoho Books creates monthly returns automatically; filing the return is just a matter of a click of a button. In addition, there is automatic matching and reconciliation of transactions. So, along with One Nation, One Tax, it becomes One Vendor. There are many other key features of this product, like greater visibility into orders and payments, faster reimbursements, accurate accounting, and so on.

Zoho Finance Plus

The pricing model of Zoho Finance Plus (https://www.zoho.com/in/financeplus/) is quite simple: INR 2,999 per organization per month, which includes 10 users with access to multiple Zoho Finance apps. All these capabilities gave GSTN ample confidence to select Zoho as a GST Suvidha Provider (GSP) (https://www.gstn.org/ecosystem/); in total, there are no more than 70 GSPs in India (https://www.gstn.org/gsp-list/). Iswaran adds, “Being a GSP ourselves helps in cost optimization and providing a great experience for our business users. Furthermore, we will leverage our in-depth expertise in developing platforms and ecosystems to support a thriving community of Application Service Providers (ASP) connecting to GSTN through us.”

In fact, Zoho (https://www.zoho.com/) has made accounting and reconciliation simpler and hassle-free by partnering with banks like ICICI and Standard Chartered to give customers using Zoho Finance Plus an entirely different kind of experience.


May 15, 2018  11:20 PM

Microsoft List of Partners Helping Customers in Their GDPR Journey

Profile: Jaideep Khanduja
DH2i, GDPR, Microsoft Partner Network, Microsoft partners

Recently, Archive360 was included on the Microsoft list of partners helping customers in their GDPR journey. The list is part of Microsoft’s latest blog post, titled “Leverage the Channel Ecosystem to Create GDPR Offers”. GDPR, the General Data Protection Regulation, impacts all organizations across the globe that do any kind of business with EU firms. In this context, I had a discussion with Dan Langille, Global Director, Microsoft P-AE, Archive360 (www.archive360.com). Here are the key points of our discussion:

What were the qualifying criteria for becoming a Microsoft partner in tackling customers’ GDPR issues?

Dan: Partners, such as Archive360, that have services and/or solutions which assist customers with their journeys toward GDPR compliance must be nominated for consideration by a Microsoft employee (usually the Partner Development Manager). Those nominations go to a team Microsoft has within its overall One Commercial Partner (OCP) organization which reviews and approves (or declines) the services and solutions.

What value does it add to an organization becoming Microsoft Partner?

Dan: Microsoft’s investment in curating this list of partners is yet another great example of Microsoft’s commitment to being a partner-led organization focused on collaborating with partners to drive high-value business outcomes (in this case GDPR compliance) through the Co-Sell motion between partner sellers and Microsoft sellers.

GDPR

Photo credit: Cerillion on VisualHunt / CC BY

What additional responsibilities does it bring with this partnership?

Dan: Partners get recognized for inclusion in programs such as this through a company-wide understanding of (and alignment to) Microsoft’s go-to-market priorities. Archive360, as one such partner, we incur no additional responsibilities here other than to have brought to market a qualifying solution that is also listed in Microsoft’s internal OCP Catalog.

With so many partners on board, does Microsoft apply any performance measurement approach for each of their partners?

Dan: Many Microsoft programs do have performance criteria, but this is not one of them due to the complex nature of GDPR compliance and the myriad ways and means for customers to meet their obligations. (As an aside on the number of partners in this program: The list is actually quite small and exclusive relative to the sheer size of Microsoft’s global partner network and relative to the number of companies around the world that are or might be affected by GDPR.)


May 15, 2018  10:50 PM

Maximize Your Business Productivity With Custom Apps

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
apps, Business

Recently I had a discussion with Sejal R. Dattani, Marketing Analyst, Zoho, on custom apps and their impact on business. Are custom apps a costly affair for organizations? According to her, building custom apps for your business is a one-time investment. Below are her valuable views and comments on the subject.

Is your organization using the right tools? For years, companies have provided their employees with packaged software that’s proven and widely used. But business has begun to change. Today, custom apps are quite affordable and easy to build, and more businesses are building their own apps to run their daily activities. Here’s why we expect more businesses to start adopting custom applications in the next few years.

When you depend on the same packaged software as your competitor, it becomes difficult to outrank them. To get an edge, you need to update your processes and implement changes frequently to offer better services.

Custom Apps

Photo on Visual hunt

Since custom apps have become easier to build, even people without a technical background can build software to manage data and automate their processes. And, when you have applications that work exactly the way you want, your teams can react faster to customers’ changing demands.
The time required to develop custom apps has drastically reduced from months to weeks. For example, with a cloud-based DIY platform like Zoho Creator, you can launch your apps without installing new software or configuring servers. And if you need some expert advice, you can always get in touch with certified developers to help you out. What’s more, when you create an application on Zoho Creator, you don’t have to waste your time and money re-building it for various operating systems. Your app automatically works on mobile devices, allowing your team to access vital information and track tasks at any time of the day.

Custom apps are a one-time investment

Businesses nowadays realize that packaged software is rigid: it makes you change your business to fit it. And to make things worse, packaged apps are often incompatible with your existing services, too. Custom apps, on the other hand, let you change them to fit your business and even integrate with your internal applications and other third-party services. For example, think of running a retail store. You can integrate with a logistics service like FedEx and keep your customers informed of their order status.

The number of businesses switching to custom apps will accelerate in the coming years. And judging from the benefits, it’s no surprise. Building a tailored solution that’s focused on scalability and efficiency is an investment for life.


May 6, 2018  6:05 PM

Q&A with Tom Critser, Co-Founder and CEO, JetStream Software

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
AWS, Cisco, Cloud architecture, Cloud service providers, CSP, Data Management, Dell, EMC, HPE, IBM, SanDisk, VMware

One of the hottest trends to emerge in the world of enterprise cloud computing is “multi-cloud data management,” which, in a nutshell, is simply keeping track of data assets that reside across multiple data centers and cloud services. As enterprises increasingly move IT operations to the cloud, ensuring the security, availability, and performance of their applications and data becomes increasingly challenging. Today, I speak with Tom Critser, co-founder and CEO of JetStream Software, about his company launch and cross-cloud data management platform.

Q. Please tell me about JetStream Software.

JetStream Software is a new company, but we have a software engineering team that has been together since 2010, and we’ve invested more than 200 developer-years in our core technology. This April, we announced the JetStream Cross-Cloud Data Management Platform. Our mission is to give cloud service providers (CSPs) and Fortune 500 enterprise cloud architects a better way to support workload migration, resource elasticity, and business continuity across multi-cloud and multi-data center infrastructures. Currently, our platform is designed to complement VMware cloud infrastructures including VMware Cloud Provider Partners (VCPPs) and VMware Cloud on AWS. We are headquartered in San Jose, California, with a second development center in Bangalore, India.

Q. How did the company get started?

Our three co-founders and much of the engineering team have been working together for a long time. Our first startup was FlashSoft Software, which developed software for storage IO acceleration. Our objective was to enable enterprise flash memory in a host server to handle IO operations for “hot data” and to deliver the performance of enterprise flash storage, but without replacing the existing storage of the enterprise. FlashSoft was acquired by SanDisk in 2012, and then SanDisk itself was acquired by Western Digital in 2016. At SanDisk, the team grew in size, and we collaborated closely with VMware to design the vSphere APIs for IO Filters framework, which is a key technology for our new company’s cross-cloud data management platform. After Western Digital acquired SanDisk, we worked with Western Digital to establish JetStream Software as an independent company.

Q. Who is your ideal customer, and what problems are you solving for them?

Our ideal customer is a cloud service provider (CSP), serving enterprise customers, and in a similar way, the enterprise cloud architect. We address two key problems for these customers:

  • The first is to take friction out of the enterprise’s migration of its on-premises virtual machines (VMs) and applications to the cloud. Enterprise cloud migration today is an expensive, hands-on operation, typically requiring a lot of professional services. There are new tools that help organizations plan their data migration and prepare configurations at the cloud destination, but getting huge volumes of enterprise data to the cloud with minimal disruption remains a challenge, and that’s the problem we target.
  • The second problem we address is to help CSPs and private cloud operators deliver enterprise-grade resilience, availability, scalability, performance, and manageability, even across multiple data centers and services.
Q. You say the JetStream Cross-Cloud Data Management Platform provides “built for the cloud” data management capability. What exactly does this mean?

    A lot of the technologies used in today’s cloud data centers were originally designed for a single-owner, on-premises operation. But CSP operations are different, so legacy enterprise data management tools aren’t always a perfect fit for the dynamics of cloud operations, such as efficiently managing resources across multi-tenant services, supporting dynamically changing workload demands, and providing mobility, agility, and recoverability across multi-site operations. Rather than trying to adapt legacy on-premises data management tools and methods to this strange new world, we built our platform from the ground up with these dynamics in mind.

    JetStream Software

    Q. Tell us more about the newest product on the platform, JetStream Migrate. What makes it unique?

    JetStream Migrate is a software product that enables the live migration of virtual machines to a cloud destination. That means that the VMs and their applications continue to run on-premises while their data is being moved to the cloud destination. JetStream Migrate is the first data replication solution for cloud migration to run as an IO filter in VMware vSphere. This design gives the solution some unique capabilities:

  • It supports live migration of applications, even when their data is moved to the cloud via a physical data transport device.
  • It enables live migration without snapshots, which is much better for application performance.
  • It’s fault tolerant, so if interrupted, the data replication process resumes from the point of interruption.
  • Because of the IO filter-based design, it’s a lightweight application that runs seamlessly within a VMware-based data center.
  • It gives the administrator powerful capabilities, including the ability to accurately estimate the time required for data replication and the ability to automate many tasks.
JetStream Software

    Q. There are many solutions for cloud migration, so when would an organization choose JetStream Migrate over other options?

    It’s important to note that JetStream Migrate is specifically focused on ensuring reliable data replication for live migration. It will typically be used in conjunction with other cross-cloud tools, such as VMware vRealize, vCenter and NSX. The technologies are complementary, and they each play an important role in a cloud migration project.

    With respect to data replication specifically, the unique design of JetStream Migrate makes it especially useful when:

  • Live migration is required, but data will be transported to the cloud on a physical device.
  • The data migration network has insufficient or inconsistent latency or bandwidth.
  • The cloud destination is based on vSphere, but the CSP is not running the entire VMware Cloud stack.
  • A lightweight deployment is preferred, at the source, on the network or at the destination.
  • Network reliability concerns and high data ingest fees make a fault-tolerant replication process preferable.
Q. You’re launching with an impressive list of partners. Can you tell us how you’re working with these partners and what about JetStream Software caught their attention?

    Our team has been engaged with VMware for a long time. We previously worked as VMware’s design partner for the development of the APIs for IO Filters framework, so we have been working with these APIs to integrate our products with VMware vSphere for years. Through our partnership with VMware, we’re now also collaborating with Amazon to support the migration of VMs to the VMware Cloud on AWS.

    Because of our history partnering with Dell, EMC, Cisco, IBM and HPE, we’ve also resumed and further developed our partnerships with those vendors, starting with the JetStream Accelerate product, which was familiar to our partners. And because all of these vendors are rapidly developing cloud solution portfolios, we’re discussing our new solutions with them as well.

    Q. JetStream Software appears to have deep technology credentials. What is your history working with enterprises and cloud service providers?

    One of the unique advantages of our particular “startup” is that we’re launching with a full development operation and a technology foundation representing over 200 developer-years of effort invested. We developed and supported a software solution that was deployed at thousands of data centers, both large and small. In doing so, we had a front-row seat to the transition from enterprises operating all-on-premises to the cloud, hybrid cloud, and cross-cloud operations.

    Q. You’ve just officially launched the company and the platform with the newest product on the platform. What can we expect later this year?

    Our JetStream Cross-Cloud Data Management Platform is maturing rapidly. The focus of our first product has been to remove the friction from cloud migration. Our releases in the second half of the year will bring similar advantages to cloud disaster recovery (DR) and cloud-based disaster recovery as a service (DRaaS).

    ###

    About Tom Critser, Co-Founder and CEO, JetStream Software
Tom Critser has more than 20 years of experience growing and launching software companies. Previously, Tom was GM of Cloud Data Center Solutions at SanDisk. Tom was a member of the founding team of FlashSoft Software, which was acquired by SanDisk in 2012. Prior to FlashSoft, Tom was VP of Worldwide Sales and Business Development at RNA Networks, the memory virtualization software company, which was acquired by Dell. Prior to RNA Networks, Tom was the VP of Worldwide Sales and Business Development at Infravio, the SOA management software company, which was acquired by Software AG. Tom graduated from Oregon State University with a BS degree in International Business, where he was a Pac-10 All-Academic Team member. For more information, please visit www.jetstreamsoft.com, @JetStreamSoft and www.linkedin.com/company/jetstream-software-inc/.


    May 6, 2018  5:27 PM

    18 Information Security Pain Points For An Organization

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Information security

I am listing 18 information security pain points that cause quite embarrassing situations in an enterprise. Each of these may cause minor to major losses to an organization. These losses may be in terms of finance, reputation, or business; in fact, all three are deeply connected. These lapses in information security may happen due to a lack of knowledge among internal IT staff, lack of ownership within the technology department, selection of a wrong technology partner, lack of sponsorship from top management, and so on. That is an altogether different topic to explore. Well, here are the top causes listed below:

    1. Compliance related costs/requirements:

At times, your IT department is able to identify non-compliance issues but is not able to get approval for deployment or mitigation. Sometimes it is not even able to assess the mitigation process or its actual cost.

    2. Phishing:

There are various tools to handle it. Ensure that your organization has a strong tool, and ensure that it has the latest patches and updates at all times.

    3. User Behavior:

It is not about policing or trailing users, but keeping an eye on user behavior is important. A user might initiate a wrong practice that causes huge damage to the organization. Another instance is a user who notices something unacceptable but does not report it.

    4. Keeping Up With New Technology:

Keeping up with new technology is of utmost importance to stay away from information security risks. Ensure it is not a blind chase; include some intelligence in it.

    5. Security Awareness Training:

Information security is everybody’s responsibility, right from the top executive to the delivery boy in the organization. Awareness and regular training are important in this regard.

    6. Mobility Security:

A lot of business apps are now moving to mobile devices. Ensure the security factor is addressed during such deployments.

    7. Application Security:

Exhaustive testing is the key to it. Have top-class staff and tools for testing any loopholes or gaps in the information security framework of your organization.

    8. Cloud Security:

The same applies here as for mobility security. Any initiative in this regard has to have in-depth analysis and assessment.

    9. Ransomware:

The whole world is in its grip; it knows no boundaries. Ensure you take appropriate measures, and have regular audits and testing.

    10. Third Party/Supplier Security:

Your external partners and suppliers have to be as secure as you are.

    11. Organizational Politics/Lack of Attention To Information Security:

Engage top management at every step. Make them understand that a small gap may cause huge damage.

    12. Staffing Information Security:

Have background checks done, at least for all crucial positions.

    13. Data Loss/Theft:

    Data is the new currency. Treasure it and protect it as real money.

    14. Accurate & Timely Processing of Security Events:

If you are not processing security events within the organization or for your external stakeholders, start doing it. If you already are, ensure precision and timeliness.

    15. Malicious Software (Malware):

Testing, audits, and evaluation have to form a cyclic process with regular iterations.

    16. Endpoint Security:

All other information security measures will fail if you have a glitch in endpoint security.

    17. Lack of Budget:

If an investment is important and crucial, ensure it happens. Any delay might prove fatal for the whole enterprise.

    18. Firewall/Edge Network Security.

I hope these 18 crucial information security pain points help in understanding the appropriate needs of your organization. Do let me know if I have skipped or missed anything in listing these points.


    April 29, 2018  11:38 PM

    Is Internet Explorer The Most Vulnerable Browser Now?

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data Leakage, eavesdrop, Internet Explorer, Microsoft Edge, Phishing, Ransomware, vulnerability

Has Internet Explorer become the most vulnerable browser? Has Microsoft lost control over it? Or is it that Microsoft no longer focuses on it? Whatever the case, it is no longer safe to use IE, at least in your enterprise environment. In fact, it is now a legacy browser getting little attention from its creator. I think more than two decades is a good run for a browser that once ruled the internet world. Every browser has had issues, for that matter. In terms of security, it is not that there were no threats and vulnerabilities in IE earlier, but Microsoft’s action to tackle them was always prompt. That is not the case now, at least since the launch of Microsoft’s new browser, Edge. Edge is a more secure browser, as the company claims, but it also has certain issues.

    vulnerable browser

    Photo credit: Richzendy on Visual hunt / CC BY

Edge, in fact, is missing a number of legacy capabilities that Internet Explorer had. Even out of the two, it is difficult to say right now which is the more vulnerable browser. But for some valid reason, Microsoft is still installing IE on all Windows operating systems it releases in the market. IE, in fact, is currently the most eligible soft target for attackers. Chinese security firm Qihoo 360 calls one such zero-day vulnerability Double Kill. The company confirms there is an advanced persistent threat (APT) targeting these systems. Qihoo 360 explains Double Kill as follows: it is an IE vulnerability that uses Microsoft Word documents to attack a device. The Word document usually arrives as an attachment. This document is not clean; it contains malicious shellcode. The shellcode prompts IE to open in a background process, and that leads to an attack.

    Internet Explorer has become the most vulnerable browser

The background process then prompts an executable program to be downloaded and executed. In fact, all this happens with the help of Internet Explorer and without any warning to the user on whose machine it is happening. Once this malicious document opens with Double Kill, it immediately takes control of the user’s computer. That is the beginning of a ransomware infection. It also opens the door to eavesdropping and data leakage. That makes IE the most vulnerable browser.


    April 29, 2018  11:26 AM

    How To Handle Secure Shell (SSH) Vulnerabilities?

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data Encryption, IT security, Secure Shell, SSH

Is Secure Shell, or SSH as we call it, completely secure? It has been more than two decades since Tatu Ylonen from Finland realized a strong need for security in online transactions. Realizing that, he created SSH, a powerful protocol for trusted access to anything on the internet. What it does is create trusted access by encrypting all the communication that takes place, which in turn secures it from any attack in transit. So basically, SSH builds a tunnel where every communication gets encrypted, so that there is secure communication between any two points. It was simple yet powerful and, in fact, an immediate need of the online world. Hence, it became popular in no time. As a result, every OS and device vendor made sure to pre-install it in their software or devices; all Unix, mainframe, Mac, and Linux devices had it.

    Secure Shell

    Photo credit: xmodulo on Visualhunt / CC BY

Not only that, most network devices also had SSH, or Secure Shell, built in. The whole story is about access. If it is so strong, then why are there so many cybersecurity incidents? There are various reasons. The first and foremost is that SSH is taken for granted because it comes pre-installed. I don’t think most organizations pay technical attention to monitoring SSH transactions within the organization and with the outside world. Rather, everybody thinks that if SSH is there, it means complete encryption and hence complete security. But who will check for flaws in the system? And what about any customization the organization needs in this regard? Who will manage it? In fact, before you think of managing it, there has to be someone who understands it. As a matter of fact, encryption alone doesn’t ensure 100% protection.

    Secure Shell Needs An Enterprise Wide Technical Attention

When we talk about SSH or Secure Shell, it is basically all about authorized access. The challenge for any organization is to protect its data from illegitimate entities. Let us see what the main risks of SSH are. As we know, there is a private key and a public key to access any data. A public key, in fact, is like a lock and the private key is its key. The lock remains on a door and the key stays in the safe hands of a person. The main risk lies in granting access to critical applications in an organization. If keys are self-provisioned, anybody with the rights to do so can grant access. As a matter of fact, all security tools fail if this happens. The risk increases when people start sharing keys; in those cases, it becomes difficult to catch the culprit after a blunder.

Another high-risk factor in the case of Secure Shell is that its keys have no expiry date. To avoid all these blunders, it is important to have an effective SSH key management mechanism in place. This should include periodic reviews, proper documentation, and appropriate IT controls.
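To make the idea of a periodic key review a little more concrete, here is a minimal Python sketch (my own illustration, not a full key management product). It assumes keys live in authorized_keys files under /home, and it flags any fingerprint that appears under more than one account, which can be a hint of key sharing.

import base64
import hashlib
from collections import defaultdict
from pathlib import Path

def fingerprint(key_b64: str) -> str:
    # SHA256 fingerprint of the base64-encoded public key blob, as ssh-keygen prints it
    digest = hashlib.sha256(base64.b64decode(key_b64)).digest()
    return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

def collect_keys(root: str = "/home") -> dict:
    # Map each key fingerprint to the accounts whose authorized_keys contain it
    seen = defaultdict(list)
    for auth_file in Path(root).glob("*/.ssh/authorized_keys"):
        account = auth_file.parent.parent.name
        for line in auth_file.read_text().splitlines():
            parts = line.split()
            if line.startswith("#") or len(parts) < 2:
                continue
            try:
                seen[fingerprint(parts[1])].append(account)
            except Exception:
                continue  # skip option-prefixed or malformed lines in this simple sketch
    return seen

if __name__ == "__main__":
    for fp, accounts in collect_keys().items():
        flag = "POSSIBLY SHARED" if len(accounts) > 1 else "ok"
        print(f"{fp}  {flag}  used by: {', '.join(accounts)}")

A real programme would go further and also track key age, provisioning approvals, and rotation, but even a report like this makes the periodic review tangible.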


    April 28, 2018  9:30 PM

    Data-Driven Cloud Skills Development For Your Enterprise?

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    cloud implementation, Cloud market

Data-driven cloud skills development is becoming critical for any enterprise today. Whether you are in the cloud already or in the process of moving, you definitely need to upskill your technical staff accordingly. In fact, even if you currently have a completely on-premise model, you need to understand the technology from its core. Keeping that in mind, Cloud Academy has announced the general availability of training plans in this field. These training plans aim to help CIOs, CTOs, and technical managers chalk out target skillsets, assign training materials for cloud transformation, and handle competence assessment, certification, upskilling, and onboarding. Any training has to have a purpose. In addition, it should be measurable; one should be able to assess progress and development. That is where training plans come into the picture. With their help, technical leads can map training well to business needs.

    Data-driven cloud skills development

Not only that: data-driven cloud skills development helps you define target skillsets for your technical staff, because it then becomes easier to assess each individual’s competence. It also helps you ascertain whether any customization is needed for any individual, so that you can assign each individual a unique set of training materials and draw out results at scale. As we all know, public cloud implementations are increasing at an exponential rate. In fact, a recent study estimates the global public cloud market will touch $200B in 2018; in 2017, it was around $150B. That is a minimum growth of 20 percent annually, and that is on the lower side; in reality, it would definitely be higher. Even if cloud adoption is at a nascent stage, there is a serious need to prepare organizations for its challenges, risks, and vulnerabilities.

Data-driven cloud skills development helps in many ways

Data-driven cloud skills development not only prepares you for handling upcoming challenges but also makes you conversant with the volume and variety of options available in the market. Before any kind of deployment, it becomes necessary to map it well against the complexity of your IT environments.


    April 28, 2018  1:09 PM

    Zoho Creator Empowered With Mobile App, Page, and Workflow Creator

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Creating mobile app, Mobile app, mobile app development, page, Workflow

Before coming to the re-launch of Zoho Creator, I would like to talk about some unique aspects of the organization Zoho. Zoho calls itself the operating system for business. This became apparent with the launch of Zoho One, a complete suite of applications for an enterprise. I think this is the only company that has at least one product to cater to every major category of business: sales, marketing, accounting, customer support, and so on. Everything that this company creates for its customers is built in-house; no collaborations and no outsourcing. Interestingly, Zoho offers many of its products free of cost to its customers, and none of these products have any kind of ad-revenue model. It has more than 30 million users across the globe working in hundreds of thousands of companies. In fact, Zoho runs its complete business using its own products.

    Zoho Creator

    Page Live

The new avatar of Zoho Creator comes with a mobile app creation feature. Zoho Creator, as we know, is a low-code application building platform. What that means is that you now have Mobile App Creator, Page Creator, and Workflow Creator all together while creating powerful web and mobile apps. This is, in fact, a major update to the product; the latest version is Creator 5. There is a complete transformation in the core functionality of the tool, and a significant enhancement and refinement not only in its core functionality but in all other modules. Now, developers can design and develop mobile apps in their own native, custom manner. The Zoho One Admin Panel supports large deployments and manages apps created with Zoho Creator in a simple manner. This, in fact, provides a unified interface to deploy and manage a complete suite of enterprise applications.

    Zoho Creator with Zoho One Provides A Unified Enterprise Interface

    Zoho Creator is more than a decade-old product. Hyther Nizam, VP of Business Process Products, Zoho Corp. says, “Over the last 12 years, the Zoho Creator platform has enabled citizen developers to design and deploy over two million custom applications. Having refined our low-code, no-code approach to app development, Zoho Creator has become the app builder for those of us with no formal programming and deployment experience. With this update, we are raising the bar yet again by enabling mobile app creation for both smartphones and tablets, no programming skills required. On top of this, all two million existing Zoho Creator applications are now automatically mobile-enabled.”

    Nizam concludes saying, “Think about that for a second. Web applications built on Zoho Creator 12 years ago—before mobile operating systems like iOS or Android even existed—are automatically mobile-enabled and ready for deployment on smartphones and tablets with no effort on the user’s end.”


    April 27, 2018  4:53 PM

    VMware Research Reveals A Wide Gap Between CIOs and End Users

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Business applications, Enterprise mobility, Mobility, VM mobility, VMware

The latest VMware research, in collaboration with Forbes Insights, reveals a wide gap between CIOs and end users with regard to the acceptability and applicability of business applications in the enterprise. As a matter of fact, the business app is becoming a point of dispute in the Asia-Pacific region. Basically, it is not the quality of the business app in place; rather, it is its deployment and availability across various platforms that is becoming a pain point among the employees of an organization. An app could be feature-rich, but if you are not able to access it from anywhere at any time, it becomes a drag on productivity. On the other hand, an app could be available on all platforms but not rich and stable enough to support them all. That kind of scenario also creates a high level of disappointment.

The third angle to this is slightly different: the app is stable and runs properly on different platforms but is not as feature-rich as end users expect. VMware is a leader in cloud infrastructure and mobility. This particular VMware research was conducted in the APJ (Asia-Pacific and Japan) region. The purpose of the study was to interpret the impact of business apps on business and end users in terms of performance, acceptability, and usability. On one hand, most of the organizations in APJ are speedily implementing business apps; on the other hand, there is a high level of dissatisfaction among their employees. While the CIOs claim that the apps are the right fit for the organization, the employees or end users feel that they are causing a loss of productivity. The employees believe that the apps are incapable of meeting their business requirements or of creating new business opportunities.

    VMware Research in APJ highlights

The VMware research clearly states there is a wide gap between CIOs and end users. This gap needs immediate attention. It is important to create the right kind of atmosphere for productive collaboration and employee satisfaction. The title of the research is “The Impact of The Digital Workforce: The New Equilibrium of the Digitally Transformed Enterprise”. The study covers more than 2,000 global CIOs and end users of large enterprises across Australia, Japan, India, and China. The focus of the study is on the availability and accessibility of business apps and how they impact work and business. Very few end users are happy with the current situation.


    April 27, 2018  1:48 PM

    Digital Marketing Is Not A Subset Of Enterprise Marketing

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    digital marketing

Digital marketing has a close connection with customer experience. In fact, the sole aim of organizations adopting digital marketing is to optimize customer experience. This, as a matter of fact, is a major shift, because businesses are talking about customized and personalized customer experiences. According to KPMG, more than 30% of businesses belonging to the world’s largest brands are increasing their budgets for online and mobile advertising. Similarly, around 35% of these businesses believe in personalizing the customer experience to gain more traction. For this, the best way is to merge services with technology, because without technology it is not possible to achieve. Buying a product is not a primary concern anymore for a consumer; the demands on this front are changing. It is all about meeting their increasing expectations. Brands need to narrow the gap between themselves and their consumers. The whole paradigm is changing quite fast.

    Digital Marketing

    Photo credit: mkhmarketing on Visual Hunt / CC BY

In fact, customer experience includes the high quality of the product that consumers buy. It also includes a meaningful conversation between a brand and its consumers. Loyalty, these days, is not easy to gain; it needs more transparency, closeness, promptness, and deeper concern. As Olivier Njamfa, CEO, Eptica says, “Consumers are ever-more demanding, and expect fast, high quality and informed conversations with brands if they are to remain loyal.” Thus, when we talk about digital marketing, it is all about developing a brand’s strategies with a high focus on the customer. The greater a brand’s trustworthiness, the higher its revenues. Businesses are talking about digital footprints and digital impressions. A good amount of past data can provide an in-depth analysis of past successes and of how to leverage them now to deliver a rich customer experience. It now has more to do with analytics.

    Digital Marketing and Customer Experience Go Hand in Hand

As a matter of fact, enterprises are keeping their generic marketing separate from digital marketing; the strategies of the two are entirely different. Therefore, businesses need to improve their current framework and map it well to create a highly effective digital marketing funnel that ensures an enriching customer experience. It is, in fact, time to discover an entirely different approach with an aim to re-optimize business infrastructure in order to cater to online consumers and drive a drastic improvement in customer adoption. Before executing a digital implementation, it has to have a solid base of relevant business cases. The whole purpose is to increase opportunities to build a flawless multichannel customer experience framework.


    April 26, 2018  8:31 PM

    Top Barriers To Using Machine Learning In An Organization

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Machine learning

What would be an organization’s top barriers to using machine learning? Machine learning, big data, IoT, Industrial IoT, etc. are the newest trending technologies. These are becoming, in fact, a global phenomenon; rather, they are becoming a necessity for businesses to thrive. The businesses that are not thinking in these terms now will have a tough time tomorrow. Like any new technology, these also have their own advantages and constraints, but early adopters will definitely have an edge over their competitors in time to come. The same applies to individuals: technology professionals, CIOs, and CTOs who are not thinking about applying these technologies in their organizations will have a tough time tomorrow. They will have to answer to their management for not doing it at the right time. As a matter of fact, these technologies will emerge as big differentiators between the success and failure of competitors.

What comes to mind when we think of the top barriers to using machine learning in your organization? To me, the top factors could be data, skill, applicability, budget, deployment, and approval of top management. All these points are deeply connected with each other. The power to drive this lies with the technology leadership in your organization. You, being the technology leader, have to be the frontrunner in preparing business cases and presenting them well to top management for approval. Obviously, the key lies in your hands; you alone have the power to open this mysterious lock and show the possibilities to your management. Now, let us look at these barriers one by one and see how to handle them. Accessing and preparing data is the core of machine learning, IoT, and IIoT.

    It is Important to Identify Top Barriers To Using Machine Learning

An organization needs clarity on the source, volume, relevance, and applicability of data. The actual task, in fact, begins once you are able to identify the top barriers to using machine learning. The mere availability of humongous historical data will not suffice; it has to be in proper shape. As a matter of fact, preparing data is the bigger task in this direction. You won’t be able to allocate a budget without clarity about your objectives. Finding relevant skills in the market, or developing them from your current pool, is the next big decision you need to take. Running a pilot and showing good results to management is fine, but are you ready to deploy it in production? Are your operational systems ready for this? Finally, top leadership’s sponsorship and engagement is important.
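As an illustration of the data preparation barrier, here is a minimal, hypothetical pandas sketch. The file and column names are assumptions for illustration only; the point is that raw historical data usually needs de-duplication, correct typing, and gap handling before any machine learning can begin.

import pandas as pd

# Hypothetical historical extract; the columns below are assumptions for illustration.
raw = pd.read_csv("machine_history.csv")

clean = (
    raw.drop_duplicates()                                      # remove repeated records
       .assign(timestamp=lambda df: pd.to_datetime(df["timestamp"], errors="coerce"))
       .dropna(subset=["timestamp", "machine_id"])             # rows unusable without these fields
)
clean["temperature"] = clean["temperature"].fillna(clean["temperature"].median())

print(clean.dtypes)
print(f"Dropped {len(raw) - len(clean)} unusable rows out of {len(raw)}")

Even a small script like this makes the budget conversation easier, because it shows how much of the historical data is actually usable.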


    April 26, 2018  7:39 PM

    Some Business Cases for Machine Learning And its Applicability

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    business case, IIoT, Industrial IoT, Internet of Things, iot, Machine learning

What would be an organization’s top reasons for exploring machine learning? Well, it depends on the business vertical in which that organization works, but a few business cases apply to all. Workforce utilization and optimization, for example, could be one of the top reasons for any kind of industry. In the current scenario, every industry thrives on performance. Market forecasting, for instance, is always a top agenda item for any production and service industry, especially the ones that are consumer-centric. A small glitch in market forecasting could cause huge damage to an industry, not only in terms of finances but also in terms of reputation. Similarly, there are industries that depend on various other kinds of forecasting. Take weather forecasting: if it goes wrong, farming and the other businesses depending on it can go haywire.

    Business Cases for Machine Learning

    Photo credit: WSDOT on Visual Hunt / CC BY-NC-ND

In fact, any kind of forecasting needs to have a high level of accuracy; without that, the system will not survive for long. It will lose its sanctity and credibility. The same is true for marketing analysis: a wrong kind of marketing analysis may result in wrong product recommendations and offers. Some more business cases for machine learning are Logistics Analysis, Physical Security, Price Prediction, Supply Chain, Cyber Security, Advertising, Healthcare, Scientific Research, Clinical Research, Preventive Maintenance, Customer Service, Customer Support, Fraud Detection, Social Network Analysis, Communication Analysis, and so on. This is not all; there are ample use cases. And, in fact, for a specific industry there will be specific business cases in addition to the generic ones. As a matter of fact, while applying any of these business cases in your environment, it is important to chalk out the significant benefits your organization would draw from it.

    There are ample business cases for Machine Learning

When you think of business cases for machine learning, think of the results or gains after deployment. These could be an improvement in customer experience, an increase in sales, a gain in competitive advantage, a reduction in errors and mistakes, a reduction in risks and vulnerabilities, faster responses to opportunities and threats, faster risk mitigation, and lower costs and expenses.


    April 25, 2018  10:16 PM

    NAKIVO Backup and Replication v7.4 With Automated VM Failover

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Backup, Hyper-V, Nakivo, Replication, VM

With the release of Backup and Replication v7.4, NAKIVO achieves a new landmark. The key feature is Automated VM Failover for near-instant disaster recovery. In fact, there are a number of new features, like Automated VM Failover, Self-Backup, File Recovery to Source, and so on. NAKIVO Inc. is a dynamically evolving virtualization and cloud backup software enterprise. Starting its operations in 2012, this US-based organization aims to produce excellent data protection solutions for Hyper-V, VMware, and various cloud environments. Its double-digit growth for 20 consecutive quarters demonstrates its strength in the product line it caters to. Within a short span of five years of its existence, it has more than 10,000 deployments across the globe. The organization banks on its post-deployment support to its customers; it is not easy to achieve more than 97% customer satisfaction without a stringent focus on customer support.

With all this, NAKIVO (www.nakivo.com) is undoubtedly among the fastest-growing global organizations in the data protection spectrum. Its customers include China Airlines, Honda, Microsemi, and Coca-Cola, to name a few. Operating in 124 countries and with around 3,000 channel partners across the globe, the organization is spreading its wings fast. The latest Backup and Replication v7.4 comes with 11 new and unique features. The main purpose of these new features is to simplify the existing disaster recovery mechanism and increase convenience and comfort for customers. Some of the prominent features are listed below:

  • Automated VM Failover:
  • Businesses can’t afford any downtime in their mainstream applications and infrastructure; in fact, zero downtime is what businesses demand from their vendors. NAKIVO Backup and Replication v7.4 aims to achieve this goal for its customers by helping them restore their systems in case of disaster without undesirable delays. Automated VM Failover replicates VMs to the DR location and then runs a single failover job. With the help of this feature, businesses can transfer their workloads to the DR location without losing critical business time; this near-instant feature minimizes downtime. The fully automated process uses re-IP rules and network mapping, so failover happens without any manual intervention and there is no need for manual reconfiguration of replicas.

  • File Recovery to Source:
  • Though this feature is in the Beta stage, it holds a lot of promise. It takes care of recovering accidentally deleted or corrupted files to their source VMs or to a different location. This, in fact, doesn’t require recovery of the entire VM first, which makes the whole recovery process fast and accurate. The recovery is performed from deduplicated VM backups.

  • Enhanced AWS EC2 Instance Backup:
  • NAKIVO Backup and Replication v7.4 protects AWS EC2 instances if so desired. By enabling and configuring this feature, it stores the backups onsite or in the cloud as per the requirements of the business. With the launch of this version, the number of recovery points per EC2 instance is scaled up to 1,000. That brings a high degree of reliability and different recovery options to suit the business environment.

  • Bandwidth Throttling:
  • The product can run jobs at the highest speed the available bandwidth allows, thus optimizing the whole process of backup and replication. If the network administrator limits the bandwidth used by data protection processes during peak hours, the product applies such limits on a per-job basis, which leaves sufficient bandwidth for critical business applications. (A generic sketch of this throttling idea follows this feature list.)

  • Self-Backup:
  • This is a fantastic feature of the NAKIVO Backup and Replication system. In case there is a failure of the VM or the physical server running the backup software, a new instance can be installed in no time; as a matter of fact, it takes less than a minute. Keeping in mind that reconfiguring all the backup settings manually is a time-consuming task, v7.4 tackles this complex situation in a simple manner. It automatically backs up the complete set of settings that exist in its web interface and saves these self-backups in the available backup repositories. The moment it senses the installation of a new instance, it imports all the previous settings from the backup repository instantly. This includes schedules, preferences, jobs, and inventory.

  • Global Search:
  • We are all aware that manually finding a VM, replica, or job in a large virtual environment is a humongous task; it is time-consuming and painful. NAKIVO Backup and Replication v7.4 has a Global Search feature that helps find any job, repository, or transporter in a convenient manner. In fact, it is not only about finding an item; it also helps to perform individual or bulk actions straight from the search results in an instant and easy manner.

  • Flash VM Boot for Hyper-V:
  • Flash VM Boot helps in many ways. It can near-instantly boot VMs directly from deduplicated and compressed VM backups, thus decreasing downtime tremendously. These VMs, in fact, can be utilized for various useful tasks like sandbox testing. This release extends Flash VM Boot support to Hyper-V VMs.

  • Screenshot Verification for Hyper-V:
  • The mere availability of VM backups or replicas does not ensure their recoverability. NAKIVO Backup and Replication brings a surety factor to this. To ensure VMs can be recovered, it includes a Screenshot Verification mechanism: after the completion of every backup or replication job, it can automatically test the recovery of the VM and take a screenshot of the booted OS. This screenshot can be handy for reporting purposes. The feature is now available for both Hyper-V and VMware VMs.

  • Log Truncation for Microsoft SQL Server 2017:
  • The product saves space and thus infrastructure, hardware and upkeep costs. It not only saves space in the backup repository but also truncates transaction logs of Microsoft SQL Server on the source VM. Actually, this truncation happens automatically after the successful completion of each VM backup or replication job. This, in turn, helps in saving a lot of disk space and thus avoiding a server crash.

  • Instant Object Recovery for Microsoft SQL Server 2017:
  • As the title of this feature suggests, it allows a quick recovery of tables and databases. This recovery happens from deduplicated backups thereby not requiring the full VM recovery first. This definitely saves a huge amount of critical time of the business.

  • Live Chat with Technical Support:
  • NAKIVO Backup and Replication v7.4 has a live chat feature that lets customers connect with the technical support team instantly in case of any need.
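To illustrate the per-job bandwidth throttling mentioned above, here is a generic Python sketch of the idea (my own illustration, not NAKIVO’s implementation): a simple token bucket limits how many bytes per second a copy job may move, leaving headroom for business traffic.

import time

class TokenBucket:
    """Release at most rate_bytes per second, allowing short bursts up to capacity."""
    def __init__(self, rate_bytes: int, burst_bytes: int = 0):
        self.rate = float(rate_bytes)
        self.capacity = float(burst_bytes or rate_bytes)
        self.tokens = self.capacity
        self.last = time.monotonic()

    def consume(self, n: int) -> None:
        # Block until n bytes' worth of tokens are available, then spend them.
        while True:
            now = time.monotonic()
            self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= n:
                self.tokens -= n
                return
            time.sleep((n - self.tokens) / self.rate)

def throttled_copy(src: str, dst: str, rate_bytes: int, chunk: int = 64 * 1024) -> None:
    # Copy src to dst, never exceeding roughly rate_bytes per second.
    bucket = TokenBucket(rate_bytes)
    with open(src, "rb") as s, open(dst, "wb") as d:
        while data := s.read(chunk):
            bucket.consume(len(data))
            d.write(data)

# Example: cap a backup copy at roughly 10 MB/s during business hours (paths are hypothetical).
# throttled_copy("backup.img", "/mnt/repo/backup.img", rate_bytes=10 * 1024 * 1024)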

    Bruce Talley, CEO, NAKIVO Inc. says, “We always take our customers’ feedback into account. That’s why NAKIVO Backup and Replication v7.4 introduces cutting-edge features that we know our customers want to protect their virtual environments even more efficiently. We also work to make the product more user-friendly and convenient.”

    A fully-functional free trial of NAKIVO Backup and Replication v7.4 can be downloaded at www.nakivo.com.

    RESOURCES

    · Trial Download: /resources/download/trial-download/

    · Success Stories: /customers/success-stories/


    April 24, 2018  11:41 PM

    Machine Learning Is A Subset of Artificial Intelligence

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Artificial intelligence, Machine learning

The key task of Artificial Intelligence (AI) is to make your machines and devices think and thus act intelligently. This happens with the help of software that makes machines and devices think and perform like humans. That is why it is not a misnomer to call machine learning a subset of AI. ML’s main focus stays on using data and the respective algorithms, because with the help of these two it keeps learning and predicting. Organizations’ bend towards AI and ML initiatives is thus not a myth; it is happening, and it is happening for a reason. There are plenty of ML frameworks for enterprises to choose from, like Caffe, Spark ML, Theano, Torch, Keras, TensorFlow, and many more. Similarly, there are a number of ML tools like Jupyter, OpenCV, NumPy, Beaker, Pillow, pandas, Zeppelin, and scikit-learn.

    Machine Learning

    Photo credit: WSDOT on VisualHunt / CC BY-NC-ND

For creating a machine learning environment, an organization needs to depend on various data sources. These could include customer data, employee data, data sourced from data brokers, data from various government and non-government sources, location resources, market resources, and social media. This is how big data comes into consideration; to manage such a humongous volume of data, you need a different set of tools and environment. There are various data brokers in the market these days providing voluminous data, including Acxiom, Experian, Oracle Data Cloud, and DataSift. Similarly, when we talk of government sources, it could be census data, for instance. Location resources like TomTom and Google Maps are also quite popular these days. Social media data sources are Facebook, Twitter, LinkedIn, Google+, Instagram, Pinterest, etc. Non-government data sources would include weather and environmental data, and then there are market sources like SymphonyIRI, Nielsen, Reuters, and so on.

    Machine Learning Is a Revolution In Industry 4.0

Machine learning can be helpful in various ways. For instance, it can help in workforce utilization and optimization. But that is not all; there are a lot more business cases for using ML in an organization.
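To make the learn-and-predict loop concrete, here is a minimal sketch using scikit-learn, one of the tools named above. The bundled iris dataset simply stands in for real enterprise data such as customer or market records.

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)                 # the "learning" step: fit the algorithm to data
predictions = model.predict(X_test)         # the "predicting" step on unseen data
print("accuracy:", accuracy_score(y_test, predictions))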


    April 22, 2018  7:35 PM

    Modern Requirements Management Tool – Modern Requirements4TFS

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    CompTIA, Microsoft, TFS

I can build multiple use cases to depict the importance of Modern Requirements Management Tool – built on and for Microsoft Team Foundation Server. But before talking about Modern Requirements4TFS, let us understand what the biggest threat to the success of a project is. It is the business or user requirements. A minor ambiguity in requirements finalized at the initial stage of a project can create a major threat to the project at a later stage. PMI (Project Management Institute) research says there are only 3 things that can kill a project: People, Process, and Communication. There are 7 factors that lie within these 3 segments, and a lapse in any of the 7 can cause a significant delay or failure. PMI says, “Provide the project team members the tools and techniques they need to produce consistently successful projects.”

That shows the importance of a good tool like Modern Requirements4TFS in the success of a project: a tool that covers people, process, and communication to keep a strong hold on the project throughout its lifecycle. “Most project problems are caused by poor planning,” says CompTIA. And in most of those cases, it is the requirements that play havoc. Traditional requirements management tools have failed drastically in capturing today’s dynamic requirements crisply. Requirements captured only in text leave a lot of room for loopholes and ambiguities. That is why you need a tool that captures requirements with visual contexts and use cases. This not only helps in capturing the requirements clearly and unambiguously but also helps in setting a faster pace of development for the project. Project Insight states the top four causes of project failures are:

1. Lack of Visibility of all projects
2. Lack of Clarity of Project Objectives
3. Lack of Visibility into Resource Workload
4. Gaps in Communication
Team Foundation Server from Microsoft is popularly known as TFS. To use it to its full potential and draw the maximum benefit for requirements management, you need an intelligent tool, and Modern Requirements Management Tool is one of those. It is uniquely integrated into Microsoft Team Foundation Server and VSTS. Modern Requirements4TFS gives you access to a series of new hubs and features. It is a web-based tool with multiple modules that empower you to have full control of requirements management. With its help, you can define and manage requirements quite easily and run the show efficiently. It integrates with Microsoft Team Foundation Server and Visual Studio Team Services, giving you a great level of control on the Microsoft platform. The complete setup can be arranged as per your requirement: it can reside on-premise in case you have limitations on the cloud, or else it can reside in the cloud.

    Requirements4TFS

Many organizations building for Microsoft platforms use Modern Requirements4TFS for requirements management. In fact, Modern Requirements Management Tool – built on and for Microsoft Team Foundation Server – is the most appropriate tool in such an environment because all requirements are stored natively as work items in TFS (Team Foundation Server) / VSTS (Visual Studio Team Services). Many organizations also use TFS for tracking development work elements, testing, and release management. In a nutshell, Modern Requirements users get the following features and functionalities:

  • They can define requirements visually with diagrams and models. This makes things clearer than the textual methods. It becomes better and crisper with the help of use cases and mock-ups.
  • With the help of these diagrams and use cases, users can automatically create test cases and test/business scenarios.
  • By changing, adding or removing new requirements, users can analyze the impact in a real kind of environment.
  • Documentation becomes easier at every stage. Users can automatically create custom-formatted documents in Word or PDF.
  • The inbuilt process flow empowers users to initiate review and approval requests online. For compliance purposes, these requests and approvals are verified with e-signatures.
  • Audit trail helps in performing traceability analyses. It helps in maintaining high quality. It also helps in finding out any gaps or dependencies.
  • Managing project scope becomes easier by enabling users to create, edit, view, and compare baselines. It, in fact, also helps in tracking any changes.
The success of any project lies on three key factors: open communication, effective collaboration, and the use of intelligent systems. It is a tool like Modern Requirements4TFS that enables these three factors, helping any organization achieve effective results in project management and team collaboration. Sharing your plans visually acts as a catalyst in the process of presentation and understanding, and thus reduces risks drastically. On top of that, the tool improves scope management, change management, and project quality.


    April 10, 2018  10:32 PM

    DxEnterprise v17.5 and Its Relevance to Enterprise IT Professionals

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    DH2i, Enterprise IT

DH2i, a leading provider of multi-platform high availability/disaster recovery software, just released Version 17.5 of its DxEnterprise software. To gain a better understanding of this software and what it offers to enterprise IT users, I recently sat down with Don Boxley, co-founder and CEO of DH2i (www.dh2i.com).

    What key issues are you addressing with the v17.5 release of DxEnterprise?

    Boxley: At the core, the issue comes down to how enterprises can achieve digital transformation: IT teams are under pressure to do more with limited resources. Businesses and other enterprises are increasingly dependent on data, so they require high availability – as little as 10 minutes of downtime can be disastrous. Finally, with sprawling infrastructures, IT resources are strained.

    To overcome these issues, enterprises need to think strategically about integrating legacy infrastructure. For the majority of today’s enterprises, the primary obstacle is with the legacy infrastructure. It is expensive to maintain, both financially and in terms of required labor. This is the most pressing issue facing today’s enterprises.

    How might v17.5 help with the legacy infrastructure constraint enterprises face?

    Boxley: DxEnterprise, and more specifically, version 17.5, alleviates the legacy infrastructure issue with an application-based approach. It supports an industry-first unified Windows/Linux automatic failover and fault detection. The company initially had a Windows focus. This new release builds on earlier DxE versions, allowing management and servicing of Linux, with automatic Windows/Linux failover for SQL Server 2017. It features a single-console Windows/Linux management. SQL Server 2017+ users benefit from this multi-platform environment as it allows them to move workloads and data to and from any cloud. They can also use it to scale cloud-based data analytics and business intelligence.

Another key component of DxE v17.5 is that it enables users to create a new class of distributed frameworks which allow workloads to move to the best execution venue, based on computational and budgetary considerations – we call this Smart Availability. This often means fewer operating system environments are required and reduces time spent on system maintenance. Ultimately, it frees IT professionals to spend their time on higher-yield activities that impact the bottom line.

    You talk about Smart Availability, as opposed to high availability. Can you describe the difference and what it means for the IT user?

    Boxley: High availability refers to overall uptime, while Smart Availability is an evolved, strategic approach in the direction of that same general goal. Smart Availability decouples databases, Availability Groups, and containers from the underlying infrastructure, and hence allows workloads to move to their best execution venue. High availability alone is often counterproductive: it simply adds to the infrastructural complexity without regard to the overall objectives. Smart Availability instead adapts to the overall business objectives and constraints.

    Are there any other applications you see being created by this new release?

    Boxley: The single cross-platform service, with its built-in HA capabilities, will be useful to managed service and public cloud providers. They’ll be able to increase recurring income by offering this service to customer applications – previously we saw many of these providers leaving it to the customer to ensure high availability. With this release, providers can include high availability as a service.

    We’ve also included enhanced features such as InstanceMobility for dynamic Smart Analysis workload movement, and intelligent health and QoS performance monitoring. These help ensure DxE v17.5 cuts costs, simplifies IT administration, and frees the IT team up to do the most impactful work in the enterprise.


    March 28, 2018  9:39 PM

    Performance Monitoring of Application and Infrastructure is Critical

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    application, Application performance, Big Data, iot, Machine learning, Performance monitoring, SaaS

    Performance monitoring has become a critical factor for all business applications running in an enterprise, for several reasons. Firstly, no application functions in isolation; there is always a dependency, backward or forward. It is either pushing data to another application or pulling data from one. Infrastructure, too, is not outside the scope of performance monitoring. Everything has to be in sync, because it is the overall performance that matters to the organization. So even if your infrastructure is modern, state-of-the-art, and performing at rocket speed, it loses its value if the applications residing on it are under-performing. Let us have a look at the rising trends in performance monitoring for both applications and infrastructure. The key contributors are Big Data, Machine Learning, IIoT, etc., and the SaaS delivery model plays a significant role as well.


    Overall, industrial architecture is not as simple as it was a few years back. This is the age of complex applications: containerization, microservices, and heterogeneous clouds to tackle data overloads are becoming critical. Data, in fact, is flooding in from all directions, and it is very important to analyze it. There are several approaches to a proper performance monitoring mechanism, and it is necessary to learn about each: code-level APM (Application Performance Monitoring), Network Performance Monitoring (NPM), performance testing, Real User Monitoring (RUM), and synthetic monitoring. Code-level APM is a good tool to report load time and response time; in fact, it smartly figures out the lines of code in the application that are causing trouble. New technologies obviously require new approaches. For instance, containerization and microservices require tracking a tremendous amount of data to ascertain performance.
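    As a rough illustration of code-level response-time measurement, here is a minimal Python decorator that times a function and flags slow calls. Real APM agents do far more (distributed tracing, sampling, per-line attribution), and the 200 ms threshold below is an arbitrary assumption.

        # Minimal sketch of code-level response-time instrumentation.
        # Real APM agents add tracing, sampling, and per-line attribution.

        import functools
        import time

        SLOW_THRESHOLD_MS = 200   # arbitrary example threshold

        def timed(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                start = time.perf_counter()
                try:
                    return func(*args, **kwargs)
                finally:
                    elapsed_ms = (time.perf_counter() - start) * 1000
                    flag = "SLOW" if elapsed_ms > SLOW_THRESHOLD_MS else "ok"
                    print(f"{func.__name__}: {elapsed_ms:.1f} ms [{flag}]")
            return wrapper

        @timed
        def load_dashboard():
            time.sleep(0.05)   # stand-in for real work
            return "rendered"

        load_dashboard()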

    Performance Monitoring Needs Better Tools

    Looking at the complexities of performance monitoring, vendors offering APM and similar services include machine learning methodologies to attain optimum results in data mining and to generate important information. After all, it is performance that matters most in an organization. And it becomes the responsibility of the IT department to ensure that no employee's performance suffers because of a poorly performing application or piece of infrastructure. Usually, it is somebody at the top of the technology department who owns this responsibility; in fact, this is the person who is answerable for any performance issues.


    March 28, 2018  12:41 PM

    Chatbots Can Minimize Operating Cost And Maximize Productivity

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Artificial intelligence, Chatbots

    An increase in the applicability of Artificial Intelligence (AI) in real life is responsible for the development of Chatbots. In fact, the technology has reached a maturity level where it is able to engage a prospective customer quite significantly. There is a steep growth estimate for the Chatbots industry from 2015 to 2024: the valuation, according to a report from Transparency Market Research, stood at US$100 million and is about to touch US$1000 million by 2024. That is a phenomenal jump in all respects. On a similar note, a PwC research paper states AI will be contributing US$20 trillion to the world economy by the year 2030. That signifies a tremendous potential for this market in coming years. The drastic drop in data rates is one of the key contributors to this phenomenal growth. Alibaba Group is investing a large pool of its profits in R&D, especially AI.

    Note that Alibaba's sales are currently growing at a rate of 20%, amounting to US$100 million; these figures are from a report by Market Realist. Such stupendous growth in the Chatbots industry is changing the whole paradigm of the business model. New avenues are coming up in infrastructure, cost-saving alternatives, and business transactions. As a matter of fact, Artificial Intelligence (AI) is contributing significantly to business development in almost all industry segments, and it is producing quite positive results in terms of business growth. It is turning out to be a highly beneficial proposition; Chatbots are becoming a streamlined companion to everyday operations. It is, therefore, important to understand how you can augment market size through Chatbots. If you are able to do that, you can easily minimize operating cost and maximize productivity.

    Chatbots Industry Sees Tremendous Growth in AI

    As a matter of fact, Chatbots are a clear example of a new form of collaboration between manpower and machine. That means if you are able to harness the power of artificial intelligence in business automation, especially with Big Data, it can increase your operational efficiency manifold.


    March 27, 2018  9:15 AM

    Digital Technology and Digital Transformation Go Hand In Hand

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    digital technology, Digital transformation

    What is digital transformation? Different enterprises define it in their own way, and the chances of going wrong are directly proportional to how far you stray from the right definition. Basically, it is how you integrate digital technology into all areas of your business. The transformation will lead to a drastic change in how you operate, and it will also change the manner in which you deliver value to customers. That means when it happens at your end, your customers will also experience a huge transformation in the way they do business with you. It demands a big cultural change, not only within the organization but also around it, affecting all stakeholders in one way or another.

    Digital transformation involves digital technology extensively. In fact, a large amount of mobility will enter your day-to-day functioning. It also demands continuous change, involving a lot of experimentation, failures, and successes. The best way to get the most out of it is to keep challenging your status quo; the more you challenge, the more ideas you get to improve. Digital transformation is important for all kinds and sizes of businesses. So whether you are a small business or a large enterprise, the importance remains the same: it is important to adopt in order to stay competitive in the market. If you don't take appropriate steps in this direction, your competitors will leave you behind in no time. It also keeps your relevance intact. But it is not merely moving to the cloud, as a lot of business leaders think.

    Digital Technology Is A Lot More Than Moving To Cloud

    It is important for an enterprise or a small business to understand digital technology and digital transformation correctly. What specific steps do they need to take? What changes in job profiles might happen? What new jobs come into existence? What could be the right framework to start with? Do you need a consultant to start? What changes in business strategy will happen? And most importantly, what is it really worth? What are you gaining out of it? All of this is very important to understand.

    “Digital technologies continue to transform the work, how we interact with colleagues, and the value we deliver to clients and customers,” says Asoke Laha, CEO, InterraIT at the event ‘The Future of Digital Transformation’ at their Noida office. “This means all decision making is data-driven, and leadership must focus on providing insights into marketing and customer engagements,” he concludes.


    March 25, 2018  7:06 PM

    Is Cloud Lessening Data Breaches Burden On Enterprises?

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    cloud, Cloud adoption, Information security, security breach

    There are a few things to notice about data breaches. Enterprises are preferring cloud over on-premise for less critical applications, which means information security trends are changing noticeably. More important, though, is to understand whether the cloud is driving a shift in security spending, and whether that shift is upward or downward. Studies reveal security budgets are rising consistently across the globe. A portion of the credit should go to the grand publicity given to security breaches, especially vulnerabilities like Spectre and Meltdown. These are unanticipated risks that could take a bigger bite than the IT budget you keep aside for security. In a nutshell, security has become one of the top two budget components, the first one being the cloud, and it might take the top slot in time to come. Despite all constraints, security budgets are moving up, and that trend is visible in companies of all sizes.

    Even though cloud service providers maintain their own security controls, internally or through third parties, security remains a topmost concern of businesses. Rather few enterprises depend solely on their cloud service providers to raise an alarm on data breaches. According to one report, organizations allocated, on average, almost 20% of their IT budget to information security in 2018. Around 5% of organizations say their spending on information security will be lower in 2018 than in earlier years, but that is negligible; the remaining 95% are spending as much as, or more than, the previous year. Organizations are spending more on application security than on hardware and network security. Security spending trends are changing drastically, which shows the substantial impact of cloud on this spending. Testing and performance are becoming two major thrust areas in cloud environments for the identification of data breaches.

    Data Breaches Are One Of The Biggest Threats

    Such proactive approaches to testing will decrease enterprises' reliance on cloud vendors, because they would get automatic alerts before their cloud vendors notify them of data breaches.


    March 22, 2018  9:38 PM

    Io-Tahoe CEO Shares Insights on Smart Data Discovery

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Big Data, data discovery

    Io-Tahoe has just announced the General Availability (GA) release of its smart data discovery platform. I sat down with Oksana Sokolovsky, CEO of Io-Tahoe, to better understand the data challenges facing modern enterprises, and how Io-Tahoe is attempting to address them.

    You’ve just announced the GA launch of the Io-Tahoe platform. What challenges are you hoping to address with it?

    Sokolovsky: I founded Rokitt Astra (Io-Tahoe) in 2014, together with Rohit Mahajan, our CTO and CPO, with the goal of providing the go-to platform for data discovery. The modern digital enterprise faces a complex set of challenges in maximizing the business value of data. For one, enterprises struggle with how to integrate a growing number of disparate platforms, with a formidable volume of data stored across databases, data lakes, and other silos. This makes it difficult or impossible for organizations to comprehensively govern, and ultimately utilize enterprise data.

    How does Io-Tahoe address these challenges?

    Sokolovsky: We built Io-Tahoe with the goal of providing a fundamental building block for all data discovery. This vision entails making data available to everyone inside the organization and automatically weaving through the data relationship maze to provide actionable insights to the end user.

    The platform is built on a machine learning base. It uses machine learning to identify data relationships, including within both metadata and the data itself. It operates in a “platform agnostic” manner and allows organizations to uncover data resources across diverse technologies.

    The platform enables a variety of disciplines in the data field – from analytics to governance, management, and beyond. It also commits to a leverageable view of data – data insights should be available to everyone in an organization. This is made possible through an easy-to-use interface, built on a scalable architecture.

    We’ve also included a new Data Catalog that allows organizations to compile or enhance data information so it can be leveraged across the organization.
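    The machine learning behind the platform is proprietary, but the general idea of inferring relationships from the data itself can be pictured with a toy sketch: compare the value overlap between columns from different sources and flag likely joins. The columns and the cut-off below are made up; this is not Io-Tahoe's actual method.

        # Toy sketch of flagging a likely relationship between two columns
        # from different sources by value overlap. Not Io-Tahoe's algorithm.

        def overlap_score(col_a, col_b):
            """Jaccard overlap between the distinct values of two columns."""
            a, b = set(col_a), set(col_b)
            return len(a & b) / len(a | b) if (a or b) else 0.0

        crm_customer_ids  = [101, 102, 103, 104, 105]
        billing_cust_refs = [103, 104, 105, 106]

        score = overlap_score(crm_customer_ids, billing_cust_refs)
        if score > 0.3:   # arbitrary cut-off for this example
            print(f"possible relationship (overlap {score:.2f})")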

    What does the future have in store for data discovery?

    Sokolovsky: In two words: dramatic growth. A recent report from MarketsandMarkets (1), for instance, predicts data discovery market expansion from $4.66 billion, its 2016 estimated size, to $10.66 billion by 2021. This represents a compound annual growth rate of nearly 20 percent. Most of this growth will be in Europe and North America, with retail services, financial services, and utilities as three of the largest opportunities.

    The primary foundation of this demand is the increasing need for data-driven decision processes, but other factors are also driving this explosion. A few we've identified include regulatory pressures such as GDPR; the rise of intelligent technology, which brings predictive analytics into smart computing; the shortage of qualified data scientists; the explosion of available data and the increased demand for understanding it; the monetization of data assets; and the unification of data platforms and management.
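    As a side note, the "nearly 20 percent" growth figure quoted above is easy to sanity-check from the two market sizes, assuming a five-year window from 2016 to 2021:

        # Quick sanity check of the quoted compound annual growth rate.
        start, end, years = 4.66, 10.66, 5     # market size in US$ billion
        cagr = (end / start) ** (1 / years) - 1
        print(f"{cagr:.1%}")                   # about 18%, i.e. "nearly 20 percent"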

    It sounds like it’s perfect timing for your release of the Io-Tahoe platform. Can you explain why this launch is so exciting for the end users?

    Sokolovsky: I’ll be glad to. The GA launch of our data discovery platform is opening our unique algorithmic product to all enterprises. The machine learning aspect will allow them to auto-discover patterns and relationships in their data, and the Data Catalog promises to guide data owners and stewards through business rules and data policy governance.

    For example, it can automatically uncover data across the entire enterprise in a matter of minutes, rather than weeks. This reduces labor costs and allows organizations to tap into potentially valuable data.

    It also offers self-service features, empowering the end users to engage and share data knowledge. The Data Catalog feature, in particular, enables users to govern data across heterogeneous enterprise technologies, comply with regulations such as GDPR, and automate the previously manual process of data discovery. This will increase efficiency and use of enterprise resources.

    How about a use case – can you give us a clearer picture of what Io-Tahoe looks like in practice?

    Sokolovsky: Sure – we’ve actually developed three representative use cases to illustrate how customers could use Io-Tahoe. First, the systems use case: the platform can help them understand data lake and database migration. It can also help with system migration, modernization, as well as M&A system integration/divestiture. Second, the data analytics use case: this comprises analytics improvement, increased revenue potential, and improvement of complementary products. Third, the regulatory use case. The Io-Tahoe platform can assist with data governance, as well as regulatory compliance.

    Has Io-Tahoe already seen an application?

    Sokolovsky: It has. We have multiple successful examples to share with you. First, a customer used Io-Tahoe’s platform for data discovery and impact analysis as part of its re-platforming efforts. The customer’s analysis time was cut to a third, and cost decreased by 80 percent, with dependencies well managed and accounted for.

    A major investment bank used Io-Tahoe for data asset discovery and appointed a new Chief Data Officer (CDO) to manage data assets. The organization reported similarly positive results, with the data discovery process becoming automated, reliable, and less labor intensive. This freed staff, including the CDO, to focus on analytics.

    It sounds like it’s the perfect timing for Io-Tahoe. Do you have any last words or thoughts to share?

    Sokolovsky: I want to emphasize that we’re excited about the opportunity to use our technology to address growing, real-world challenges with data discovery. Few of our competitors are addressing these issues. Enterprises require effective and comprehensive access to their data, regardless of where it’s stored. They require data governance and compliance with regulations, along with a deeper view and understanding of data and data relationships. Hence, we believe Io-Tahoe may soon be a priority purchase for every CDO.

    (1) Data Discovery Market by Type (Software and Service), Service (Professional and Managed), Application (Risk Management, Sales & Marketing Optimization, and Cost Optimization), Deployment, Organization Size, Vertical, and Region – Global Forecast to 2021. marketsandmarkets.com. January 2017.


    March 19, 2018  4:16 PM

    Doing Business In India? Learn From Industry Experts

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    India

    It was quite an insightful day, meeting industry leaders, academicians, and students together. The event was at InterraIT; it was an in-house event, but the point of discussion was no less than a topic of global relevance: doing business in India. We had Dr. AD Amar, Professor at Seton Hall University, USA. He came along with a team of young, energetic, and business-minded students from his university. Industry expert and veteran Asoke K Laha, President and CEO, Interra Information Technologies, was the key person guiding the students on business practices in India and the US. He has complete knowledge of both worlds, with offices and operations in both countries. That is what makes him a perfect choice to guide these budding entrepreneurs, hailing from various countries and studying at Seton Hall University.

    It was not only the students from Seton Hall University who were eager to learn about doing business in India. We also had a group of students from IIF (Indian Institute of Finance) along with their professor, who is an active member of ASSOCHAM (The Associated Chambers of Commerce and Industry of India). He was thus carrying a wealth of information about the practicalities of opening and closing a firm in India. Opening a company is now quite an easy process in the country; it hardly takes a few minutes, in fact, and the whole process is online. The only hiccup that may haunt an entrepreneur is the process of closing a company in India. It still takes a number of years and the involvement of the High Court for the closure formalities. That is still a grey area.

    Doing Business In India Has Become Easier

    The overall motive should be to facilitate young entrepreneurs from other countries who want to do business in India. As Mr. Laha says, learning about the culture and the people matters most here.


    March 13, 2018  1:07 PM

    Digital Healthcare Can Create Lean And Effective Medical Ecosystem

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    digital, Healthcare, Lean

    Digital healthcare is not a dream now. It is happening and evolving across the globe. Though the speed of evolution may vary from country to country, every country acknowledges its potential, hence the speedy adoption. It can help in creating not only a lean ecosystem but also an effective one. Given the manner in which industrial technology is advancing with the internet as its core backbone, pharmaceutical and medical technology can achieve astonishing results. In fact, it is leading to a system with many benefits: a less labor-intensive architecture will be a major outcome, and the whole mechanism promises to be cost-effective. It will be an overall lean operating model for health institutions across the globe. The healthcare market is about to touch US$130 across the globe. By adopting digital technology, it is bound to create a new paradigm.

    If you belong to this field, you must attend the Digital Healthcare Conference happening in May 2018 in Bangkok. The health and pharmaceutical sector, with the help of the digital movement, can definitely do wonders. It can happen with the help of cloud technologies, IoT (Internet of Things), AI (Artificial Intelligence), Big Data, VR (Virtual Reality), mobility, and automation. With the right adoption of technology, the sector can optimize its precision, efficiency, and speed to an unbeatable level. Recently, there was a survey by Microsoft in this regard, the Microsoft Asia Digital Transformation Survey, which clearly states the importance of medical technology in everybody's life. It states that more than 75% of leaders in the healthcare segment in the Asia Pacific understand the importance of transforming into a digital business, which is going to play a key role in future growth.

    Digital Healthcare is the Solution to many issues

    A modern-day wellness provider can't think of survival without the adoption of digital healthcare. In fact, there has to be a complete healthcare mechanism that promises to deliver seamless health services, and that can happen only with a proper synchronization of physical, biological, and digital systems. This is the only way to tackle critical health issues across the globe. Obviously, this needs a proper training process that educates people on the changing trends of medical technology. It is important for end consumers to be able to leverage these latest trends seamlessly; otherwise, the whole effort will go to waste.


    March 11, 2018  11:19 PM

    Cloud Service Provider Market Is Marking A New Trend

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    cloud

    Enterprise cloud design is undergoing a transformation, and so is the cloud service provider market. Enterprises are becoming more data-hungry; the core target is to enhance their data analysis capabilities. Businesses prefer to migrate their workloads to the cloud. New businesses, especially, don't prefer setting up their own on-premise server systems. Services like PaaS (Platform as a Service), IaaS (Infrastructure as a Service), and CaaS (cloud hosting as a service) are gaining momentum at a faster pace. Organizations are understanding that a good cloud service provider will have better performance and security provisions. Actually, it is largely a matter of cost: pay a good price and you get whatever you require. Where businesses fail in the cloud is a different story; those seeking cheaper solutions gradually land in a situation that compromises their data. Public cloud, or IaaS, is the most popular cloud service.

    In fact, enterprise cloud transformation is a two-way process, a kind of knowledge enhancement at both ends. There are generalist cloud providers, capable of building and operating public cloud platforms at a good scale; some of the best examples in this class are Google, Microsoft, and AWS. On the other hand, we can now clearly see a specialist cloud service provider market evolving at a faster pace. These specialists help enterprises understand and consume cloud services effectively and intelligently; they have expertise in this field. So, basically, there is a distinct line of separation between public cloud and CSP-owned cloud. The latter would be able to manage complex environments in a better way. And definitely, enterprises will have to think of multi-cloud environments. Enterprises should always demand the delivery of a breadth of products and services under a single banner.

    Cloud Service Provider Must Offer a Strong Relationship

    A smart cloud service provider would deliver a larger bouquet of products and services either with its own in-house capabilities or through partnerships with strong vendors.


    March 11, 2018  10:39 PM

    Are Enterprises Active For Data Analysis and Business Intelligence?

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Big Data, Business Agility, Business Intelligence, Data Analytics

    With the inclusion of social media data, enterprises have no dearth of data available for data analysis and business intelligence. In fact, it is their internal issues that stop them from doing so. There is a high risk of falling behind in the race if organizations don't actually perform it. The fact is that many organizations still don't value it; they are either unaware that this is a big lever for thriving in business, or they don't know how to run the show. And even those who jump into the fray are not all doing it right: they might have understood its power, but they are not cashing in on it properly. Identification of the right data is important, because any organization has data at multiple locations and in different forms. It could be lying in relational and non-relational databases.

    As a matter of fact, it is very important for organizations to be aware of what data they possess and where it all resides. Unless they are clear about that, they can't turn it into immensely useful business outcomes through data analysis and business intelligence. Enterprises moving in the right direction are gaining great insights about their business, customers, and competitors. That is one of the reasons cloud platforms are gaining preference over on-premise storage: wherever their data lies, and in whatever shape, it has to be at their disposal. Cloud providers are adding a lot of value to attract enterprises; any cloud provider with better data management capabilities is in higher demand than the generalists. In addition, there are third parties specializing in this field.

    Proper Data Analysis Provides An Extra Power To An Enterprise

    Obviously, to ensure proper data analysis, an enterprise has to understand various data fabric functionalities.


    March 10, 2018  11:46 PM

    Cloud Adoption Starts With a Cloud Revolution At The Top-Level

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    cloud, Cloud adoption, Cloud Computing

    Is it the right time for enterprises to speed up cloud adoption? Is it safe to leave the boundaries of the on-premise model and jump to the cloud? We all know that innovation is the key today to performing better and thus outpacing your competitors, and any delay in innovation is going to cost, at times heavily. But along with all this, is this the right time to take chances without focusing on commitments? Can you sacrifice the latter at any cost? Is it not going to create a high risk to the business? Mass customization is the trend these days, as is flexible pricing, and you may have to leave regular trends behind to achieve your organizational goals. The cloud revolution, in fact, needs top-level support to thrive. If the top level lacks confidence, nothing can help an organization.

    Cloud adoption promises to deliver higher flexibility and better control to an organization. In fact, it brings in many variations along with different combinations. An organization's burden, as a matter of fact, decreases: it takes less energy to manage the cloud provider, hardware vendors, and system integrators. All it needs is to select the right partner. With this, an organization's running, operational, or recurring cost also comes down; for instance, you need fewer resources, less space, and less infrastructure now. It is all a matter of a change in culture and mindset to go for it. The cloud-procurement model is achieving maturity and hence giving higher confidence to organizations. A good way to gauge the level of your cloud adoption is the extent of on-premise infrastructure visible in the organization: the higher the former, the lower the latter. The two are, in fact, inversely proportional to each other.

    Cloud Adoption Is Gaining Higher Momentum

    As I say, it is all a matter of mindset. Otherwise, risks and breaches happen within the four walls of an organization despite all control mechanisms.


    March 6, 2018  10:07 PM

    NAKIVO Backup & Replication: Advanced VM Backup Software

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    AWS, Virtualisation, VMware

    An interview with Bruce Talley, CEO and co-founder of NAKIVO, Inc. (https://www.nakivo.com)

    Q: Please tell us about NAKIVO in two words.

    BT: NAKIVO was founded in 2012, offering customers a reliable VMware backup and replication solution. Over the years, we have developed into an efficient multinational company headquartered in Sparks (Nevada, USA) with several offices worldwide. We now have a versatile product for VMware, Hyper-V, and AWS EC2 environments, used globally by thousands of companies. Our success is confirmed by 5-star online community reviews and 97.3% customer satisfaction with support.

    Q: What are your company’s goals and priorities?

    BT: With the amount of critical data circulating in the modern business environment, data protection is the highest priority for everyone. That is exactly what lets NAKIVO take such a strong position in the market. Over the five years of our company’s operation, we have been focused on ensuring the protection of business-critical data for customers with virtualized, cloud-based, and hybrid environments. Our primary goal is to provide everyone with a powerful, affordable, easy-to-use solution that helps ensure the protection and recoverability of business data 24/7. NAKIVO’s 85% YoY revenue growth, as well as our customer base almost doubling in 2017, clearly demonstrates that a growing number of companies understand the necessity to abandon outdated legacy backup software for modern backup solutions, such as NAKIVO Backup & Replication.

    Q: These are rather impressive numbers. Could you tell us a little more about your product and its features? What exactly makes you stand out?

    BT: NAKIVO Backup & Replication is a high-performance solution for AWS EC2, VMware, and Hyper-V backup that also provides near-instant disaster recovery with VM replicas. The key principle we have built our product around is "the best data protection value for the money". While NAKIVO Backup & Replication comes at a considerably lower price tag than our competitors' solutions, the product offers a comprehensive set of features to meet everyone's needs, standing tall among market leaders. Our solution is fast, scalable, and user-friendly.

    NAKIVO Backup & Replication works out of the box. The product can be installed in under 1 minute on Windows or Linux, as well as deployed as a pre-configured Amazon Machine Image or VMware Virtual Appliance. Moreover, if installed on a NAS, our solution can turn the device into an all-in-one VM backup appliance. Currently, we support NAS devices by ASUSTOR, QNAP, Western Digital, and Synology, but in the future, we plan to extend the list by adding other well-known vendors.

    NAKIVO Backup & Replication enables you to take full advantage of agentless, image-based, application-aware backup and replication options, decreasing the overall cost of your backup operations by 50% or more.

    Incremental backups, LAN-free data transfer, and built-in network acceleration can help you achieve a 2X increase in the backup speed. Automated exclusion of swap data, global deduplication, and backup compression can reduce the size of your VM backups by several times.

    Now add the abilities to verify your VM backups with screenshots and instantly recover any VM, file, or application object. As a result, you get a powerful solution which is much faster and less expensive than the alternatives. NAKIVO is truly customer oriented. We act on our customers’ feedback and continue improving NAKIVO Backup & Replication to remain the #1 VM data protection solution on the market.
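    To picture in general terms how the deduplication and compression mentioned above shrink backups (this is a generic illustration, not NAKIVO's implementation), consider chunking a data stream, storing each unique chunk only once, and compressing what is stored.

        # Generic illustration of block-level deduplication plus compression.
        # Not NAKIVO's implementation; fixed-size chunks keep the example short.

        import hashlib
        import zlib

        CHUNK_SIZE = 4096

        def dedup_and_compress(data: bytes):
            store = {}     # chunk hash -> compressed chunk, stored once
            recipe = []    # ordered hashes needed to rebuild the stream
            for i in range(0, len(data), CHUNK_SIZE):
                chunk = data[i:i + CHUNK_SIZE]
                digest = hashlib.sha256(chunk).hexdigest()
                if digest not in store:
                    store[digest] = zlib.compress(chunk)
                recipe.append(digest)
            return store, recipe

        # A highly repetitive "disk image" dedupes and compresses very well.
        data = (b"A" * CHUNK_SIZE) * 100 + (b"B" * CHUNK_SIZE) * 100
        store, recipe = dedup_and_compress(data)
        stored_bytes = sum(len(c) for c in store.values())
        print(f"original: {len(data)} bytes, stored: {stored_bytes} bytes")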

    Q: How many customers use NAKIVO Backup & Replication? Also, can you name some of your key customers?

    BT: Over 10,000 companies throughout the world are now using NAKIVO Backup & Replication in their virtual environments. We are extremely proud of the fact that many of them were able to immediately see the advantages of using our product over our competitors’ solutions right after testing our full-featured free trial.

    China Airlines is one of our key customers. With two data centers, over 900 VMs, and more than 60 VMware ESXi hosts, their critical data must stay protected at all times in order to ensure the continuity of business-critical operations. By choosing to forego a legacy solution in favor of NAKIVO Backup & Replication, they reduced their weekly backup time by 10 hours, storage space consumption by 60%, and VMware backup budget by 30%.

    The Center for Scientific Computation and Mathematical Modeling (CSCAMM) at the University of Maryland has also greatly benefited from switching to NAKIVO Backup & Replication. Faculty, staff, and students required 24/7 access to research materials through their website. They also needed a powerful backup, replication, and recovery solution. With NAKIVO Backup & Replication, their backup processes now take no more than an hour. What’s more, the instant file recovery feature has managed to save them from potential disaster on multiple occasions. We are proud that CSCAMM could reach these results with our help.

    These are just several examples. Our customer base includes other major global companies, among them Honda, Coca-Cola, and Microsemi.

    Q: What is NAKIVO’s business model?

    BT: NAKIVO aims to be completely channel-based. At the moment, we have over 2,400 channel partners and a significant number of distributors in 124 countries worldwide. By partnering up with NAKIVO, they gain large discounts, sales training, deal registration, and regular promotions to drive sales.

    Q: And what about your support model?

    BT: If you have a problem or any question, you are welcome to contact our Customer Support Center. Also, if you are experiencing some technical difficulties, you can send us a support bundle right from the NAKIVO Backup & Replication web interface. The support bundle contains logs and system information, which is everything our specialists may need to help you resolve the issue as quickly as possible.

    Q: What are your plans for the future?

    BT: NAKIVO is already one of the fastest-growing companies in the industry, and we don’t intend to settle. We plan to do better and go further. Our plans for the following years are to expand our market presence and focus on large enterprises by adding new highly-demanded features.


    March 3, 2018  11:33 PM

    Zia Voice from Zoho Is The First Ever Conversational AI for CRM

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    CRM

    Now you have a personal assistant whom you can call 24×7 for anything regarding CRM. Zia Voice comes from Zoho as a super-strong companion to your Zoho CRM; in fact, it is the first conversational artificial intelligence (AI) tool for sales teams. Along with it, you also get a hyper-customization platform for enterprises. This interactive voice and chat tool has intelligent predictive capabilities and also manages email sentiment in a unique manner. This is, basically, a framework over which enterprises can create their own internal apps for their global teams, helping to create a universal and transparent system. For organizations using Zoho CRM, Zia is not a new entity; the only new thing is its voice avatar. So, in fact, Zia gets its voice right on its first birthday: exactly a year before, on 28th February 2017, Zia came into existence.

    Zia Voice is basically a boon for sales executives who are on the move most of the time. In today's highly competitive environment, nobody can afford to lose time; even customers demand a near real-time response in most cases. So Zia can now proactively announce deal closures and analyze email sentiment intelligently, something that would be very time-consuming if done manually. Thus it is definitely going to enhance your efficiency and timely decision-making. The release comes along with Catalyst, a capability that empowers your teams to create customized mobile and web apps that integrate well with Zoho CRM. Teams can also create additional capsules or modules for Zoho CRM for internal use. The whole development resides on the existing Zoho framework and infrastructure.

    Zia Voice Is a Powerful AI tool for Zoho CRM Users

    That means Zia Voice and Catalyst are bound to transform businesses in a unique fashion. They will not only help in deepening focus on business but also enhance overall efficiency. Mani Vembu, COO, Zoho Corp, says, “We are delivering the first conversational AI for CRM, with Zia Voice. Zia’s enhanced AI capabilities will now help salespeople sell smarter, with contextual assistance and access customer information through a powerful voice and chat interface.” It is basically an interactive voice and chatbot that can quickly answer your everyday queries in real time. These queries could be on new leads, periodic forecasts, traffic status, conversion rate, average deal revenue, and so on. The conversation happens in the form of simple chat and voice messages. Here is a small demo of Zia Voice:
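    Separately from the demo, the underlying idea can be pictured with a deliberately simplified, hypothetical sketch: map a natural-language question to a CRM field lookup. The data and names below are made up and are not Zoho's actual API.

        # Hypothetical sketch of mapping chat queries to CRM lookups.
        # The data and handlers are made up; this is not Zoho's Zia API.

        CRM = {
            "open_leads": 42,
            "conversion_rate": "8.5%",
            "forecast_this_month": "US$120,000",
        }

        INTENTS = {
            "leads": "open_leads",
            "conversion": "conversion_rate",
            "forecast": "forecast_this_month",
        }

        def answer(question: str) -> str:
            q = question.lower()
            for keyword, field in INTENTS.items():
                if keyword in q:
                    return f"{field.replace('_', ' ')}: {CRM[field]}"
            return "Sorry, I don't know that yet."

        print(answer("How many leads do we have open?"))
        print(answer("What's the forecast for this month?"))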


    February 27, 2018  9:43 PM

    Five IT Infrastructure Approaches To Mission-Critical Applications

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    IT Infrastructure, Legacy applications

    Let us assume that you have a number of legacy mission-critical applications serving your organization for various purposes. Currently, all of these are on-premise. That means you have your own data center and servers where these applications and their corresponding databases reside. You are solely responsible for their upkeep and uptime. So, everything is under your control right now; in other words, the complete headache is yours. That includes patch management, upgrades, releases, availability, backups, restorations, security, and infrastructure. Assume the whole setup needs an overhaul. The legacy applications are not serving the business well. It could be any, all, or some of the issues such as performance, availability, upkeep, security, operational costs, skilled manpower, or a heterogeneous environment. The business needs an overhaul because things are getting out of control. In such conditions, what will your decision be? Before taking a decision, let us see what possibilities you have.

    Now since these are mission-critical applications, a quick decision is important; you can't let the business suffer because of a delay in the decision. But, on the other hand, the decision has to be the best one because it is a one-time investment. So, here are the various options you have to explore:

    1. The first option is to RETAIN. That means keeping things on an as-is basis. Of course, this is not possible here.

    2. You can think of MODERNIZATION. That means retaining all the legacy mission-critical applications on-premise but, at the same time, exploring a move to some modern application or a change in the infrastructure architecture. This also doesn't seem to be a good solution.

    3. You can think of what they call Refactor and Shift. That means you redesign the existing applications for a cloud-native environment and then deploy them on an off-premise cloud system.

    4. The next option is Repurchase and Shift. In this case, you replace your current mission-critical applications with a Software-as-a-Service model, or replace them with their cloud counterparts. Either way, you move from an on-premise to an off-premise cloud environment.

    5. The last option is Lift and Shift. Ideally, this is the best option, and you need to explore it in the most feasible manner. In this model, you migrate existing on-premise mission-critical applications to an off-premise or cloud environment, but with minimal changes in the code or logic. (A rough decision sketch follows this list.)
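    To make the trade-off concrete, here is the rough decision sketch mentioned above. The scores and weights are arbitrary assumptions, not a formal methodology; with these particular numbers, Lift and Shift comes out on top, in line with the discussion above.

        # Rough illustration of weighing the five options against constraints.
        # Scores and weights are arbitrary assumptions, not a methodology.

        OPTIONS = {
            # option: (effort, risk, long_term_fit), each on a 1 (low) .. 5 (high) scale
            "Retain":               (1, 5, 1),
            "Modernize on-premise": (3, 4, 2),
            "Refactor and Shift":   (5, 3, 5),
            "Repurchase and Shift": (3, 3, 4),
            "Lift and Shift":       (2, 2, 3),
        }

        def rank(options, w_effort=1.0, w_risk=2.0, w_fit=2.0):
            """Lower effort and risk are better; higher long-term fit is better."""
            def score(vals):
                effort, risk, fit = vals
                return w_fit * fit - w_effort * effort - w_risk * risk
            return sorted(options, key=lambda name: score(options[name]), reverse=True)

        for name in rank(OPTIONS):
            print(name)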

    Now, let me know if you have ever been in this kind of situation with your mission-critical applications, and which model you chose. Otherwise, imagine you are the one who has to take the call: which option would you go for?


    February 27, 2018  7:13 PM

    Unpatched Applications Are Like An Open Invitation For Security Breach

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    patch, Patch management, patching, security breach

    I think no lessons have reached enterprises from the famous 2017 Equifax security breach, where the cause was easy penetration through an unpatched application. 70% of enterprises worldwide are still living with this vulnerability. Merely banking on IT staff will never resolve the issue; there has to be an alert mechanism that works without fail, on time, and proactively. Certainly, you are safe as long as you are not the target, so you can live happily with a safety tag on your job even if you are not deploying patches immediately on their release. Honestly, very few CIOs/CTOs would be able to answer how many of their applications are acting as an open invitation to hackers because they are running without the latest patches. Of course, the more patches you skip, the more vulnerable your organization becomes. It just needs one hole in the balloon.
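    An alert mechanism doesn't have to be elaborate to be better than nothing. Here is a minimal sketch, assuming you keep an inventory of deployed components and the minimum patched version for each; the inventory below is made up, and a real check would pull versions from the hosts themselves and from a vulnerability feed.

        # Minimal sketch of flagging components that are behind a patched version.
        # The inventory is made up; a real check would query hosts and CVE feeds.

        DEPLOYED        = {"struts": "2.3.31", "nginx": "1.18.0", "tomcat": "9.0.70"}
        MINIMUM_PATCHED = {"struts": "2.3.35", "nginx": "1.20.1", "tomcat": "9.0.70"}

        def parse(version: str):
            return tuple(int(part) for part in version.split("."))

        def unpatched(deployed, minimum):
            return [name for name, ver in deployed.items()
                    if name in minimum and parse(ver) < parse(minimum[name])]

        for name in unpatched(DEPLOYED, MINIMUM_PATCHED):
            print(f"ALERT: {name} {DEPLOYED[name]} is below {MINIMUM_PATCHED[name]}")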

    Do you have a vulnerability disclosure mechanism in your organization? Mostly it is the IT department that decides new application deployments: what, where, and how. In a few organizations it is a mutual decision between business and IT, and in very few does the line of business take the decision. What about your organization? Who is responsible for a security breach? IT? Business? Or nobody? I think, in the next couple of years, there will be a big transformation in most organizations: non-IT functions will have more influence on an organization's applications and workload management. I am sure most of you might not agree with this statement. But those who agree will have to think about which specific non-IT business function it will be. It could be HR, R&D, Operations, Finance, Legal, Sales, Marketing, or Executive Management.

    Security Breach is a top concern for most businesses

    Security breaches are, of course, a matter of serious concern today for most organizations across the globe.

