Quality Assurance and Project Management


March 31, 2019  9:05 PM

Pearl Zhu: Fabulous Quotes From Digital Hybridity @pearl_zhu

Jaideep Khanduja
Digital transformation

“Persuasion with good intention helps problem-solving; manipulation often serves the ego.”

“Coherence improves business flow; resilience makes business robust and anti-fragile.”

“Taking the multidimensional hybrid models for going digital is all about how to strike the right balance of reaping quick wins and focusing on the long-term strategic goals.”

“It is in the best interest of the talent management to improve employee performance by striking the right balance of ends and means.”

“The challenge of digitalization is to have a harmonized vision and build a customized structure to enforce open communication and collaboration.”

“A hybrid organizational structure can bring greater awareness of intricacies and systemic value of organizational systems, processes, people dynamics, technology, and resource allocation, etc.”

“Digital businesses and their people learn through their interactions with the environment, to keep knowledge flow as well as business flow, and strike a delicate digital balance.”

Pearl Zhu

“The digital paradigm that is emerging is the dynamic organization with hybridity of knowledge, flexible processes, and unique competencies.”

“The paradox is the result of two opposing truths existing side by side, which can be both right”

“The purpose of digitalization is to make a significant difference in the overall levels of business performance and organizational maturity.”

“Digital kaleidoscope shows the ever-evolving dynamic view, hybrid digital patterns, and mixed cultures.”

“The hybrid nature of innovation is a combination of something old with something new, with a mixed portfolio of incremental innovations and radical innovations.”

“The hybrid decision-making style is practical because we live in such a hybrid, networked, and extended modern digital working environment.”

“The digital organization has a hybrid nature with flexibility, agility, and innovativeness.”

“Either at individual or business level, we should follow the “simplicity” principle to handle the over-complex digital reality with “VUCA” characteristics.”

Digital Hybridity by Pearl Zhu

“‘Inclusivity’ is like a gift box, really not about the color outside, but the context inside.”

March 30, 2019  3:42 PM

11 Valuable And Meaningful Digital Transformation Quotes #Quotes

Jaideep Khanduja
Digital transformation

Digital transformation has a different meaning and perspective for different individuals and organizations. Let us see it from the perspective of leaders in this arena through some classic digital transformation quotes.

Digital Transformation Quotes

“Education has always been a profit-enabler for individuals and the corporation. Education, both conception and delivery, must evolve quickly and radically to keep pace with digital transition. Education is a part of the digital equation.”

― Stephane Nappo

“People and organizations need to trust that their digital technologies are safe and secure; otherwise they won’t embrace the digital transformation.

Digitalization and cybersecurity must evolve hand in hand.”

― Ludmila Morozova-Buss

“Taking the multidimensional hybrid models for going digital is all about how to strike the right balance of reaping quick wins and focusing on the long-term strategic goals.”

― Pearl Zhu, Digital Hybridity

“Coherence improves business flow; resilience makes business robust and anti-fragile.”

― Pearl Zhu, Digital Hybridity

“A weak digital security can jeopardize a robust physical safety.”

― Stephane Nappo

“CISO 2.0 must lead digital transformation efforts. Act no more like a policeman. Be the dietician of the risk appetite and a business differentiator.”

― Stephane Nappo

“As truly successful business decision making relies on a balance between deliberate and instinctive thinking, so does successful digital transformation rely on interconnectedness and interdependence of the state of the art technologies.”

― Stephane Nappo

“A word ‘unprecedented’ seems too weak to convey just how much the dimensionless operational space of digital (r)evolution requires an instantaneous reaction.”

― Ludmila Morozova-Buss

“A central topic of my essays is cybersecurity.
A fundamental and delicate question at the heart of my work is: how to motivate my readers to want to learn more.”

― Ludmila Morozova-Buss

“5 Ways To Build Your Brand on Social Media:

1 Post content that adds value
2 Spread positivity
3 Create a steady stream of info
4 Make an impact
5 Be yourself”

― Germany Kent

Some fabulous Digital Transformation Quotes

“If we all work together there is no telling how we can change the world through the impact of promoting positivity online.”

― Germany Kent


March 30, 2019  11:14 AM

2 Catalysts In Enterprise Digital Transformation Journey

Jaideep Khanduja
Digital transformation

Enterprise Digital Transformation has become a buzzword these days. Attend any technology event and you will find it mentioned there, in one form or the other. Many masters, and many master concepts, are emerging in its name. Organizations are quick to claim they have adopted this journey. It matters a lot whether they are treating it as a fancy journey or have some meaningful, business-oriented goals in mind to achieve.

A lot of vendors are emerging with very attractive and lucrative concepts for your organization to adopt on its way to becoming a digitalized organization. A lot of business deals are happening in their name. Whether these deals ever see the light of day remains to be seen, because, after all, it is your money that is going into the vendor’s pocket. The capability lies not in the solution but in the results.

Enterprise Digital Transformation is a journey

Digital Transformation is also omnipresent these days in any technology publication, be it online or print. But I feel it is more about making noise than achieving something significant in enterprise digital transformation. Let us understand the top 10 killers in an enterprise journey that mislead organizations from the real path of digital transformation. The first and foremost is: DIGITAL TRANSFORMATION CANNOT HAPPEN ALONE OR IN ISOLATION.

Most organizations fail in their journey either because their understanding in this regard is quite low, so they miss the real idea behind it, or because the driver of their journey acts erratically. This is not a journey of haphazard ideas and goals. Everything is interlinked. It has to be sequential, with a single major milestone to achieve at a time, though there can be micro-segments or parallel sub-milestones within it.

Enterprise Digital Transformation Needs Super Engagement

The top two things that come to my mind for achieving significant success in enterprise digital transformation are data and people.


March 28, 2019  10:52 PM

Escher Transforming customer experience for posts @escher_group

Jaideep Khanduja
Customer engagement, Digital transformation

Digital transformation carries a different meaning for different organizations and individuals. The course of digital transformation also depends on how far an organization has already come in this foray. For instance, an organization with zero digital initiatives will have a different course of action compared to one that is already ten steps ahead of it. On the other hand, a number of startups are taking very innovative initiatives and transforming the world in a truly digital way. One such example in this segment is Escher. Escher is an organization with a clear focus on the modernization of global Posts, for which the best route is a superb customer engagement platform. This customer-centric approach definitely helps in achieving business goals on better terms with customers, without any compromise in the quality of service or product. What Escher is doing is probably an initiative of its own kind.

Escher

Source: Escher.com

The core strength of Escher lies in its innovative customer engagement mechanism, which has become a benchmark for others. Their vision is to enable posts to leverage digital transformation technologies to drive the whole ecosystem with greater speed and better economics. As a matter of fact, this is the first customer engagement application for posts. Currently, 35 postal and courier customers are using this platform globally. The application is helping them close the quality-of-service gap with their counterparts in the private segment. The private shipping companies, in fact, are equipped with better systems as they can afford to pay higher costs. On the other hand, customers everywhere expect a flawless experience despite all kinds of digital disruptions happening across the globe. Irrespective of their locations, almost all customers have similar expectations: real-time information and interactivity.

Escher Creates A State-of-the-Art Platform

Nick Manolis, chief executive officer of Escher says, “Today’s postal customers have options for doing business, and we are focused on helping postal operators and couriers meet customers’ high expectations by evolving into digitally-driven, multi-channeled organizations. The Escher platform goes beyond a counter solution in posts’ retail stores. We help posts meet customer demands on their terms.” He further adds, “At Escher, we understand postal and courier operations and the pressures and constraints they face, and we have invested over $80MN in R+D for customer-focused technology. We are dedicated to helping modernize postal and courier operations using the expertise we have gained over two decades in helping to transform over 35 postal and courier operations globally and processing more than two billion transactions annually”.

You can find more details here.



March 23, 2019  3:30 PM

NVIDIA Inception Startups Get Free MapR Enterprise License @mapr

Jaideep Khanduja
Artificial intelligence, Machine learning, NVIDIA

MapR has announced the MapR AI Edge Program. The program benefits all NVIDIA Inception startup members with a free MapR enterprise license. This is something phenomenal, and it means a lot for the startup community. What it means is that all NVIDIA AI startups can boost their AI development lifecycle with the MapR Data Platform, and that too with no additional investment. MapR, as we all know, is the visionary creator of a next-generation AI and analytics data platform. The MapR AI Edge Program, in fact, is an AI accelerator program that is completely free for this exclusive segment. It enables deployment agility along with data management, and it applies to any kind of data between and across edge, on-premises, and cloud. It also covers all Machine Learning (ML) and Artificial Intelligence (AI) products. It will probably be of great help to startups working in these fields, giving them a number of opportunities and enhancements at no cost.

MapR Enterprise License

Jack Norris, Senior Vice President, Data and applications, MapR says, “Customers will be able to provide more impactful demos of their AI product by running GPUs anywhere and being able to take advantage of all of the features and capabilities built into MapR. MapR AI Edge Program enables faster deployments and the ability to spotlight NVIDIA in mixed-use environments, eliminating barriers and expediting value creation of AI apps from development to testing and demonstration.” In fact, the MapR Enterprise License is a bonus for startups working in this arena. It will definitely help them accelerate their business, giving them new dimensions to achieve faster deployments. The MapR Enterprise License comes with no storage limitations, which gives startups a free hand to develop, test, and demonstrate products. It is a boon for the startup fraternity and a must-grab benefit for them.

MapR Enterprise License is a boon for startups

The MapR Enterprise License, free for startups, includes a number of benefits to push their business in the right direction. To know more, visit here.


March 23, 2019  11:52 AM

2019 Retail Quality Report Indicates Big Quality Mishap @applause

Jaideep Khanduja
quality reporting, Software Quality

The whole world knows the value of quality and how much it matters in terms of losses and profits, so why did the top retailers across the world fail to adhere to its standards and procedures, as per the 2019 Retail Quality Report? One thing I always fail to understand is the real cause of failures in quality measurements. Is it due to time limitations? Is there too much to achieve? It might be a lack of knowledge. It might be the absence of confidence. Rather, it might be the result of overconfidence. It could also be the result of not having the right set of people to achieve it. There could be other reasons that my readers can probably pinpoint in the comments section of this article. Let us try to examine these reasons one by one. Whatever the case, these reasons can lead to big losses, blunders, and failures.

2019 Retail Quality Report

The first reason that comes to my mind is time limitations. If there is too much work to do in too short a period, then it is foolish to start it. It is foolish, in fact, at the management’s end to undertake these kinds of projects. As a matter of fact, the whole project should be broken into two proportions of 80:20. Segregate the goals, sort out the top 20% most important ones, and then spend the whole time achieving those with full dedication to analysis, development, and testing. This top 20% of goals will get you more than 50% of the results that you were planning to achieve with the whole lot of objectives. Finding the right reason is of utmost importance when it comes to quality. Applause, a global leader in digital quality and testing through crowdsourcing, released its 2019 Retail Quality Report recently.
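As a rough illustration of that 80:20 idea, here is a minimal Python sketch (the goal names and impact scores are hypothetical, not taken from the report) that keeps only the top 20% of goals by expected impact so the team can spend its analysis, development, and testing effort there.

```python
# Hypothetical quality goals with estimated business-impact scores (illustrative only).
goals = {
    "checkout flow works on mobile": 95,
    "accurate inventory counts": 90,
    "fast page loads": 60,
    "loyalty-point edge cases": 40,
    "gift-wrap option styling": 20,
    "rarely used report export": 10,
    "legacy browser quirks": 15,
    "promo banner animation": 5,
    "internal admin shortcuts": 12,
    "seasonal theme switcher": 8,
}

# Keep the top 20% of goals by impact and dedicate the release cycle to them.
top_count = max(1, round(len(goals) * 0.20))
prioritized = sorted(goals.items(), key=lambda item: item[1], reverse=True)[:top_count]

total_impact = sum(goals.values())
covered = sum(score for _, score in prioritized)
print("Focus on:", [name for name, _ in prioritized])
print(f"Share of total impact covered: {covered / total_impact:.0%}")
```

With these illustrative numbers, the two highest-impact goals alone account for just over half of the total estimated impact, which is the kind of trade-off the 80:20 split is meant to expose.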

2019 Retail Quality Report Is a Big Learning For Global Retail Industry

It is important to go to the root cause of the reasons behind these failures. It is equally important to ensure the same mistakes are not repeated this year. Otherwise, the whole purpose of the 2019 Retail Quality Report gets defeated.


March 21, 2019  10:14 PM

Aparavi Enhances Multi-Cloud Data Management @aparavisoftware

Jaideep Khanduja
Active archives, Cloud data storage, Data Management, Multi-cloud

The Active Archive Platform from Aparavi got a major thrust with some phenomenal enhancements a few days ago. The platform is already quite popular among multi-cloud data management customers, and Aparavi is a market leader in this technology segment. The new features give a triple benefit to its customers: the three areas where customers get a major boost are operational efficiency, multi-cloud data management, and insight. Any improvement, in fact, brings a chain of benefits. For instance, better resource management leads to enhanced operational efficiency, which in turn results in better insight, and all of this together helps in greater management of archived data. Some of the new features that deserve a special mention include direct-to-cloud data transfer, next-generation data classification, full-content search, and tagging. For customers, adding any number of new cloud destinations becomes simpler and easily manageable, including bulk data migration across multiple clouds.

Aparavi

Source: Aparavi.com

Aparavi launched its Active Archive platform in May 2018. Within a short span, it has become quite popular among organizations grappling with large volumes of unstructured data. Retaining such data becomes an obligation for organizations in the wake of compliance, business reporting, analytics, and historical reference. Thus, this intelligent multi-cloud data management empowers enterprises to actively manage their data for long-term policy-based retention, re-use, and open access, all while providing a hassle-free path to multi-cloud adoption. The new features can be divided into four major segments: first, direct-to-cloud transfer for improved resource management and operational efficiency; second, next-generation data classification and tagging for better management and simplified access; third, next-generation search for greater insight and access; and fourth, enhanced multi-cloud management and bulk-data migration.
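To make the idea of classification, tagging, and policy-based retention concrete, here is a minimal, generic Python sketch. The file names, tags, and retention rules are hypothetical and are not Aparavi’s product or API; the sketch simply classifies an archived file by a tag and checks whether it is still inside its retention window.

```python
from datetime import date, timedelta
from typing import Optional

# Hypothetical retention policies, in days, keyed by classification tag.
RETENTION_DAYS = {
    "financial-record": 7 * 365,   # compliance-driven long-term retention
    "business-report": 3 * 365,
    "general": 365,
}

def classify(filename: str) -> str:
    """Very naive, name-based classification (illustrative only)."""
    name = filename.lower()
    if "invoice" in name or "ledger" in name:
        return "financial-record"
    if "report" in name:
        return "business-report"
    return "general"

def must_retain(filename: str, archived_on: date, today: Optional[date] = None) -> bool:
    """Return True while the file is still inside its retention window."""
    today = today or date.today()
    tag = classify(filename)
    return today - archived_on <= timedelta(days=RETENTION_DAYS[tag])

# Example: an invoice archived in 2015 is still under the 7-year policy in 2019.
print(must_retain("2015_q3_invoice.pdf", date(2015, 9, 30), date(2019, 3, 21)))  # True
```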

Aparavi Named Gartner Cool Vendor 2018

Brian Ricci, President, Pinnacle Computer Services says, “Aparavi has helped us greatly with our client’s initiatives to move data to the cloud for long-term retention. With these new features, we can more easily organize and manage the vast amounts of data stored, and find specific data as needed by an organization. It’s a real game-changer.”

Marc Staimer, President of Dragon Slayer Consulting says, “Long-term retention is changing. It is no longer enough to simply store data for lengthy periods of time. Compliance, data migration, search, multi-cloud, and cost control of that data are all difficult problems IT organizations are wrestling with. Aparavi’s latest version tackles these problems head-on in a cost-effective manner.”

Adrian Knapp, CEO, Aparavi says, “We hear from customers all the time about their need to retain data for lengthy periods – often forever – but that they face real challenges in managing it effectively and efficiently. Our Active Archive platform with these enhancements provides the solution. That’s why we tell them, ‘Keep your data! Just do it better.’”


March 20, 2019  8:36 PM

Formulus Black Revolutionizes Computing with ForsaOS @FormulusBlack

Jaideep Khanduja
Computing, IN MEMORY

Formulus Black, a venture-backed startup, creates a new landmark in computing by changing the whole paradigm. It is something happening for the first time in the world. While so many giants may have been struggling with this idea and looking for ways to give it real shape, Formulus Black breaks the barrier and takes a leap in the true sense by harnessing the power of in-memory compute for all applications. That is going to be a great boon for large and medium enterprises. The product, ForsaOS, breaks the barriers between memory and storage, thus giving a revolutionary direction to computing. This is phenomenal, and at the same time it is crucial for the technology people working in enterprises to understand the whole ball game, because only then can an enterprise reap the benefits of this power of in-memory compute.

Formulus Black

Source – Formulus Black

In-memory compute not only enhances compute operations but also saves a humongous amount of cost. This was, in fact, a major issue that all hardware and software giants were struggling to tackle in computing. With the launch of ForsaOS, Formulus Black not only addresses these major issues but also creates a new compute benchmark. ForsaOS is a complete Linux-based software stack designed precisely to run all applications in memory. This revolutionary technology brings a huge benefit in compute efficiency, which in turn results in significant gains in cost-effectiveness, processing speed, memory capacity, and data security with no changes to the application. Everything keeps happening as before, with a heap of benefits in cost, speed, memory, and security. No compression or encryption happens in this process. The software stack keeps data in persistent memory.
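To get a feel for why keeping data in memory matters for speed, here is a minimal Python timing sketch. It is not ForsaOS and does not model its architecture; it is only a rough, generic comparison of writing the same data into RAM versus writing it to a file and forcing it onto disk.

```python
import os
import time

payload = b"x" * 1024 * 1024  # 1 MiB of data

# In-memory "write": keep the blocks in a Python dict, resident in RAM.
start = time.perf_counter()
memory_store = {}
for i in range(100):
    memory_store[i] = payload
mem_elapsed = time.perf_counter() - start

# Disk write: persist the same blocks to a file and flush them out of the page cache.
start = time.perf_counter()
with open("blocks.bin", "wb") as f:
    for _ in range(100):
        f.write(payload)
    f.flush()
    os.fsync(f.fileno())
disk_elapsed = time.perf_counter() - start
os.remove("blocks.bin")

print(f"in-memory: {mem_elapsed:.4f}s  disk (fsync): {disk_elapsed:.4f}s")
```

On typical hardware the in-memory path is orders of magnitude faster, which is the latency gap that in-memory compute approaches set out to exploit.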

Formulus Black Launches ForsaOS

This is the first time in the world that a technology has been launched that keeps data completely safe against power loss. That is a great achievement. Wayne Rickard, Chief Strategy and Marketing Officer, Formulus Black says, “The challenge within the computing industry continues to be how to achieve the fastest speeds with the lowest latencies needed to satisfy increasing demands of the compute side of the equation while overcoming the expense and limited capacity issues from the memory side required to achieve it. We have designed ForsaOS to address these issues by amplifying the memory. Because CPU to memory is extremely fast while I/O to external storage peripherals is slow, we have developed a software solution that utilizes fast DRAM memory as storage while providing all the necessary management tools and features needed to increase effective memory capacity by up to 24x while improving processing speed as much as 450x.”

For a demo or to get further details on it, you can visit here. Additional information on the company is available on their website.


March 9, 2019  5:02 PM

SIOS DataKeeper By @SIOSTech For Enterprise-Class Protection

Jaideep Khanduja
"Amazon Web Services", AWS, Azure, Disaster management, Disaster Recovery, Google Cloud, High Availability, SIOS, Virtual Server

This is an interview with Jerry Melnick, president and CEO, SIOS Technology on SIOS DataKeeper. Jerry Melnick, president and CEO, is responsible for directing the overall corporate strategy for SIOS Technology Corp. and leading the company’s ongoing growth and expansion. He has more than 25 years of experience in the enterprise and high availability software markets. Before joining SIOS, he was CTO at Marathon Technologies where he led business and product strategy for the company’s fault tolerant solutions. His experience also includes executive positions at PPGx, Inc. and Belmont Research, where he was responsible for building a leading-edge software product and consulting business focused on supplying data warehouse and analytical tools.

SIOS Technology provides IT Resilience for critical applications like SQL Server, Oracle, and SAP in the cloud, hybrid cloud or datacenter. Using SIOS high availability clustering software, applications automatically recover from infrastructure and application failures in a matter of minutes with no loss of data, keeping data protected, and applications online. SIOS was founded in 1999 and is a subsidiary of SIOS Corporation, a publicly traded company based in Japan (TYO:3744). The company is headquartered in San Mateo. SIOS software runs business-critical applications in a flexible, scalable cloud environment, such as Amazon Web Services (AWS), Azure, and Google Cloud Platform without sacrificing performance, high availability or disaster protection.

1. What is SIOS DataKeeper all about?

SIOS DataKeeper software is an important ingredient in a cluster solution that lets users add high availability and disaster recovery protection to their Windows cluster or to create a SANless cluster for complete failover protection in environments where shared storage clusters are impossible or impractical, such as cloud, virtual servers, and high-performance storage environments.

Clusters built with SIOS software protect applications including Microsoft SQL Server, SAP, SharePoint, Lync, Dynamics, Hyper-V, and more from downtime and data loss using a SAN or SANless cluster in physical, virtual, and cloud environments and provide enterprise-class protection for all server workloads at a fraction of the cost of array-based replication.

2. How does SIOS DataKeeper ensure high availability for critical applications of an organization?

SIOS DataKeeper eliminates the need for complex and costly hardware SANs when configuring systems for high availability or disaster recovery. It uses fast, efficient, block-level replication to transfer data across both local and wide area networks with minimal bandwidth. It delivers incredibly fast replication speeds without the need for additional hardware accelerators or compression devices. This allows companies to flexibly configure the optimal recovery environment to meet their business objectives without the constraints of hardware SANs.
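For illustration only, here is a minimal Python sketch of the general idea behind block-level replication: hash fixed-size blocks and send only the blocks that changed. This is a generic toy, not SIOS DataKeeper’s actual mechanism or API.

```python
import hashlib

BLOCK_SIZE = 4096  # replicate in fixed-size blocks

def block_hashes(data: bytes) -> list[str]:
    """Hash each fixed-size block so changed blocks can be detected cheaply."""
    return [
        hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
        for i in range(0, len(data), BLOCK_SIZE)
    ]

def changed_blocks(primary: bytes, replica: bytes) -> list[int]:
    """Return indices of blocks that differ and therefore need to be sent."""
    src, dst = block_hashes(primary), block_hashes(replica)
    differing = [i for i, (a, b) in enumerate(zip(src, dst)) if a != b]
    new_blocks = list(range(len(dst), len(src)))  # blocks the replica does not have yet
    return differing + new_blocks

# Toy example: only the single modified block would cross the wire.
replica = b"A" * BLOCK_SIZE * 3
primary = b"A" * BLOCK_SIZE + b"B" * BLOCK_SIZE + b"A" * BLOCK_SIZE
print(changed_blocks(primary, replica))  # [1]
```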

3. What kind of organizations fit in for having SIOS DataKeeper?

SIOS software is used by companies large and small, for applications with just a few gigabytes of data up to many terabytes. The software works with any Windows application transparently, without modification. It is employed across a wide variety of industries. DataKeeper is used where a company’s day-to-day operations rely on the availability of the application. With over 70,000 licenses, many of the world’s largest companies use SIOS software to protect the applications their businesses depend on.
SIOS software runs business-critical applications like SAP and databases such as SQL Server, Oracle, and many others in a flexible, scalable cloud environment, such as Amazon Web Services (AWS), Azure, and Google Cloud Platform without sacrificing performance, high availability or disaster protection.

4. How SIOS DataKeeper edges over other similar products in the market for its Cost-effectiveness and Application-agnostic Design?

The extreme flexibility to create high availability clusters using existing hardware with or without a hardware SAN in both high availability and disaster recovery configurations is the key differentiator. You don’t need to purchase new hardware or learn new technology. DataKeeper works out of the box with Microsoft Windows Server Failover Clusters (WSFC) as a simple add-on and is installed and configured in minutes. Since it’s based on Windows and works at the operating system level, any application that runs on Windows can be protected by WSFC with SIOS DataKeeper. And you don’t need to change the application – it’s all transparent.

5. What advantage did ALYN Hospital get after deploying SIOS DataKeeper?

ALYN Hospital IT was seeking to use existing hardware and their Hyper-V environment, which was configured and operating in separate server rooms on-premises. They needed to achieve both high availability with no loss of data and minimal downtime, and disaster protection, while having ways to maintain uptime during upgrades. SIOS DataKeeper gave them all of these capabilities without costly new expenditures or reconfiguration.

The ability to create 3-node SANless failover clusters with a single active and two standby instances has proven to be especially valuable for ALYN’s needs. They are updating systems and software continuously, and with DataKeeper they can do that without any disruption to operations. Because the data replication supports multiple standbys and enables manual, dynamic assignment of the active and standby instances, the active instance can be moved to any server in a 3-node cluster and remain fully protected during periods of planned hardware and software maintenance.

Other SIOS DataKeeper features that are important to ALYN’s needs include the ability to work with any type of storage and WAN-optimized data replication. The SIOS cluster seamlessly supports any storage volume recognized by Windows, and this substantially simplifies their operations while enabling them to utilize all of their storage resources. Additionally, the WAN optimization will prove useful as ALYN Hospital implements its remote disaster recovery site.

ALYN Hospital is confident the SIOS SANless failover cluster will perform as desired when needed: They test the configuration regularly and routinely change the active and standby designations while redirecting the data replication as needed during planned software updates, and the applications have always continued to run uninterrupted.

6. What was their evaluation criteria?

To evaluate third-party failover clustering software, ALYN Hospital established three criteria: The solution had to work with existing hardware; it had to provide both high availability (HA) and disaster recovery (DR) protections for all of the hospital’s critical applications; and the total cost had to fit within the department’s limited budget. The IT staff quickly narrowed the third-party options to two, and after carefully evaluating both, found that only one met all of its criteria: DataKeeper from SIOS Technology. While they needed a solution that was cost-effective, they were determined not to sacrifice quality or capabilities. With SIOS they found a solution that delivers carrier-class capabilities with a remarkably low total cost of ownership.

7. What are all the platforms SIOS DataKeeper works on?

SIOS DataKeeper software adds disaster recovery protection to a Windows cluster or creates a SANless cluster for complete failover protection in Windows environments where shared storage clusters are impossible or impractical, including any combination of physical, virtual, cloud, or hybrid cloud infrastructures. SIOS software runs business-critical applications in flexible, scalable private or public cloud environments, such as Amazon Web Services (AWS), Azure, and Google Cloud Platform, as well as hybrid clouds or on-premises datacenters.

SIOS also offers SIOS Protection Suite for Linux, one of a family of Linux and Windows clustering solutions that use artificial intelligence (AI) to improve IT resilience, maintain uptime, and lower operational costs.


March 5, 2019  11:15 PM

Complete Data Accuracy Platform From @Naveego

Jaideep Khanduja
Data access, EMR, Healthcare

Munson Healthcare is the largest healthcare system in Northern Michigan. They currently have more than 540,000 unique patients across 30 counties, with nine community hospitals in this huge network. To manage such a huge patient base, having an accurate and comprehensive record of patient data had become a prime necessity for them to operate flawlessly. After all, healthcare is all about the highest quality of patient care and productivity for users. There was a need for a new EMR to enhance operational efficiency and also to ensure compliance with increasing regulations. The Complete Data Accuracy Platform from Naveego was not yet on their procurement list. It was assumed that the new EMR would be effective enough to manage their newly acquired hospitals, outpatient facilities, and practice groups. But the new EMR system was not able to help with data integration. Something was seriously missing.

Munson’s staff was spending a substantial amount of time manually documenting, executing, and validating data. In fact, data mapping was becoming a big pain. Despite the investment in the EMR, staff had to perform manual spot checks to validate the successful migration of appointments. This was becoming a big overhead, leading to escalating costs, time loss, and schedule complications. The entire initiative was thus heading towards failure. That is where their hunt for a complete data accuracy platform began. Naveego’s platform was the only system among many to qualify against all their requirements for addressing the data accuracy challenges. With the new solution from Naveego in place, the Munson staff was not only able to increase their efficiency manifold, but adherence to processes also improved significantly. Everything was in place within a few days, with automatic creation and monitoring of quality checks.
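As a rough illustration of what an automated quality check might look like, here is a minimal, generic Python sketch. The field names and rules are hypothetical and have nothing to do with Naveego’s product; the sketch simply flags duplicate patient identifiers and incomplete records in a batch of migrated data.

```python
from collections import Counter

REQUIRED_FIELDS = ("patient_id", "name", "dob")  # hypothetical schema

def quality_check(records: list[dict]) -> dict:
    """Return simple data-quality findings: duplicate IDs and incomplete records."""
    ids = [r.get("patient_id") for r in records if r.get("patient_id")]
    duplicates = [pid for pid, count in Counter(ids).items() if count > 1]
    incomplete = [
        r for r in records
        if any(not r.get(field) for field in REQUIRED_FIELDS)
    ]
    return {"duplicate_ids": duplicates, "incomplete_records": incomplete}

# Example run on a few hypothetical migrated records.
sample = [
    {"patient_id": "P001", "name": "A. Smith", "dob": "1980-02-01"},
    {"patient_id": "P001", "name": "A. Smith", "dob": "1980-02-01"},  # duplicate
    {"patient_id": "P002", "name": "", "dob": "1975-07-13"},          # missing name
]
print(quality_check(sample))
```

Running such checks automatically on every load is what replaces the manual spot checks described above.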

Complete Data Accuracy Platform from Naveego Is A Revolution

Munson Healthcare is quite happy after selecting the Naveego Complete Data Accuracy Platform. They are able to manage, detect, and eliminate data issues well in advance, and it is helping them achieve significant cost savings. Naveego achieves this by connecting multiple data sources into a single EMR system, which helps Munson Healthcare proactively achieve Global Data Health.


March 4, 2019  10:14 PM

Signalchip brings Semiconductor Chips for 4G/LTE, 5G NR @signalchip

Jaideep Khanduja
4G, 4G LTE, 5G

Signalchip has launched semiconductor chips for 4G/LTE and 5G NR modems. These are India’s first indigenous semiconductor chips, fulfilling India’s Prime Minister Narendra Modi’s dream of ‘Make In India’. In fact, this is a phenomenal achievement. This new silicon-chip innovation by Signalchip places India on the global map, and Bengaluru once again proves to be India’s key technology hub. This fabless semiconductor company has achieved something exemplary that many global companies working in various parts of the world have been striving for. Telecom Secretary Aruna Sundararajan was present at the launch. The chips belong to the SCBM34XX and SCRF34XX/45XX series and have been named ‘Agumbe’. They are the result of more than 8 years of hard work and deep research by engineers of global capability working with Signalchip. This is totally a game changer in the global telecom industry.

Signalchip

Source: Signalchip

Signalchip has unveiled four chips. SCBM3412 is a single-chip 4G/LTE modem that includes the baseband and transceiver in a single device. SCBM3404 is a single-chip 4×4 LTE baseband modem. SCRF3402 is a 2×2 transceiver for LTE. SCRF4502 is a 2×2 transceiver for 5G NR standards. The RF sections cover all LTE/5G NR bands up to 6 GHz. India has its own satellite navigation system, known as NAVIC, and these chips support positioning using it. Interestingly, Agumbe is a small remote village in Shimoga district, Thirthahalli taluk, in the Malnad region of Karnataka, India, and is also known as the Cherrapunji of the South because of the heavy rainfall there. The Agumbe series builds upon SCRF1401, India’s first RF transceiver chip for high-performance wireless standards such as 3G/4G and WiFi, designed by Signalchip in 2015.

Signalchip Brings A Revolution in 4G/5G, In fact

Aruna Sundararajan says, “This is a proud moment for India’s Digital Communications industry. I congratulate the Signalchip team for designing India’s first indigenous semiconductor chips for 4G/LTE and 5G NR modems. India aspires to take a leadership role in developing inclusive 5G technologies for economic self-sufficiency and strategic needs of the country. These chips are a significant step in this direction as they have the potential to cater to the growing digital connectivity needs of the next 5 billion users, by enabling high-performance mobile networks at lower cost.” She added, “The Signalchip team’s persistence over the 8-year R&D period and the commitment shown by Sridhar Vembu to believe in this vision is indeed commendable.”

Himamshu Khasnis, Founder and CEO of Signalchip says, “Currently in India, all devices and infrastructure, whether imported or domestically manufactured, use imported silicon chips. Silicon chip design is a very challenging activity requiring high-cost R&D, deep know-how and mastery of multiple complex domains. Hence, this technology is not available in most countries. Given that wireless communication is central to almost all economic, strategic and domestic activities happening today, the ability to indigenously design and develop silicon chips is vital for the security and prosperity of our country.”

Finally, Sridhar Vembu, Founder & CEO of Zoho and Mentor to Signalchip says, “India has always had the talent required to build any technology. We just need to be patient and have enough capital to put it all together. It’s a long-term commitment. Through smart planning and relentless efforts, Signalchip has acquired the capability required to create any complex and globally competitive silicon chip, indigenously from India. I truly appreciate the patience and diligence the Signalchip team has shown to build this chip. Only long-term R&D can make Indian companies globally competitive.”


February 28, 2019  10:23 PM

Database Activity Monitoring or DAM – Is it Important?

Jaideep Khanduja
DAM, Data Encryption, Data-security, DLP, IBM, Microsoft, MongoDB, MySQL, Oracle, SIEM, Teradata

Data is the new currency, and data security has become one of the top priorities of any organization. The data security landscape for an organization comprises seven components: discovery, classification, prevention, rights management, access governance, database activity monitoring, and loss prevention. Database Activity Monitoring (DAM) is one of the most important of these. In fact, the scope of DAM covers analyzing, monitoring, and recording access and usage to spot any kind of anomalous activity, along with a strong mechanism for raising alerts on potential attacks and compromises. DAM tools thus monitor SQL traffic to help prevent internal abuse, and vendors providing active blocking are in high demand in this spectrum. When auditors talk about compliance and regulations like SOX, HIPAA, GLBA, PCI, etc., DAM is of utmost importance. DAM tools are of two types.
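Before getting to those two types, here is a toy illustration of what “monitoring SQL traffic and raising alerts” can mean in practice. It is a minimal, generic Python sketch with hypothetical rules and a hypothetical list of captured statements, not the behavior of any specific DAM product.

```python
import re

# Hypothetical rules: patterns that often indicate abuse or risky activity.
SUSPICIOUS_PATTERNS = {
    "bulk export": re.compile(r"select\s+\*\s+from", re.IGNORECASE),
    "privilege change": re.compile(r"\bgrant\b|\brevoke\b", re.IGNORECASE),
    "mass delete": re.compile(r"delete\s+from\s+\w+\s*;?$", re.IGNORECASE),
}

def audit(statements: list[str]) -> list[tuple[str, str]]:
    """Return (rule, statement) pairs for every captured statement matching a rule."""
    alerts = []
    for stmt in statements:
        for rule, pattern in SUSPICIOUS_PATTERNS.items():
            if pattern.search(stmt):
                alerts.append((rule, stmt))
    return alerts

captured = [
    "SELECT name FROM customers WHERE id = 42",
    "SELECT * FROM payments",          # flagged: bulk export
    "GRANT ALL ON payroll TO intern",  # flagged: privilege change
]
for rule, stmt in audit(captured):
    print(f"ALERT [{rule}]: {stmt}")
```

Real DAM products do this at much greater depth, with behavioral baselines and active blocking, but the capture-match-alert loop is the core idea.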

Database Activity Monitoring

Photo credit: christophe.benoit74 on Visual Hunt / CC BY

Database Activity Monitoring tools have two kinds of architectures: network-based or agent-based. Most DAM vendors, in fact, rely on the native audit capabilities of databases. Logically, any good DAM tool supports all kinds of top-range databases, including Microsoft, Oracle, IBM, MySQL, MongoDB, Teradata, and PostgreSQL. As a matter of fact, most DAM tools come with additional features that enhance their ability to look into data security, including data discovery, classification, rights and access management, DLP (data loss prevention), and encryption. A few of these also cover security information and event management (SIEM) and log management tools. If we look at the top players in the DAM space, there are many established ones and some new-age ones. The stalwarts include Google, Microsoft, Intel/McAfee, IBM, and Oracle.

Database Activity Monitoring Is Essential

The new-age Database Activity Monitoring vendors include STEALTHbits, Zafesoft, BlueTalon, Imperva, Datiphy, Protegrity, Huawei (Hexatier), and DB Networks.


February 28, 2019  9:30 PM

Network Monitoring Changes Due To Workloads Shifting To Cloud

Jaideep Khanduja
Cloud monitoring, Network monitoring, Network performance

A recent study says almost 40% of IT workloads are currently running in the cloud, and in the next two years this share will increase, on average, to almost 60%. With this significant shift to the cloud, the network performance monitoring requirements in organizations are changing at a fast pace. In fact, most organizations using traditional network monitoring tools feel these tools are not that effective or useful, because the tools were not designed to consider parameters related to the cloud; most traditional tools were designed with an on-premises architecture in mind. That means, with more than 50% of an organization’s technology workload shifting to the cloud, network performance monitoring vendors need to work on the newly evolved use cases and business scenarios to find newer solutions catering to the latest needs of network monitoring.

Network Monitoring

Photo credit: Global Water Partnership – a water secure world on VisualHunt / CC BY-NC-SA

The newer network monitoring demands deeper visibility into the networks connecting to cloud applications. As workloads shift to the cloud, insight into the network path to the cloud is becoming as critical as monitoring on-premises networks. As the pressure from this new set of requirements increases, network performance vendors are busy innovating newer ways to achieve these goals. The interesting thing is that, so far, it was about monitoring networks an organization owned. In the case of the cloud, the requirement changes steeply to gaining insight into networks the organization does not own. But since these external, third-party-owned networks carry important traffic for the organization, insight into them becomes important, irrespective of whether it is a SaaS model, other cloud-based applications, public networks, or private networks.
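As a toy example of how an organization can get at least some insight into a network path it does not own, here is a minimal Python sketch of synthetic monitoring: periodically measuring response time to a cloud application from the user’s side of the network. The endpoint URL and alert threshold are hypothetical.

```python
import time
import urllib.request

ENDPOINT = "https://example.com/"  # hypothetical stand-in for a SaaS endpoint
THRESHOLD_SECONDS = 1.0            # hypothetical latency alert threshold

def probe(url: str) -> float:
    """Measure end-to-end response time for one request to the endpoint."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=5) as response:
        response.read()
    return time.perf_counter() - start

def monitor(samples: int = 3, interval: float = 5.0) -> None:
    """Take a few samples and print an alert whenever latency exceeds the threshold."""
    for _ in range(samples):
        latency = probe(ENDPOINT)
        status = "ALERT" if latency > THRESHOLD_SECONDS else "ok"
        print(f"{status}: {ENDPOINT} responded in {latency:.3f}s")
        time.sleep(interval)

if __name__ == "__main__":
    monitor()
```

Commercial cloud-aware monitoring tools combine this kind of active probing with flow data and provider APIs, but the principle of measuring the path you do not own from the edge you do own is the same.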

Changing Paradigms of Network Monitoring

So far, it was about deploying network monitoring tools in your own networks. But that does not hold good anymore. Now the boundaries are dissolving.


February 28, 2019  8:21 PM

AI and ML Can Offload Fifty Percent of IT Team Burden

Jaideep Khanduja
ai, Artificial intelligence, Automation, Machine learning, ml

Artificial Intelligence and Machine Learning (AI and ML) can act as a boon for an organization; it all depends on the appropriate use cases and their deployment. It is still to be ascertained whether organizations prefer specialists over generalists or vice versa. Does an organization prefer full-time employees responsible solely for IT infrastructure, or is there a shift from specialists toward generalists? Is headcount growth directly proportional to the increase in workload, especially in IT organizations? I think it is not. Despite increasing workloads, the ratio of new specialists is significantly low. That means specialists within organizations are being asked to become generalists, because a generalist can easily take care of two or more different kinds of jobs, while a specialist stays stuck with one specific kind of job. That is where the significant role of AI and ML comes in.

AI and ML

Photo credit: thelearningcurvedotca on Visualhunt.com / CC BY-NC-SA

As a matter of fact, tagging people as specialists has its own merits and demerits; more demerits, I think. In most organizations, IT professionals have to do more with less. Deliverables are increasing, timelines are shrinking, and the expectations of top management are on the rise. In that case, the only savior of an organization is to take the help of AI and ML. It is necessary, thus, to bring in products that are either capable of capitalizing on artificial intelligence and machine learning or can be made capable with a little tweaking here and there. Once this starts happening, the IT staff will be able to predict problems well in advance, in some cases well before the problem occurs. These products, in fact, provide a simple way to manage infrastructure better. Basically, automation is the only possibility.
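As a very small illustration of “predicting problems well in advance”, here is a minimal Python sketch of trend projection on an infrastructure metric. The usage numbers and capacity are hypothetical; real AI/ML-driven tools use far richer models, but the idea of turning recent measurements into an early warning is the same.

```python
# Minimal trend-projection sketch: estimate when disk usage will hit capacity
# from recent daily samples (values are hypothetical, illustration only).
daily_usage_gb = [410, 418, 425, 431, 440, 447, 455]  # last 7 days
capacity_gb = 500

# Average daily growth over the observed window.
growth_per_day = (daily_usage_gb[-1] - daily_usage_gb[0]) / (len(daily_usage_gb) - 1)

if growth_per_day > 0:
    days_left = (capacity_gb - daily_usage_gb[-1]) / growth_per_day
    print(f"Growing ~{growth_per_day:.1f} GB/day; capacity reached in ~{days_left:.0f} days")
else:
    print("Usage is flat or shrinking; no capacity alert")
```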

AI and ML need to be capitalized fast

In order to avoid the complexity of finding highly skilled specialists and to save a substantial amount of recurring cost to the organization, AI and ML is the solution.


February 27, 2019  12:05 AM

Global LiFi Congress 2nd Edition at Salons Hoche in Paris @lifi_congress

Jaideep Khanduja
lifi, LiFi Technology

Are you joining me this June to attend the 2nd edition of the Global LiFi Congress in Paris? It is happening on June 12 and 13 at Salons Hoche (near the Arc de Triomphe). If you decide to join and need a special discount, do let me know. All the players in the scientific world, along with business people in relevant fields and the global press and media, are quite enthusiastic. The first edition of the World Congress was a big hit, with media coverage from across the globe, and within a short span it has already established itself as a unique professional platform of repute. Before going further into the details of the event, let us understand what this technology is all about. LiFi technology transmits data and location information using light from LED lamps. The two days are going to be about collaboration and learning.

Basically, it is a technology parallel to WiFi, which uses radio waves for the same purpose. The first Global LiFi Congress was very insightful. Expert speakers enlightened attendees on various topics, such as the efficiency of LiFi in modern-day infrastructure setups like smart cities, LiFi in reference to 5G communication networks, and so on. As a matter of fact, the first edition of the World Congress witnessed all the major players in LiFi, like Renault, EDF, Signify, RATP, OLEDCOMM, etc. This year the participation is definitely going to be much higher than last year, looking at the zeal and response from across the globe. The congress aims to align technology and business experts so that they can collaborate to explore all possible offerings of LiFi technology, and that is ultimately bound to open up a large number of opportunities in those two days.

Global LiFi Congress Foresees Huge Response

There will be many key focus areas in the second edition of the Global LiFi Congress in Paris this year, including logistics, transportation, cybersecurity, aeronautics, R&D, robotics, AI, IoT, greenfield projects, and so on. Scientists and scholars will be talking about the latest developments and emerging new standards in this field. In fact, the significance of LiFi is not limited to businesses; it can do wonders for communities, as it is a breakthrough technology with ample scope in everyday life. As a matter of fact, LiFi can amplify the benefits of many existing technologies and applications. Hope to see you all there on June 12 and 13.


February 25, 2019  10:10 PM

Dynamic Creation Tools In Zoho Office Suite @Zoho

Jaideep Khanduja

Today’s workforce demands dynamic creation tools. Static tools with orthodox features are becoming obsolete. The first three posts on the Zoho Office Suite can be accessed by clicking on these respective links – Post 1, Post 2, and Post 3. In this final post, I will touch upon some of the key features of the Zoho Office Suite that make it a set of dynamic creation tools.

Let us start with Zoho Writer. Zoho Writer has built-in automation features: document merging, form-based document creation, fillable documents, and one-click signature collection. The user can also work completely offline on the web, iPad, or mobile versions of Zoho Writer; the document automatically syncs to the user’s account once the connection is restored. There is a distraction-free mode in Zoho Writer, in which all pings and pop-ups are disabled. Similarly, there is a focus mode.

In focus mode, Zoho Writer highlights the paragraph the user is working on while dimming all other text in the document. To bring back old memories of working on a typewriter, the user can enable typewriter sounds. These are just some of the dynamic creation tools; there are many more in the Zoho Office Suite. For instance, Zoho Show gives you a user-focused interface that helps authors populate slides faster. It also offers a variety of themes and options to include tables, path animations, charts, and smart elements.

As a matter of fact, Zoho Sheet is the world’s first spreadsheet application to offer data cleaning, which takes care of inconsistencies and duplicate data. Zoho Show can talk to Apple TV and Android TV seamlessly; the user’s mobile becomes a controller in that case, and with its help the user can beam slides onto multiple TVs.

Dynamic Creation Tools Are The Core Strength of Zoho Office Suite

Deluge is another classic example of the dynamic creation tools that the Zoho Office Suite presents: it lets users create custom, personalized functions with the help of this proprietary scripting language. Zoho Office Suite pricing suits individuals, SMBs, startups, and mid-sized or large enterprises. For a single user, it is free. SMEs can avail it at INR 99 per user per month, and large enterprises can get it at INR 399 per user per month.

Dynamic Creation Tools

Source Zoho.in

David Smith, founder, and principal of Inflow Analysis says, “The future of work will be characterized by secure, contextual, and intelligent digital workplace platforms that are fully integrated across collaboration, productivity, and business applications to support seamless workflows. The approach Zoho is taking shows deep understanding of this convergence and the critical need for a fully integrated platform that supports how people actually work. We believe this a challenge to major technology providers that need to address serious gaps in their portfolios and add adjacencies.”


February 25, 2019  9:33 PM

Zoho Writer, Zoho Sheet, and Zoho Notebook With Zia

Jaideep Khanduja

This is the third post in this series on the Zoho Office Suite. In the first post, we discussed the suite; I talked about its components in the second post. In this post, we will elaborate on those components further, and I will also talk about how they integrate to make the suite a class apart. Overall, the suite helps in creating a better document, spreadsheet, or note intelligently with the help of the AI-enriched Zia. The first component is Zoho Writer, Zoho’s document creation application. Zia makes a document creator’s life easier by helping in many creative ways: it not only detects context-based grammar mistakes, but also rates the overall readability of the document and suggests style corrections to the user in real time. This helps the user improve overall writing quality and end up with a better document.

Zoho Writer

Source: Zoho.in

While Zoho Writer is for creating documents, Zoho Sheet is for spreadsheets, and Zia is here too, assisting to an even larger extent. In fact, Zia gets you deeper insights into data sets: it automatically shows you the most relevant charts and pivot tables drawn from the data available. Zoho Sheet also supports Natural Language Querying (NLQ). The user can ask Zia questions related to their data, and Zia responds intelligently with a relevant function, pivot table, or chart that can be added to the spreadsheet. Next comes Zoho Notebook, the newest entrant to the Zoho Office Suite and one of the best and most advanced note-taking tools on the market globally. Zia Voice, an intelligent conversational AI, assists users in creating customized “smart cards” by pulling visuals, shopping lists, instructions, etc. from their favorite websites, all through voice commands.
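To make the NLQ-to-pivot idea concrete, here is a minimal, generic Python/pandas sketch. The column names and the keyword matcher are hypothetical and this is not Zia’s actual engine; it only shows how a simple question can be mapped to a pivot table over a small data set.

```python
import pandas as pd

# Toy sales data with hypothetical columns.
df = pd.DataFrame({
    "region":  ["North", "North", "South", "South", "West"],
    "product": ["pens", "paper", "pens", "paper", "pens"],
    "sales":   [120, 80, 95, 60, 150],
})

def answer(question: str) -> pd.DataFrame:
    """Very naive 'natural language' handling: pick the grouping column
    mentioned in the question and pivot total sales by it."""
    group_by = "region" if "region" in question.lower() else "product"
    return df.pivot_table(values="sales", index=group_by, aggfunc="sum")

print(answer("What are total sales by region?"))
print(answer("Show sales per product"))
```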

Zoho Writer along with Zoho Sheet and Zoho Notebook makes an ideal suite

Zoho Writer, Zoho Sheet, and Zoho Notebook, empowered by Zoho’s Zia, make it the most compelling office suite. We shall conclude this series on the Zoho Office Suite in the next post. Continued »


February 25, 2019  7:49 PM

How Zoho Office Suite Components Fit Well Into Enterprise Needs?

Jaideep Khanduja

In the previous post, we talked about the Zoho Office Suite. In this post, we will be talking about the Zoho Office Suite components and their integration capabilities.

The good part is that all the Zoho Office Suite components are well integrated among themselves. Not only that, these applications are also integrated with Zoho’s communication tools, like Zoho Mail and Cliq, a cross-platform messaging app. So, calling them Zoho’s collaboration tools would not be a misnomer. These collaboration tools include Zoho Projects and Zoho Connect, which is basically a private social network for an enterprise, along with a number of Zoho’s other business applications. With the help of these contextual integrations, the user is empowered to merge data from Zoho CRM into a sheet or document and then send it for signature through Zoho Sign. That creates a flawless workflow mechanism in its own ecosystem. Imagine the amount of work and effort it saves by doing this in an automated, integrated manner.

Zoho Office Suite Components Are Well Integrated

In this context, Sridhar Vembu, CEO, Zoho Corp. says, “We built Zoho Office Suite to be the most integrated suite of productivity tools of its kind. For decades, Zoho has provided tools for users to share and work on documents quickly and efficiently. Now, with this new version of Zoho Office Suite—empowered by Zia—Zoho’s integrations are tighter than ever before, providing seamless collaboration across departments and teams. We’ve added features and tools that can’t be found anywhere else, such as Notebook’s smart cards, Sheet’s data-cleansing tool, and Show’s integration with Apple TV. Just like the line between productivity and collaboration applications is fading, we see the line between business, collaboration, productivity, and communication apps fading. It is the combination of these apps, contextually integrated, that makes the modern worker exponentially more productive!” That is a wonderful perspective about Zoho Office Suite Components.

Zoho Office Suite Components

Source Zoho.in

The Zoho Office Suite components help create a better document, spreadsheet, or note with the help of Zia. We will talk about the various components in the next post. Continued »


February 25, 2019  6:56 PM

Zoho Office Suite To Cater To Your All Future Needs? @Zoho

Jaideep Khanduja

Is this the ultimate office suite, capable of catering to all your future needs? Well, you need to experience it and assess it for yourself. But one thing is for sure: it is far beyond what you and your enterprise are living with, whether Microsoft or Google. This one has an edge over both these giants in a true sense and in many ways. For that, obviously, you have to taste the pudding. Experiencing the Zoho Office Suite is not cumbersome; rather, you will be happy to use it. It is, of course, the best enterprise office suite, and a next-generation office suite, in fact. Besides handling routine office suite tasks, it empowers businesses with dynamic AI features. You also need to explore its first-to-market enhancements, which are bound to make life a little more comfortable for your marketing and sales staff.

Zoho Office Suite

Source: Zoho

The Zoho Office Suite comes along with Zia, Zoho’s AI-powered assistant. It is also capable of integrating with Apple TV and Android. The story doesn’t end here; there is a lot more in store for an enterprise in this classy office suite, including proprietary data cleansing and smart note card functionality. The four key components of the Zoho Office Suite are Zoho Writer, Zoho Sheet, Zoho Show, and Zoho Notebook, and all have a common catalyst: Zia, Zoho’s AI-powered assistant, which has matured a lot by now. The suite thus enables deep contextual collaboration to help users and enterprises meet their diverse challenges and end-to-end business requirements. And the Zoho suite is capable of delivering the same set of results irrespective of organization size and the number of users.

Zoho Office Suite will surpass all others by 2022?

Zoho Office Suite is equally good for a startup or small business as well as a large enterprise.

We shall continue the same in the next post. Continued »


February 19, 2019  11:10 PM

AI For Accessibility A Superb Global Initiative @MicrosoftIndia – III

Jaideep Khanduja
Artificial intelligence, Microsoft

This is the final post of the AI for Accessibility series. The first post can be accessed by clicking here, and the second post is here.

A. S. Narayanan, President, National Association of Deaf (NAD) says, “This Summit is a reflection of changing times. It is so heartening that the discourse on digital accessibility has expanded to include the entire spectrum of disabilities. Just yesterday the Ministry of Information and Broadcasting has requested all private television channels to include sign language in all programs and that is such a welcome step in the right direction.”

Arman Ali, Executive Director, National Centre for Promotion of Employment of Disabled People (NCPEDP) says, “Just like disability is not homogenous, the accessibility solutions also cannot be homogenous. We must acknowledge the various categories of disabilities and their requirements and expectations from technologies. Platforms such as Microsoft’s Empowering for Inclusion summit spark dialogues around the need for inclusive technology and how solutions need to evolve and be inclusive of people with cross-disability. There is an urgent need of different stakeholders such as persons with disabilities, government, corporates, NGOs, to come together and work on these solutions.”

Microsoft’s AI for Accessibility is a five-year, $25 million program. The goal is to enhance human capabilities with the help of AI, and it aims to benefit the more than 1 billion people around the globe who live with a disability. Advances in AI enable technology to see, hear, and reason with increasing intelligence. Some of the best use cases, such as real-time speech-to-text transcription, predictive text functionality, and computer vision, showcase how AI is playing a vital role in helping people with disabilities. Microsoft’s approach to accessibility can be found on the Microsoft Accessibility website and in Microsoft India’s video on Empowering for Inclusion.

With this, we conclude the three-post series on AI for Accessibility. Any thoughts are welcome in the comments section.


February 19, 2019  11:00 PM

AI For Accessibility A Superb Global Initiative @MicrosoftIndia – II

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Artificial intelligence, Microsoft

We continue from the previous post on AI for Accessibility. Initiatives like Deque Systems’ accessibility testing tool, Adobe Acrobat for accessible documents, a video relay service for the hearing impaired by Dr. Philip Ray Harper, and Microsoft Office 365 for accessibility are some examples of AI for accessibility. Shakuntala Doley Gamlin, Secretary, Department of Empowerment of Persons with Disabilities, Ministry of Social Justice and Empowerment, said while inaugurating the event, “Including people of all abilities in the development process adds to a nation’s social and economic progress. Our vision is to ensure that we empower them with equal access and opportunity, and strong public-private partnerships will go a long way in ensuring this. The Microsoft Accessibility Summit provides an ideal platform to bring together policymakers and influencers to understand the policy environment and chart a direction for making life, experiences, and opportunities accessible to all.”

Dr. Sriram Rajamani, distinguished scientist and Managing Director, Microsoft Research India says, “At Microsoft, we believe there are no limits to what people can achieve when technology reflects the diversity of everyone who uses it. Cloud and AI solutions are opening up a world of possibilities, empowering people with disabilities with tools that support independence and productivity. The Summit is a significant step forward in advancing our efforts towards sensitizing stakeholders and partners on the business and social value of accessibility. As we continue to learn and grow, we hope to inspire other entities and organizations to build and accelerate their accessibility and inclusion programs.”

Inclusivity is not an initiative limited to a handful of persons. It requires the involvement and engagement of each and every human being; only then can the actual targets be achieved. An interpreter, for example, is a bridge between a person with a disability and a person without one. But in reality, isn’t the interpreter serving both ends?

We conclude AI for Accessibility series in the next post. Continued »


February 19, 2019  10:51 PM

AI For Accessibility A Superb Global Initiative @MicrosoftIndia – I

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
ai, Artificial intelligence, Microsoft

AI is the best possible way to empower people with impairments. That was the core theme of the two-day Empowering for Inclusion Summit 2019 hosted by Microsoft. The sole purpose of the summit was to encourage greater collaboration in order to attain optimal empowerment for people with disabilities, and this can only be achieved using AI (Artificial Intelligence). This was the second edition of the initiative, and the beauty of it is that the platform has evolved to include multiple stakeholders: non-profits, developers, enterprises, academia, scholars, and experts, all with the common goal of creating inclusive technology solutions. It is AI for accessibility technology that plays a substantial role in enabling people to access, collaborate, and deliver. The credit for creating this platform goes to Microsoft India, which is taking the lead in enhancing it to every possible extent.

The second edition of Microsoft’s Accessibility Summit, Empowering for Inclusion, was held on 15th and 16th February in New Delhi in collaboration with the National Association of the Deaf (NAD) and the National Centre for Promotion of Employment of Disabled People (NCPEDP). The core theme this year was AI for accessibility. The summit plays a pivotal role in bringing together multiple stakeholders to initiate work around inclusive technology. Enhancing accessibility standards and giving a new dimension to policies is the primary aim, in order to create a better, more accessible India. Everyone in the ecosystem has to play an important role in achieving these goals, and the initiative can come from any of the stakeholders in the system: people with disabilities, policy-makers, service providers, people engaged in support systems, CSR teams, and developers of assistive technologies.

AI for Accessibility is a great mission

The journey of AI for accessibility is, in fact, a never-ending one. Continued »


February 15, 2019  11:40 PM

DH2i’s New DxOdyssey to Ensure Security for Remote User Access

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Data-security, DH2i

DH2i’s New DxOdyssey, Software-Defined Perimeter (SDP) Software, Promises to Ensure Security for Remote User Access to Cloud Services

An Alternative to VPNs, Which Present Management Headaches and Security Vulnerabilities

From the IT department to the C-suite, data security has become a key priority, driven by business and competitive requirements as well as regulatory compliance. Until recently, VPNs were considered one of the most secure methods for transferring data. However, it has become abundantly clear that in most cases VPNs can meet neither the security requirements of today’s business environment nor its compliance mandates. And for many IT departments, VPNs have been nothing but an expensive, time-consuming management headache. Today, I speak with Don Boxley, CEO and Co-Founder of DH2i (www.dh2i.com), about this increasingly critical topic.

Q: The undeniable benefits of the cloud have acted as a catalyst for datacenters to expand beyond their physical walls.  However, this expansion also introduces potential security issues.  Datacenters have typically turned to VPNs – could you discuss the plusses and minuses?

A: Yes, VPNs are the security technology datacenter managers have historically turned to, because VPNs let them give users secure connections to cloud-based services. On the plus side, it’s a legacy perimeter security technology they’re very familiar with. On the minus side, VPNs are obsolete for the new IT reality of hybrid and multi-cloud; they weren’t designed for it, and they create too large an attack surface. The issues with using traditional approaches such as VPNs to secure hybrid cloud environments include:

  • Users/devices get a “slice of the network,” creating a lateral network attack surface
  • Complex configuration requiring dedicated routers, ACLs and FW policies, increasing risk and expense
  • Inbound connections create attack surfaces (e.g. DDoS)

DxOdyssey

    Q: How are these problems exacerbated for organizations that wish to grant strategic partners access to infrastructure and information?

    A: Providing such access represents a critical security risk that can introduce a multitude of security threats to your enterprise. Besides the threat of potentially introducing malware into your systems, there are other possible technical and business dangers of which to be aware. First, granting system access to third parties instantly lowers your security level. If a vendor that you invite in has feeble security controls, they now will become the weakest link in your security chain. If an outside attacker compromises that vendor’s system, this malevolent force can use that as a backdoor into your network. In parallel, as that third party’s risk increases, so does yours. Some of the largest and most publicized retail data breaches in history have been linked back to third-party vendors.

    Q: What types of solutions/approaches overcome the limitations just discussed?

A: One approach to securing remote user/partner access to cloud services is to deploy a software-defined perimeter (SDP). An SDP starts with the question: does every remote user/partner really need full access to my network to transact business? An SDP enables organizations to give remote users/partners access to the specific computing services they need without giving them a “slice of the network.” Put another way, if you want to virtually eliminate network attack surfaces, get users off your network by using software-defined perimeters. This would be an essential component of moving the organization’s network to a Zero Trust (ZT) architecture. The analyst firm Forrester defines a Zero Trust architecture as one that abolishes the idea of a trusted network inside a defined corporate perimeter. In the case of remote user/partner access to cloud services, ZT would involve creating micro-perimeters of control around computing assets to gain visibility into how remote users/partners use services and data across the cloud network to win, serve, and retain customers.
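To make the contrast with a VPN’s “slice of the network” a little more concrete, here is a minimal, purely conceptual sketch in Python of per-service authorization. The policy data, identities, and service names are hypothetical, and this is not how DxOdyssey itself is implemented.

```python
# Minimal illustration of the SDP idea: authorize a (user, service) pair,
# never a network segment. Policy data below is invented for the example.
MICRO_PERIMETERS = {
    # service name -> set of identities allowed to reach it
    "payroll-db": {"alice@corp.example"},
    "build-server": {"bob@partner.example", "alice@corp.example"},
}

def authorize(identity: str, service: str) -> bool:
    """Grant access only to the named service, not to the whole network."""
    return identity in MICRO_PERIMETERS.get(service, set())

if __name__ == "__main__":
    print(authorize("bob@partner.example", "payroll-db"))    # False: no network-wide trust
    print(authorize("bob@partner.example", "build-server"))  # True: scoped to one service
```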

    Q: You recently introduced a new software called DxOdyssey. Could you tell me more about it?

A: Sure. This fall, DH2i introduced a new network security software product, DxOdyssey, which is specifically designed to enable organizations to dynamically deploy highly available micro-perimeters that isolate services for fine-grained user access without using a VPN. DxOdyssey was purpose-built to give medium and large organizations the perimeter security model needed for cloud-centric network connectivity with virtually no attack surface.

Q: I believe this is something that anyone concerned with the data security of their organization should check out. Where can one go to learn more?

    A: Please visit http://dh2i.com/dxodyssey/ for more information and/or to schedule a live demo.


    January 31, 2019  11:29 PM

    Zoho Creator Brings The Real Low Code Platform @Zoho

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Application development, Software development

When you hear or read “low-code,” what comes to your mind first? Is it that you have to bend low and write code? Jokes apart, the technology has reached a level where you can develop a low- to mid-complexity application with the Zoho Creator low-code platform. And for this, you don’t need to be a hardcore developer; you don’t even need to be an expert in any development language. You need just two things to become an expert in Creator: one, some ground-level knowledge of its commands; two, training from an expert, maybe from Zoho or from someone who is already an expert in this particular technology. Another interesting part is that you can integrate the new piece you build with an existing application.

    Low Code Platform

    Source: Zoho.com

So, for instance, say you have SAP running as the core business application in your organization, and a new requirement comes from a user group asking for new functionality. There are two ways of doing it. One, call SAP experts, pay them a hefty per-day cost, and get it done. The other is to develop the piece on the Zoho Creator low-code platform, pulling the essential data from the existing SAP database and then pushing the result back into SAP so it can flow through the various business processes. That calls for separate development in Creator and a backward and forward integration with SAP, possibly with no new tables created at all, or perhaps with a couple of interim tables to stage data before pushing it into the SAP tables. That means a lot of relief in terms of money, time, and manpower.
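In Zoho Creator itself this logic would be written in Deluge, but as a rough, language-neutral sketch of the round trip described above, here is what it could look like in Python. The SAP gateway URL, endpoints, and field names are assumptions invented for illustration, not a real integration.

```python
# Hypothetical sketch: read source data from SAP, compute the new functionality
# outside SAP, and push the result back. Endpoints and fields are illustrative only.
import requests

SAP_API = "https://sap.example.com/api"   # assumed REST/OData gateway (hypothetical)
CREATOR_RESULTS = []                       # stand-in for an interim Creator table

def fetch_open_orders():
    resp = requests.get(f"{SAP_API}/open_orders", timeout=30)
    resp.raise_for_status()
    return resp.json()["orders"]

def compute_priority(order):
    # the "new functionality" lives here, outside SAP
    return "high" if order["value"] > 100_000 else "normal"

def push_back(order_id, priority):
    resp = requests.post(f"{SAP_API}/order_priority",
                         json={"order_id": order_id, "priority": priority},
                         timeout=30)
    resp.raise_for_status()

if __name__ == "__main__":
    for order in fetch_open_orders():
        priority = compute_priority(order)
        CREATOR_RESULTS.append((order["id"], priority))  # optional interim storage
        push_back(order["id"], priority)                 # result flows back into SAP
```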

    Low Code Platform Is A Reality Now

As a matter of fact, Zoho Creator can be learned by non-IT people as well. The person interested in learning it just needs good business knowledge and a basic interest in picking up a few fundamentals of Creator. After that, it is merely a matter of practice and application.


    January 31, 2019  9:41 PM

    ASUS Launches World’s Smallest Notebooks @ASUSIndia

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    ASUS, notebooks

Now, this is what I call real, customer-focused innovation. A lot of organizations keep making claims in the name of innovation that, in actuality, are of no use. ASUS launched the world’s smallest notebooks in the 13″, 14″, and 15″ segments, named ZenBook 13, ZenBook 14, and ZenBook 15. These ultrathin machines have a unique four-sided “NanoEdge” display. The new NumberPad is not in its usual place: it is uniquely and innovatively placed on the touchpad, so the NumPad and touchpad now share the same location and the laptop intelligently recognizes whether a touch is intended for the NumPad or the touchpad. Effectively, the touchpad has become a multilayer surface. That leaves room for better, larger keys and a significantly bigger keyboard, and it also enhances productivity by letting the user work with better pace and concentration.

    smallest notebooks

Login on these ZenBooks (the world’s smallest notebooks) is through a powerful 3D IR camera that recognizes the user’s face and logs them in even in low-light environments. An ErgoLift hinge raises the rear of the keyboard when you open the laptop, which helps with comfortable typing and, from a design standpoint, also improves cooling and audio performance. The machines are powered by 8th Gen Intel Core CPUs along with GeForce graphics cards, and they support gigabit Wi-Fi. These are just a few of the features; the real revolution is in the design, which is definitely mindful and user-centric. As a matter of fact, the ZenBook 13 is smaller than an A4-size sheet, which is phenomenal. The numeric keypad is LED-illuminated and gives a different kind of feel to the user.

    Smallest Notebooks are the latest Zenbooks from Asus

The lightweight ZenBooks, the world’s smallest notebooks, are in reality powerhouses with unmatched quality and design.


    January 31, 2019  9:12 PM

The Dark (Other) Side Of The CIO / CTO In Enterprise IT

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    CIO, CTO, Enterprise IT

The Chief Information Officer is the chief custodian of an organization’s information. Mostly his role is to take care of digital information, but when his role combines with that of a CISO (Chief Information Security Officer), physical information also comes into his purview. Similarly, a CISO as a separate role has to ensure the right measures are in place for the safety and security of any kind of organizational information. Scrutiny of information and of information flow is straightforward: he can create appropriate processes and ensure strict adherence to them. But when it comes to the safety of information in connection with employees or external stakeholders, is he responsible for scrutinizing those people too? I think yes. So, in that case, a background check of a new recruit also becomes essential.

    Enterprise IT

    Photo credit: Symic on Visualhunt / CC BY-SA

Recently, an online technology magazine carried news about the CIO of a large retail organization and the launch of their new mobile initiative. It was a very basic mobile app, yet the way it was publicized did not match the initiative: a very small step was projected as something extraordinary. That is synthesized, induced news. I commented below the article that this small thing should have been done two decades ago in such an old and large organization with such a large IT setup. It is a tragedy when technology heads make a mockery of technology. Organizations have no criteria to measure the intellectual and monetary loss of not automating a business-critical process. When something that could have been done years ago stays uninitiated, or remains in process for years, it denotes lethargy.

Another example is a CIO sacked from an organization for financial fraud. He was caught taking money from vendors on a few of the big deals happening in the organization. He was not formally sacked; he was told to put down his papers and move out immediately. Today he is the CIO of another large organization, probably playing similar games. The sad thing is that organizations recruiting C-suite people sometimes never learn about these darker sides of their personalities.


    January 31, 2019  6:58 PM

    Five Pillars Of A True Hyperconvergence Software – II @MaxtaInc

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    hyperconvergence, Maxta

This is the concluding post of our discussion with Barry Phillips, CMO, Maxta Inc. The first post is Software Model Versus Appliance Model – Which Business Model? In the second post, he tells how an inflexible architecture will create much bigger issues for IT. In the third post, that is, the previous post, he elaborates on the first two essential passing criteria for hyperconvergence software. Let us conclude the series with the remaining three important parameters.

3. How easy or difficult is it for an organization’s IT to add capacity within the server?
Barry asks, “Can you add capacity within the server? The only way to add capacity with an appliance vendor is by adding another appliance. Even though some vendors offer a storage-only node, the step-up cost of another “pizza box” isn’t trivial. True hyperconvergence software enables you to add capacity to an existing server by adding drives to open slots, swapping in higher capacity drives, or by adding servers. If you can only add capacity by adding nodes, you have a fake software model.”

    True Hyperconvergence Software

4. Licensing:
Barry says, “Are you being forced into the same appliance software licensing model or do you have a choice? Hyperconverged appliances tie the software license to the appliance, so when you refresh your hardware you get the privilege of repurchasing the software. This is a “term license,” which means you get to buy the software over and over again, and it’s the only option you have in a fake software model. While many software companies are starting to offer term licenses to provide subscription-like pricing, nearly all software companies still offer a perpetual license that you own forever. You should have a choice of perpetual or term licensing. Do you like the thought of owning the software for life, but don’t want to pay for it all upfront? Just lease the software from any number of leasing companies. It gives you the best of both worlds.”

5. Memory and CPU Resources:
Barry concludes – “Can you add more memory and CPU resources? Just like adding storage capacity, you should be able to add additional memory or compute whether inside an existing server or by adding a compute-only server. A true hyperconvergence software model scales storage independent of compute. A fake hyperconvergence software model operates the same way as the appliance model.”


    January 31, 2019  6:34 PM

    Five Pillars Of A True Hyperconvergence Software – I @MaxtaInc

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    hyperconvergence, Maxta

This post is the third in the series, in continuation of the previous two posts in which Barry Phillips of Maxta Inc. talks about the five essential components of hyperconvergence software. You can read the first post by clicking here and the second post by clicking here. Any hyperconvergence software that doesn’t fulfill the following criteria is not true hyperconvergence software. Who can tell it better than Maxta Inc.? The five important criteria are:

1. Does existing server hardware support new software?
Whenever there is new software to be put in production, do you need to buy new hardware? Every time? As Barry Phillips, CMO, Maxta Inc. says, “Can the software be installed on your existing server hardware? This is the first sniff test of whether it is a true software model or a fake software model. Of course, you need to make sure the hardware has the right specifications to run the software, but you shouldn’t need to buy new server hardware. And don’t get fooled by the old trick of being able to run “trial” software on your own hardware, but you have to buy new hardware to put the software in production. True infrastructure software vendors like Microsoft, Citrix and VMware do not make you buy new hardware to run their software.”

2. Is hyperconvergence implementation dependent on a certain set of server SKUs?
Barry questions, “Does your server hardware have to be from an approved list of server SKUs?” He then elaborates, saying, “If you do want to refresh your hardware when you implement hyperconvergence, does the hyperconvergence software vendor limit you to a certain set of server SKUs? If so, that isn’t really software; it’s just an appliance vendor separating out the appliance software from the confined set of appliance hardware.”

    True Hyperconvergence

The basic question is that there are a lot of vendors in the market offering different kinds of hyperconvergence solutions. Do they really provide a true hyperconvergence environment? Do they fulfill the above two criteria? Let us look at the other three criteria in the next post.


    January 31, 2019  6:11 PM

    Inflexible Architecture Will Create Much Bigger Issues For IT

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Business model, hyperconvergence, Maxta

In continuation of my previous article on the software model taking drastic precedence over the appliance-based business model, let us try to encapsulate the five essential properties of true hyperconvergence software. Are most IT organizations still creating an inflexible architecture? Barry Phillips, CMO, Maxta Inc. says, “Once that appliance-based product has taken off, the company will want to change to a software business model from a profitability perspective. This can be a difficult pivot to make financially since revenue decreases before profitability improves, and it changes how the sales teams are paid. If the pivot is made successfully, then the company is much more profitable and financially stable.”

    Barry adds further, “Even if a pivot to software works out for the vendor, it does not always work out well for the customer – especially if the software model is an appliance “in software clothing.” If you’re considering hyperconvergence software, make sure it’s not an appliance in disguise. Many vendors will claim to offer hyperconvergence software, but still significantly restrict how their solution can be deployed and used. Ask vendors these questions to determine how much (or how little) flexibility you’ll get with their software.” “As the hyperconvergence market shifts from appliance offerings to software, vendors that started out selling hardware platforms will need to shake both the appliance business model and the appliance mentality. As you evaluate hyperconvergence, always understand what limitations and costs will be in four or five years when you need to refresh or upgrade”, he continues.

    Talking further, Barry adds, “Infrastructure platforms are evolving quickly, so the ability to scale, choose and change hardware platforms, and use different hypervisors will certainly make life easier. Getting locked into an inflexible architecture will create much bigger issues for IT down the road. By asking the right questions upfront, you’ll be able to navigate the changing landscape.”

We will continue with Barry’s ideation on hyperconvergence software in the next post. Continued »


    January 31, 2019  5:47 PM

    Software Model Versus Appliance Model – Which Business Model?

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Business model, hyperconvergence, Maxta, software model processes

Who can understand hyperconvergence software better than Maxta Inc.? We already covered an article on this last month; you can read it by clicking here. Let us go a little deeper and try to understand the five basic requirements for hyperconvergence software. Before that, let us get into some basics. Are financial analysts clear about the concept of an organization switching from an appliance model to a software model? That switch is, in fact, what makes stock prices soar, and the fundamental reason is the software model itself. When you weigh the pros and cons of the two business models, the software model obviously takes a large leap over the appliance model. But if software is the brighter business model, then why do companies still keep sticking to shipping appliances? One really needs to understand the whole gamut behind it.

    Business Model

    Photo credit: vistavision on VisualHunt.com / CC BY-NC-ND

Obviously, selling appliances is much easier through any channel, directly or through a distributor and reseller network. On the same note, when it comes to software, it is equally easy to build an application for a specific hardware platform or a specific set of platforms; supporting a limited number of hardware platforms is, of course, not difficult. But this kind of design has a lot of limitations that invite a large number of troubles, the most important being that such software can never achieve universal acceptance. That is the basic issue Maxta tries to overcome for any size of organization launching or using any kind of software, and it is the most critical differentiator between the appliance-based business model and the software business model. Most organizations plan to change to a software business model for higher profitability.

    Appliance model is becoming an obsolete business model

The appliance-based model is becoming obsolete because hyperconvergence software has higher capabilities.

We shall continue this discussion in the next few articles. Continued »


    January 29, 2019  11:06 PM

    Violin Systems Excels In Extreme Performance Enterprise Storage

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Enterprise storage, Storage, Violin Systems

Any kind of disruption creates two reactions. One, fear, out of which players become defensive and start stepping back. Two, a very few players look at it as a new pool of opportunities and start exploring innovative ways to cater to it. Most of the players in the former category get into a shell and become history sooner or later. They keep sitting on the laurels they earned in the past, because they chose not to participate in the new game and hence have nothing to prove on the newer battlefield of business. Most of the players in the latter category succeed despite swimming upstream because of two latent forces coming from within: one, courage; two, innovative ideas taking the shape of reality. Violin Systems very distinctly stands apart as a spearhead in this category. Let’s see what makes them a class apart in technology.

Violin Systems is synonymous with extreme performance enterprise storage, provided at the price of traditional primary storage. The sole aim is to empower enterprises to get maximum leverage from their business-critical data in a manner never thought of by other technology players across the globe. The solution provides unmatched low latency and high IOPS, and it includes all the seriously essential data services such as data protection, data reduction, and business continuity, to name a few. Businesses can easily bank on Violin Systems to achieve a new level of application performance with extreme reliability, taking their business service levels to new heights while reducing costs drastically. Immediate access to information is an organization’s top dream because it is the key to higher revenue and a substantial increase in customer satisfaction.

Violin Systems is synonymous with extreme performance enterprise storage

In today’s scenario, which organization in the world would not like to be a data-driven business? Violin Systems helps enterprises drive their business-critical applications to support operations, quality, and delivery across their entire stakeholder ecosystem. It also helps enterprises easily scale and extend their competitiveness, staying ahead of the others in the fray. That is why enterprise customers rely on Violin Systems for unmatched extreme performance and excel at driving their business without any compromise.


    January 13, 2019  10:07 PM

    Role Of Developers: Test As You Build Is The New Mantra

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Software test design, Software testing, Testing

There is a constant shift in the role of developers globally. Actually, it is not a shift completely; it is an additional role embedding itself within their existing role of coding and development: test as you build. The new mantra is to test while you develop. Obviously, most of it will be manual testing of the small pieces of code being built, almost like unit testing or segment testing. This doesn’t require any additional skillset in developers; they just have to test what they are building. It is, rather, only a small shift in mindset. A developer has to first convince himself that it is very much his job to do it, because it is his own code that needs to match the business requirements 95%, if not 100%.
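As a minimal illustration of what “test as you build” can look like in practice, here is a small sketch in Python: a function and the unit tests written alongside it by the same developer. The function, the business rule, and the numbers are invented for the example.

```python
# A small piece of code written by the developer...
def order_discount(order_value: float, is_repeat_customer: bool) -> float:
    """Return the discount rate for an order (hypothetical business rule)."""
    if order_value <= 0:
        raise ValueError("order value must be positive")
    rate = 0.10 if order_value >= 1000 else 0.0
    if is_repeat_customer:
        rate += 0.05
    return rate

# ...and the tests the same developer writes immediately after (run with pytest).
import pytest

def test_large_order_gets_base_discount():
    assert order_discount(1500, is_repeat_customer=False) == 0.10

def test_repeat_customer_gets_extra_discount():
    assert order_discount(1500, is_repeat_customer=True) == pytest.approx(0.15)

def test_invalid_order_is_rejected():
    with pytest.raises(ValueError):
        order_discount(-5, is_repeat_customer=False)
```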

It is a kind of building assurance into the coding as an additional part of the role of developers. And it is not only the developer’s role that is shifting; testers are facing a similar overhaul in their roles. The scope of testing is changing in a big way from what it used to be. In fact, the ultimate goal of any testing is to increase quality and hence self-confidence and customer confidence. Testing and development are getting closer than ever before, and this is proving to be a smarter and more efficient move. While developers do more of the testing themselves, testers are expected to become automation engineers in the changing scenario. Testers doing only the traditional kind of manual testing is a big NO now for most progressive organizations.

    Role of Developers Takes A New Turn

Earlier, testers could exist with only manual testing skills, few technical skills, and complete reliance on their functional knowledge. That is no longer possible. As the role of developers changes, so does that of testers. Testing has become more demanding, and thus more penetrative and effective. Much of the credit for this shift goes to DevOps and Agile. Testing needs to enter the mainstream of the project lifecycle as early and as frequently as possible, thereby giving fruitful results faster. Of course, no testing is complete without the human touch of discovering the unknown through intuitive exploration. Good luck to all developers and testers in adopting their changing roles for the good of the quality of the product they are developing and testing.


    January 13, 2019  4:58 PM

    Test Automation Tools: Changing Trends In Software Testing In 2019

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    automation tool, Software testing, software testing automation tools, Software testing tools, Test Automation

Any organization developing software that still does not use test automation tools and banks completely on manual testing can’t stay in the mainstream software business. Even an enterprise in a business vertical other than software, with a focus on in-house development of key business applications, can’t release a healthy product merely on the basis of manual testing. The reason is not that manual testing is incapable of testing a product fully; the reason is that in today’s scenario, software applications don’t run on a single platform, hardware, or operating system. For any application you develop, or for that matter procure from an external vendor, the first and foremost requirement is its capability to run on multiple platforms such as laptop, tablet, smartphone, and desktop.

    Test Automation Tools

    Photo credit: Michael Kappel on Visual hunt / CC BY-NC

This automatically calls for a heterogeneous spectrum of operating systems and environments. That is obviously a big challenge if you have to perform all this testing manually. One way is to have a huge team of manual testers. Another is to use test automation tools, saving a huge spend on resources, manpower, and time; these tools simulate various environments, loads, operating systems, and capacities. In my opinion, to be in the best situation, the manual versus automated testing ratio should be at least 30:70. Of course, it can’t be 100% automation; certain things can still only be managed manually. Yet another option is to outsource testing to a reliable company, so you can live with limited resources in the testing department.
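As a small, hypothetical sketch of how an automation tool spreads the same check across many environments, here is a pytest parametrization in Python. The platform list and the function under test are invented for illustration; a real setup would drive an actual device or browser farm.

```python
# Hypothetical sketch: one test definition, many target environments.
import pytest

PLATFORMS = ["windows-desktop", "macos-laptop", "android-phone", "ios-tablet"]
BROWSERS = ["chrome", "firefox", "safari"]

def render_login_page(platform: str, browser: str) -> str:
    # Stand-in for driving the real application on a device/browser farm.
    return f"login page on {platform}/{browser}"

@pytest.mark.parametrize("platform", PLATFORMS)
@pytest.mark.parametrize("browser", BROWSERS)
def test_login_page_renders(platform, browser):
    # 4 platforms x 3 browsers = 12 combinations from a single test body.
    assert "login page" in render_login_page(platform, browser)
```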

    Test Automation Tools Demand Has Increased Tremendously

But be sure to put a strong SLA in place with the outsourced testing organization in terms of information and outcomes. I doubt there is anybody on the customer end who would accept software without proper testing reports. There is a substantial increase in demand for test automation tools in the global market for this reason, and that clearly shows the change in testing trends.


    January 13, 2019  3:32 PM

    Are You An Expert In Exploratory Testing? Check It Again!

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Quality assurance, Testing

If you are in testing or are part of the software testing ecosystem, you must be well aware of what it takes to become an exploratory tester. In the testing fraternity, not every tester is a good exploratory tester; it requires a certain set of skills. And the journey doesn’t stop with acquiring these skills: they have to be enhanced exhaustively through continuous learning, experimenting, and exploring. Exploratory testing, as a matter of fact, is not everybody’s ballgame. Let us talk about some special skills that, if you acquire them, can make you a superb exploratory tester. Here we go:

Exploratory Testing

    Photo credit: NRCgov on Visual hunt / CC BY

1. Critic:
You don’t have to be a critic about everything that happens in life. But when it comes to testing a piece of code or a complete application, you should not take the code at its word. You have to look at it as a critic and find all the possibilities that can go wrong in its successful running. All kinds of permutations and combinations have to be taken care of. Everything has to marry well between the business requirements and the code, and the application flow has to gel well with the business processes and flow.

2. Investigator:
At times things will not be as straightforward as they appear. The report might say everything was developed as per the requirement, but something might still be swept under the carpet. You might need to dig further and get to the hidden part of the iceberg for a reality check. After all, when a product or software is released, the whole organization’s reputation is at stake.

3. Go-Getter:
Everything might not go as smoothly as free-flowing water in a river. As an expert in exploratory testing, you have to cover a longer distance than the normal routine. You have to stay calm in every adverse situation, with one goal in mind: finding even the minutest bug in the software.

Exploratory Testing Needs A Set Of Unique Qualities

4. Storyteller:
You have to be a good storyteller when it comes to explaining a bug in software, making the other person understand it to the core so that the mistake is not repeated next time.

5. Communicator:
Besides being an expert storyteller, your communication skills have to be extraordinary. That can happen only when you have clarity about everything you discover and how it differs from the way the software should behave. If you can convince yourself about a thing, you can very well convince others about it too. And that is a must in exploratory testing.


    January 13, 2019  10:59 AM

    How Sacrosanct Is Requirements Analysis in SW Project Management

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    business requirements, Software test design, Software testing, Testing

If you are in project management, the first and foremost thing to check in your project management lifecycle is whether you have a place for requirements analysis. In my opinion, requirements analysis is entirely different from requirements gathering: it includes requirements testing and validation along with parallel scrutiny against the actual business processes in place. If you don’t perform all of this and start your coding, you are definitely calling for big trouble at a later stage. It may, in fact, lead to an outright failure with a big setback to reputation, finances, business, customer retention, and time. Any project failure carries the risk of losing the best talent in the pool, especially if the project is not managed properly; obviously, the best people would not like to stay at a place where risks and mistakes have a high stake in the projects.

It is very important to understand the deep connection between requirements analysis and testing your requirements. Knowing that testing requirements is very important is one thing; how to do it in the best possible way to avoid later accidents is an altogether different ballgame. As a matter of fact, the requirements analysis stage has to have ample time and the best resources to ensure it is foolproof in a wholesome manner. Once coding starts, the entire focus shifts to timelines, testing, and execution; the analysis stage is over, or takes a backseat, by then. A thorough QA check of business requirements, processes, wireframes, and mockups is very important before coding begins. Anything in the requirements that is not testable is risky.
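To make “testable requirements” concrete, here is a small hypothetical sketch in Python: a requirement rewritten as an executable acceptance check. The requirement text, the numbers, and the stand-in function are invented for illustration.

```python
# Requirement (hypothetical): "A password reset email must be queued within
# 60 seconds of the user requesting it." Vague wording such as "quickly" or
# "as soon as possible" could not be checked this way, which is the point.
import datetime as dt

def queue_reset_email(requested_at: dt.datetime) -> dt.datetime:
    # Stand-in for the real system; returns the time the email was queued.
    return requested_at + dt.timedelta(seconds=12)

def test_reset_email_queued_within_60_seconds():
    requested = dt.datetime(2019, 1, 13, 10, 0, 0)
    queued = queue_reset_email(requested)
    assert (queued - requested).total_seconds() <= 60
```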

    Requirements Analysis Has To Be Heuristic In Nature

Business scenarios and test cases have to be complete and clear; anything vague is meaningless. There has to be a proper strategy. As a matter of fact, any piece of coding, and later its implementation, has to align well with one requirement or another. If it doesn’t, there was something wrong with the requirements analysis.


    December 31, 2018  10:33 PM

2018 Roundup: Quotable Quotes From My Various Posts of 2018

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    DH2i, Encryption, manageengine

    “Timely communication is very critical in business. It impacts business, in fact, in a huge way. If it is not timely, it loses its impact and effect.”

    – Jaideep

    “Users always resist changing. Similarly, management always fears to invest in newer technologies.”

    – Jaideep

    “Now, AWS Marketplace customers can buy and deploy CloudPassage Halo Server Secure for high-performance cloud workload security in a highly predictable and cost-effective way – via a single integrated AWS bill. As global enterprises rapidly embrace the cloud for mission-critical workloads and DevOps for application development, automated security that can operate at speed and scale is becoming a critical path. AWS Marketplace helps eliminate protracted negotiations to make it easy for our customers to securely embrace the cloud.”

    John Janetos, Director, Business Development, CloudPassage.

    “Sometimes IT shops use instance stacking to help reduce the number of operating systems and licensed core counts since Microsoft allows up to 50 SQL Server instances per OS to be installed. The problem here, though, is the creation of a scenario where all of an enterprise’s eggs end up in a single basket, and one outage can thus impact many instances. If you get the stacking ratio wrong the first time, it’s also hard to move instances.”

    Connor Cox, Director of Business Development, DH2i (http://www.dh2i.com/).

    “With dynamics of digitalization fast changing and massive adoption of cloud technology, there is a greater need for automation in the endpoint management space, as endpoints are the major entry points of cyber attacks. ManageEngine’s new cloud-based patch management solution is engineered to meticulously look out for such threats on the move, thereby keeping both data and endpoints secured.”

    Rajesh Ganesan, director of product management, ManageEngine.

    “Readying macOS/iOS systems with the necessary authentication, encryption, management controls, and reporting are necessary to ensure a secure and compliant deployment. Therefore, providing the same level of protection afforded to PCs is an important consideration when integrating these devices into the business landscape.”

    Jason Dettbarn, CEO, Addigy.


    December 25, 2018  6:43 PM

    Addigy macOS-iOS Enterprise Readiness Charter @addigy

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Apple, Apple iOS, macOS

Two significant fears I remember around the adoption of Macs, from when I was working with various enterprises as CIO/CTO, were lack of support and visibility, plus the exorbitant prices of Mac desktops at that time; Apple devices used to be a sign of luxury. Those fears were shared across almost all enterprises and CIOs. But that is not the case anymore, as I realized when I spoke to Jason Dettbarn, CEO of Addigy. Both fears are gone. For enterprise support, Addigy is there to manage a macOS/iOS ecosystem of any size or volume in an enterprise. And as far as pricing is concerned, Mac systems are now well within reach and not much more expensive than Windows PCs and laptops. Looking at the advantages and reliability Mac systems bring, even a slight price premium is worth accepting.

    Addigy

    Source Addigy.com

Addigy recently released strategies to ensure macOS/iOS readiness for the enterprise, with a good amount of groundwork behind them. The company highlights processes and procedures to strengthen macOS/iOS system management, security, and compliance. Addigy is a front-runner in cloud-based Apple device management software, so irrespective of geography, the company is fully capable of supporting any enterprise across the globe. The latest release includes a seven-point checklist to ensure macOS/iOS enterprise readiness. The last decade has seen substantial growth in the popularity of macOS/iOS devices for their significant enhancements in productivity, lower help desk needs, lower management costs, and unmatched overall user experience. Deploying macOS/iOS devices in the enterprise, however, can seem a difficult task to administrators who think of it as a cumbersome process requiring several steps to make the devices meet security and regulatory computing requirements.

    Addigy releases a strategy to strengthen macOS/iOS system management, security, and compliance

    Jason Dettbarn, CEO, Addigy says, “Readying macOS/iOS systems with the necessary authentication, encryption, management controls, and reporting are necessary to ensure a secure and compliant deployment. Therefore, providing the same level of protection afforded to PCs is an important consideration when integrating these devices into the business landscape.”


    December 20, 2018  9:44 PM

    Hybrid Cloud Services for VMware: All About IO Filters @JetStreamSoft V

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Business Continuity, Data Replication, Hybrid cloud, VMware

This is the fifth and last post of the Q&A with Serge Shats, Ph.D., CTO and Co-Founder, JetStream Software. Links to the previous four posts are here:
    Post 1
    Post 2
    Post 3
    Post 4

    Q: How are IO filters used for cloud DR?

    A: Here are four scenarios in which IO filters can be used to replicate data for cloud DR:

1. Business Continuity Cloud Services:
With data replication from the on-premises environment to a cloud service provider, the service provider can host a warm failover destination for the VMs running at the on-premises data center.

2. Data Backup to Cloud Object Store:
With the same method of intercepting data on-premises, the data can be continuously replicated to a cloud object store for recovery. Data may be preprocessed for the destination through the specific object store’s APIs. Again, no snapshots are required. (A rough sketch of this pattern follows after this list.)

3. Point-in-Time Recovery for Continuous Data Protection:
By replicating data in a continuous stream instead of discrete snapshots, point-in-time navigation is possible for recovery of all data up to immediately prior to a critical event (e.g., malware intrusion).

4. Cloud Data Protection Services for On-Premises HCI:
Rather than requiring a “like-to-like” model for cloud data protection, data replication from within the hypervisor itself can provide DR for Virtual SAN or third-party HCI, even if the cloud destination is running entirely different compute and storage hardware.
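To make item 2 slightly more concrete, here is a very rough, hypothetical sketch of pushing intercepted writes to an S3-compatible object store with boto3. The bucket name, key scheme, and record format are invented, and this is not JetStream’s actual implementation.

```python
# Hypothetical sketch: stream intercepted VM writes into an object store.
# Not JetStream's code; bucket, key scheme, and record format are invented.
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "vm-replication-example"   # assumed pre-created bucket

def replicate_write(vm_id: str, sequence: int, offset: int, data: bytes) -> None:
    """Persist one intercepted write as an immutable, ordered object."""
    key = f"{vm_id}/{sequence:012d}.json"
    record = {"offset": offset, "length": len(data), "data": data.hex()}
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(record).encode())

# Example: replay order at the recovery site is preserved by the zero-padded
# sequence number embedded in each object key.
# replicate_write("vm-42", 1, 0, b"\x00" * 512)
```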

    JetStream

    Source JetStream Software



    Q: With respect to cloud DR, how do IO filters compare to other data capture methods?

    A: IO filters enable a continuous capture of data from within vSphere, which is a game-changer for cloud DR. Traditionally, organizations have looked to the cloud for snapshot-based backup, which has its place, but it is quite limited in terms of realizing true DR as a cloud service.

It’s well understood that snapshots degrade application performance and by definition don’t support continuous replication. The tradeoff with snapshots is that the shorter you want your RPO to be, the more snapshots you create, and so the greater the impact on runtime performance. Also, recovering a volume from many small snapshots will increase RTO. For true DR from a cloud service, continuous data replication from an IO filter gives a better, more efficient approach.
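As a quick back-of-the-envelope illustration of that tradeoff (with invented numbers), the snapshot count needed for a given RPO grows quickly as the RPO tightens:

```python
# Illustrative arithmetic only: a target RPO bounds the snapshot interval,
# so tighter RPOs mean many more snapshots per day (and more runtime impact).
def snapshots_per_day(rpo_minutes: int) -> int:
    return (24 * 60) // rpo_minutes

for rpo in (240, 60, 15, 5):
    print(f"RPO {rpo:>3} min -> {snapshots_per_day(rpo):>3} snapshots/day")

# RPO 240 min ->   6 snapshots/day
# RPO  60 min ->  24 snapshots/day
# RPO  15 min ->  96 snapshots/day
# RPO   5 min -> 288 snapshots/day
```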

    Prior to the availability of IO filters, continuous data capture was possible, for example, by intercepting data in a vSCSI filter. This is how vSphere Replication accesses data as it makes snapshots for data recovery. The key problem with vSCSI is that it’s a private API intended for VMware’s use, and VMware provides no guarantee of support for third-party technologies that use vSCSI intercept.

    Another approach to continuous data capture is to install agents inside the VMs to replicate data in a stream. While this method can achieve RPOs of just seconds, it is an agent-based solution, which may raise concerns about security and compatibility.

    Lastly, virtual appliances typically run within their own VMs, so they are broadly compatible, and they generally don’t take snapshots, so they can stream data. The problem is that they either stand in the data path itself, introducing IO latency, or they require a filter or agent to intercept data.

    JetStream

    Source JetStream Software

    Q: What’s next for IO filters?

    A: While the IO filters API is primarily of interest to software developers providing data management services in the VMware ecosystem, interest has been growing recently, driven primarily by cloud and hybrid cloud use cases. In the future, it’s not difficult to see IO filters applied for uses beyond performance acceleration, live migration, and data protection to other types of policy-based data management.

    The idea of cloud services moving beyond disaster recovery and data protection solutions is feasible with on-premises IO filters enabling “X as a service” offerings, with the application of specific policies to data across an infrastructure comprising on-premises operations and cloud services.

    With an IO filter in each VM on premises, a solution can intercept and process every bit of data moving up and down the storage stack, and it can help the admin set data policies for those VMs, for any type of business requirement, such as cloud cost optimization or compliance. The key is that there is no need for an external data management framework — policy-based data management can be enabled within vSphere itself — across multiple data centers and cloud services.

    ###


    December 20, 2018  9:31 PM

    Hybrid Cloud Services for VMware: All About IO Filters @JetStreamSoft IV

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Hybrid cloud, VMware

In this series of Q&A with Serge Shats, Ph.D., CTO and Co-Founder, JetStream Software, we are in the middle of the discussion. This is the fourth post in the series. Previous posts can be accessed from the links below:

    Post 1
    Post 2
    Post 3

    Q: How are IO filters used for virtual machine live migration?

    A: The problem with live migration is this: How do you keep applications running, with new data being written continuously, during the hours — or sometimes days — that it takes to move the applications’ data to the destination? There are a number of approaches, as virtual machine migration is not a new problem. But IO filters provide a capability that’s much simpler than anything we’ve seen before.

    With JetStream Migrate, the software deploys as a replication filter in the source VMware environment. The migrating VMs’ configurations and virtual disks are copied from the on-premises data center to the cloud data center, and while that copy and transfer process is taking place, newly written data from the VM is captured by the IO filter and also replicated to the destination.

    One of the advantages of this approach is that the copy of the virtual disk can be moved over the network connection, or it can be copied onto a physical device for “offline” transport to the cloud destination. So if you are familiar with the Amazon Snowball, it’s now possible for an organization to use a snowball-like device to transport data from one VMware environment to another VMware environment, without having to stop the VMs or their applications from running at the source.
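The pattern described above (copy the baseline while journaling new writes, then apply the journal at the destination) can be sketched conceptually in Python. This is pure illustration with invented structures; JetStream Migrate itself works inside vSphere through the IO filter API, not like this.

```python
# Conceptual sketch only: migrate a "disk" while it keeps receiving writes.
source = bytearray(b"A" * 64)   # stand-in for the source virtual disk
journal = []                    # writes captured by the "filter" during the copy

def intercepted_write(offset: int, data: bytes) -> None:
    """Writes keep landing on the source AND are journaled for the destination."""
    source[offset:offset + len(data)] = data
    journal.append((offset, data))

# Phase 1: bulk copy of the baseline (over the network or via a shipped device).
destination = bytearray(source)

# The application keeps running and writing during/after the bulk copy.
intercepted_write(0, b"ZZ")
intercepted_write(10, b"new")

# Phase 2: replay the journaled writes at the destination, then cut over.
for offset, data in journal:
    destination[offset:offset + len(data)] = data

assert destination == source
```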

Hybrid Cloud Services for VMware

    Source: JetStream Software

    Q: With respect to disaster recovery (DR), why would someone use IO filters instead of snapshots?

    A: One of the key goals for using IO filters for data replication is that — unlike snapshots — data can be captured for replication without a detrimental impact on application performance. Also, because data is being captured in a stream, there are better options for delivering a variety of DR capabilities, such as an extremely low RPO and RTO, as well as very fast point-in-time recovery.

    We will be concluding this series in the next post. Continued »


    December 20, 2018  9:22 PM

    Hybrid Cloud Services for VMware: All About IO Filters @JetStreamSoft III

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Hybrid cloud, VMware

This is the third post in this series of discussions with Serge Shats, Ph.D., CTO and Co-Founder, JetStream Software. He is talking about Hybrid Cloud Services for VMware: What You Need to Know About IO Filters. The first two posts can be reached here:
    Post 1
    Post 2

    Q: What are the advantages of IO filters?

    A: First, and perhaps most obviously, because IO filters run within vSphere, they truly achieve the goal of enabling “software-defined storage.” IO filters are designed to run with any type of datastore, including shared storage, VSAN/HCI or Virtual Volumes. Second, among the various software-oriented approaches to integrating third-party data services with vSphere, IO filters are the most “vSphere native.” IO filters don’t use any agents in the VMs, virtual appliances in the data path, third-party modules in the kernel, or calls to internal APIs. Solutions deployed as IO filters provide an assurance of support, compatibility, and stability that other approaches to software-defined storage can’t match. Of course, this becomes doubly important when we’re talking about cloud or hybrid cloud deployments, where abstraction is paramount.

    Q: How are IO filters used for storage IO acceleration?

    A: Storage IO acceleration was our first application of IO filters; it’s why VMware selected us to partner with them, as our IO accelerator served as a kind of reference architecture for the API as it was in development. JetStream Accelerate uses the caching filter to enable virtual machines to write data to a non-volatile memory device in the host servers so that when the time comes to read the data, if possible, the VM will read that data from the non-volatile memory cards or SSDs rather than having to traverse the SAN to get the data from the underlying storage.

    Reading data from host-based flash generally enables much faster application performance, and with sufficient memory and CPU, it allows increased virtual machine density as well. Host-based data access is especially important for latency sensitive applications, and data center operators also like the idea of deploying two to three times as many VMs on the same number of host servers without any performance penalty, just by reducing storage IO latency.

    Enterprises also benefit from greatly reduced storage overhead. For example, in an Oracle environment at a large telecom customer, we are seeing a 90 percent reduction in read operations against their storage arrays. That means that because they’re serving those operations from flash within the host systems, they don’t need to overprovision their storage for performance, saving a lot of money on their storage budget.
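As a highly simplified illustration of the caching idea (serve repeat reads from fast local media instead of going back to shared storage), here is a toy read-through cache in Python. The class and names are invented; the real JetStream Accelerate filter operates on block IO inside vSphere, not like this.

```python
# Toy read-through cache: repeat reads are served locally instead of from the "SAN".
class ReadCache:
    def __init__(self, backend_read):
        self.backend_read = backend_read   # slow path, e.g. the storage array
        self.local = {}                    # stand-in for host flash / NVM
        self.hits = 0
        self.misses = 0

    def read(self, block_id):
        if block_id in self.local:
            self.hits += 1                 # served from host-side media
            return self.local[block_id]
        self.misses += 1                   # only misses reach the array
        data = self.backend_read(block_id)
        self.local[block_id] = data
        return data

cache = ReadCache(backend_read=lambda b: f"data-{b}")
for _ in range(10):
    cache.read(7)                          # 1 miss, then 9 hits: the array is offloaded
print(cache.hits, cache.misses)            # -> 9 1
```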

    Hybrid Cloud Services

    Source JetStream Software

    We will continue in the next post. Continued »


    December 20, 2018  8:51 PM

    Hybrid Cloud Services for VMware and IO Filters @JetStreamSoft

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Cloud Services, Hybrid cloud, VMware

This post is in continuation of the previous post; it is the second in the series. We are in conversation with Serge Shats, Ph.D., about Hybrid Cloud Services for VMware and IO Filters. Shats is co-founder and CTO of JetStream Software. He has more than 25 years’ experience in system software development, storage virtualization, and data protection. Previously co-founder and CTO of FlashSoft, acquired by SanDisk in 2012, Shats has served as a chief architect at Veritas, Virsto, and Quantum. He earned his Ph.D. in computer science at the Russian Academy of Sciences in Moscow, Russia. For more information, please visit www.jetstreamsoft.com, www.linkedin.com/company/jetstream-software-inc/ and @JetStreamSoft.

    Q: What are IO filters?

    A: The IO filters API is a feature of vSphere that allows third-party data services to be safely integrated into the data path between the virtual machine and its virtual disk(s), capturing data and events in order to provide some data management service. There are different IO filters for different data management functions, including:

  • Data replication for disaster recovery
  • IO acceleration with host-based non-volatile memory
  • Data encryption
  • Storage IO control
Hybrid Cloud

Source: JetStream Software



    Q: How do IO filters work?

A: An IO filter is a software component that intercepts data and events continuously, with very low latency. But there is more to IO filters than what they do at the VM level; it’s helpful to think about them at the cluster level as well. IO filters are deployed from a standard VIB and installed by vSphere on every host in a cluster, including new hosts added after the initial deployment. Even the process of updating or uninstalling filters is managed across the cluster by vSphere. Once deployed, the filters’ operating parameters are defined by VMware Storage Policy Based Management (SPBM). So it’s fair to say that the API enables a third-party data service to act as though it were “VMware native.”
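Conceptually, an IO filter sits between the VM and its virtual disk and sees every IO before it reaches the disk. The sketch below is a language-neutral illustration of that interception pattern in Python; it is not the actual VAIO C API, and the class and method names are invented.

```python
# Conceptual illustration of an IO filter: wrap the disk's write path and add a
# data service (here, replication) without the guest or the datastore changing.
class VirtualDisk:
    def __init__(self):
        self.blocks = {}

    def write(self, offset, data):
        self.blocks[offset] = data

class ReplicationFilter:
    """Invented name; stands in for a filter attached via storage policy (SPBM)."""
    def __init__(self, disk, replica_sink):
        self.disk = disk
        self.replica_sink = replica_sink          # e.g. a DR site or object store

    def write(self, offset, data):
        self.replica_sink.append((offset, data))  # the data service acts on the IO
        self.disk.write(offset, data)             # then the IO continues to the disk

sink = []
disk = ReplicationFilter(VirtualDisk(), sink)     # the VM sees the same write interface
disk.write(0, b"hello")
print(sink)   # [(0, b'hello')]
```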

    We continue our discussion in the following post. Continued »


    December 20, 2018  8:39 PM

    Hybrid Cloud Services for VMware: All About IO Filters @JetStreamSoft

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Hybrid cloud, SanDisk, VMware, VMware vSphere

    Q&A with Serge Shats, Ph.D., CTO and Co-Founder, JetStream Software talking about IO filters in detail

    Over the past few years, we’ve seen a lot of new features introduced to the VMware platform. Many of these new developments were undertaken to make VMware an even better cloud and hybrid cloud platform. One of the less well-known developments may be one of the most important: IO filters. As organizations shift some or most of their infrastructure to cloud-based services or consume cloud services for data protection and business continuity, IO filters are becoming key to accomplishing some important capabilities, such as migrating VMs without interruption and protecting VMs in an on-premises data center from a cloud service. The VMware vSphere API for IO Filtering (VAIO) represents a significant step in how VMware can be used in cloud and hybrid cloud environments.

    We recently chatted with Dr. Serge Shats, CTO and co-founder at JetStream Software, about the company’s role in developing and applying IO filter technology for cross-cloud data management. Serge has led architecture and development at storage and data protection companies including Veritas, Quantum, and Virsto. He was CTO of FlashSoft Software, then engineering fellow at SanDisk after the company acquired FlashSoft.

    Q: Tell us about your role in developing the API framework for IO filters.

    A: Starting in 2014, while our engineering team was still at SanDisk, we began collaborating with VMware as the co-design partner for the IO filters API, so we have a rather extensive understanding of the API. It is a standard VMware API, and solutions that support the API are listed in the VMware Compatibility Guide and are certified “VMware Ready.” The VMware Ready certification ensures full support from VMware. JetStream Software has a large number of deployments of the software, mostly in large data centers running VMware for public and private cloud operations.

    We continue our interesting conversation with Serge Shats in the next post. Continued »


    December 20, 2018  6:57 PM

    Site24x7 Brings AI for Monitoring and Chatbot Integration @site24x7

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    ai, Artificial intelligence, Azure, Chatbot

    Microsoft Azure monitoring becomes stronger and easier with the help of AI-driven technology developed by Site24x7. The same technology also powers Microsoft Teams chatbot integration. Let us understand how it helps the client in a substantial way. The foremost benefit is a drastic decrease in application outage resolution time, thanks to AI-powered insights. Interestingly, DevOps and application teams stay in their flow of work even while IT incidents are being handled, with the help of the Site24x7 chatbot for Teams. Site24x7 is a cloud-based performance monitoring solution that now works more efficiently for DevOps and IT operations. After this launch, IT teams can manage more than 100 Azure products using the Azure Insights API. All of this happens in near real time, giving IT teams timely alerts so they can act proactively and bring resolution time down significantly.

    Site24x7

    Source: Site24x7.com

    All this definitely helps organizations gain higher visibility into their hybrid clouds. After deploying the Site24x7 chatbot for Microsoft Teams, it becomes much easier for DevOps and application teams to get a real-time picture of the health of critical applications right in their workplace chat room. With the growing adoption of hybrid cloud environments, the monitoring load increases: teams now have to monitor not only the on-premises environment but also multiple cloud systems. The pace at which the global hybrid cloud market is growing is phenomenal. From around $50 billion in 2018, it is estimated to touch almost $100 billion by 2023, as per MarketsandMarkets, a B2B research company. If this growth is to be sustained or exceeded, strong monitoring solutions like Site24x7 need to be in place.
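    As a rough illustration of the ChatOps idea, the sketch below pushes a monitoring alert into a Teams channel through an incoming-webhook URL. This is a generic, hypothetical example, not Site24x7's actual integration; the webhook URL, resource names, and threshold values are placeholders.

```python
# Generic illustration: posting a monitoring alert to a Teams incoming webhook.
import json
import urllib.request

TEAMS_WEBHOOK_URL = "https://example.webhook.office.com/..."  # hypothetical placeholder

def post_alert_to_teams(resource: str, metric: str, value: float, threshold: float) -> None:
    """Send a simple alert message so the team sees the incident in their chat room."""
    payload = {
        "text": (f"ALERT: {metric} on {resource} is {value}, "
                 f"above the threshold of {threshold}.")
    }
    req = urllib.request.Request(
        TEAMS_WEBHOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # Teams renders the message in the channel

# Example: an Azure VM breaching a CPU threshold detected by a monitoring poll
# post_alert_to_teams("azure-vm-prod-01", "CPU %", 93.0, 80.0)
```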

    Site24x7 Enhances the DevOps experience

    Srinivasa Raghavan, Product Manager, Site24x7.com says,

    “With digital transformation picking pace, DevOps teams are happily embracing the public cloud for new workloads, but it comes with a few inevitable challenges such as getting end-to-end visibility, performance degradation and managing user experience of business-critical applications. With AI-driven monitoring and IT automation, the issues across private, public and hybrid environments help IT teams, to bring down mean time to repair incidents, thus improving productivity,”

    Bhrighu Sareen, General Manager, Microsoft Teams, says,

    “ChatOps scenarios built on Microsoft Teams empower users to collaborate, share critical documents, applications and communicate in real time. Site24x7’s AI-driven monitoring capability brings together developers, application teams and IT operations into a single efficient secure location in Microsoft Teams for quick problem identification all the way to resolution.”


    December 18, 2018  10:37 PM

    Enterprise Cloud Adoption – Go Elastifile Google Cloud Way @elastifile

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Cloud Services, File storage, Google Cloud, Infrastructure management, SaaS

    The early access program opened December 11th, and general availability begins in Q1 2019. Elastifile and Google Cloud introduce a scalable, fully managed file service for Google Cloud. Yes, we are talking about enterprise cloud adoption the way the future needs it, designed by Google and Elastifile to tackle the current and future requirements of any enterprise. The solution aims to bridge traditional and cloud-native architectures including Kubernetes, preemptible compute VMs, and Kubeflow. Key features include data persistence for Kubernetes, data resilience for preemptible cloud VMs, and mobilized machine learning. The solution becomes stronger and more intelligent as it grows over time after deployment. Target business verticals include media and entertainment, manufacturing, life sciences, and traditional enterprise IT, to name a few, but the solution is not limited to these verticals; any other medium- to large-sized enterprise can adopt it and leverage its strengths.

    Enterprise Cloud Adoption

    Source Elastifile.com

    Enterprise cloud adoption needs a highly scalable cloud file environment. The solution we are talking about is designed to serve a broad spectrum of enterprise applications, of any size and scale, that require file storage. It supports standard protocols like NFS and includes a full suite of enterprise features like snapshots, multi-zone access, and so on. In my opinion, the solution would be the right fit for any industry vertical, especially those where data grows at a phenomenal pace and needs best-in-class file storage. Its flexible service class options give a business access to unlimited snapshots, and snapshot pricing will be as low as an astonishing $0.03 per GB per month. That is how it aligns cost and performance to any business need. There are three models to choose from, depending on an enterprise's requirements.

    Enterprise Cloud Adoption – A new Paradigm

    The first option from Elastifile leverages Elastifile ClearTier technology, which integrates standard persistent disks and object storage within a POSIX-compliant, unified namespace. Typical performance: 2 GB/s bandwidth at 120 TB capacity. The cost is just $0.10 per GB per month, with an effective cost of $0.08 per GB per month with 30% snapshots. This option is ideally suited for capacity-driven, cost-sensitive use cases. The second option leverages ClearTier to integrate SSD persistent disks and object storage within a POSIX-compliant, unified namespace. Typical performance: 10 GB/s bandwidth at 120 TB capacity for active data. If the first option is for capacity optimization, this one is for general-purpose use. The cost is $0.17 per GB per month, with an effective cost of $0.13 per GB per month with 30% snapshots.

    The third option from Elastifile aggregates SSD persistent disks into a POSIX-compliant, unified namespace. Typical performance would be 15.6 GB/s bandwidth at 120 TB capacity. It provides high transactional performance, scalable to millions of IOPS. That is just phenomenal, isn't it? It is for businesses that want performance optimization. The cost is $0.30 per GB per month, with an effective cost of $0.22 per GB per month with 30% snapshots, which is ideal for workloads requiring high performance. The Elastifile solution is a cloud-native, truly software-defined service architecture, and it is future-proofed to seamlessly leverage cloud technology advancements for years to come.
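    As a quick back-of-the-envelope check of the "effective cost with 30% snapshots" figures quoted above, the sketch below assumes (my interpretation, not a published formula) that 30% of stored data sits in snapshots billed at $0.03 per GB per month while the remaining 70% is billed at the tier's base price.

```python
# Rough check of the effective-cost figures under the stated 30%-snapshot assumption.

SNAPSHOT_PRICE = 0.03  # $/GB/month, per the announced snapshot pricing

def effective_cost(base_price: float, snapshot_share: float = 0.30) -> float:
    """Blend the tier's base price with the snapshot price, weighted by data share."""
    return (1 - snapshot_share) * base_price + snapshot_share * SNAPSHOT_PRICE

tiers = [("capacity-optimized", 0.10), ("general purpose", 0.17), ("performance-optimized", 0.30)]
for name, base in tiers:
    print(f"{name}: ${effective_cost(base):.2f}/GB/month")
# capacity-optimized: $0.08, general purpose: $0.13, performance-optimized: $0.22
```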


    December 17, 2018  11:25 PM

    Hyperconvergence Software by @MaxtaInc Creates A New Paradigm

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    hyperconvergence, Maxta

    It is obviously significant when a technical support services firm like Trusource Labs LLC banks on hyperconvergence software by Maxta Inc. Trusource Labs specializes in support for the IoT (Internet of Things), providing helpdesk services to organizations using Apple devices. Its customer range is quite versatile, from startups on one hand to Fortune 100 companies on the other. The biggest challenge for this Texas company, founded in 2013, was to find a scalable, flexible solution to support growth that was outpacing its legacy IT infrastructure. Within three years, Trusource Labs was adjudged the fastest-growing business in Central Texas. Starting with just 20 employees in 2013, Trusource now has more than 600 employees, and it has just started international operations from Limerick, Ireland. That is phenomenal growth, and Maxta had a substantial role in it.

    Hyperconvergence Software

    Source: Maxta.com

    Oklahoma Wesleyan University, with over 2,000 students and 500 staff, had a number of specific challenges. It was finding it difficult to manage a storage array that was running out of space, and an investment of $30,000 for additional capacity was almost out of the question. The university considered hyperconverged appliances, but those too were very expensive in terms of opex and capex. That is when it found a highly cost-effective hyperconverged infrastructure running on industry-standard servers, with the ability to comfortably scale storage and compute as and when required. The solution radically simplified storage management, with no more LUNs (Logical Unit Numbers), no provisioning, and no cumbersome capacity planning. The solution in this case, too, was Maxta hyperconvergence software.

    Hyperconvergence Software that reduces capital and operating costs by up to 70 percent

    Texas Southern University is a much larger university than Oklahoma Wesleyan, with around 10,000 students and 1,500 staff, and it had similar challenges with its IT infrastructure. The existing traditional storage arrays were difficult to manage, often leading to misconfiguration, and there was no common management across the arrays, compounded by regular IT staff turnover. The university badly needed a simpler way to manage storage resources without hiring a storage administrator per storage array. After a good amount of market research, it found Maxta's hyperconvergence software, which delivered a complete and cost-effective way to manage primary workloads using TSU's existing hardware assets. There was no need for specialized storage management resources, and capacity could be scaled up easily by adding to an existing Maxta node rather than adding a complete new node.

    TSU also gained the ability to refresh server hardware without repurchasing the hyperconverged software license. “With hyperconverged infrastructure, we can further utilize our hardware investments while bringing the data as close to the CPU as possible,” says Kelly Dean, Senior Systems Administrator, Texas Southern University. “That was one of the most important things, trying to not only simplify from a management perspective but also simplify in terms of the sheer number of pieces involved. In addition, I am looking at it from the perspective of, ‘What if I ever leave here? Can somebody come up behind me and understand how this works?’ You have to leave a place in a better state than when you got there,” concludes Dean.


    “SAN solutions start at $50,000 to $60,000 and if you run out of space, you have to upgrade all the drives or buy another one,” says Larry Chapman, IT Manager, Trusource Labs. “And you need a storage engineer to manage all the LUNs. You also have to have the personnel and a pretty big hardware investment. That’s not really scalable. We run a pretty tight ship in our engineering department. I don’t want to have onsite engineers at every location. We can remotely manage all this stuff. Because Maxta is so maintenance-free, I don’t have to double or triple or quadruple my staff. If you calculate that cost over years and years, I’m saving a ton of money,” he concludes.

    “A lot of the options were really expensive. I was initially looking at Nutanix and VxRail, trying to figure out how to afford to put that in my environment,” says Eric Goings, CTO, Oklahoma Wesleyan University. “Ultimately, Maxta is going to save us a lot more money than just the initial up-front cost,” he concludes.


    December 9, 2018  11:51 PM

    IoT and Hybrid Cloud in 2019 According to Don Boxley @DH2i

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    DH2i, Hybrid cloud, IIoT, Internet of Things, iot, VPN

    In a recent interaction about the key technology transformations of 2019, DH2i CEO and co-founder Don Boxley talked about two key developments he foresees regarding IoT and hybrid cloud. His 2019 predictions go as follows. The first prediction is that enterprises will replace VPNs with micro-perimeters, which will become important in order to secure IoT gateway communications. This clearly means VPNs will vanish, and so will the threats and vulnerabilities associated with them. Security is obviously the prime priority for enterprises. Dependence on data, technology, and the internet is at its peak, and it comes with a bundle of threats, but exploring and using technology is inevitable; it is a basic necessity now. The new product differentiator for enterprises is making smart products and IoT devices, most of which come with IP addresses, and organizations are investing in IoT initiatives.

    IoT and Hybrid Cloud

    Organizations understand very well that the IoT gateway layer is the key to gaining a high dividend on those IoT investments. As we all know, IoT gateways involve device connectivity, protocol translation, updating, upkeep, management, and predictive and streaming data analytics. They also carry a greater volume of data flowing between devices and the cloud. Opening so many gates definitely increases the risks, which demands a much higher level of security for that high volume of data flow. Nothing short of a Zero Trust security model will work in such cases. Enterprises will have to replace VPNs with micro-perimeters if they want to secure the IoT and hybrid cloud spectrum. Micro-perimeters, understandably, remove an IoT device’s network presence, thereby eliminating the kind of lateral attack surface that a VPN exposes. Zero Trust hybrid cloud security will become most critical.

    IoT and Hybrid Cloud Seek New Paradigms

    Many organizations are drafting or following a hybrid strategy to manage IoT and hybrid cloud. This strategy involves deep integration between on-premises systems and off-premises cloud/hosted resources. VPN software solutions are becoming obsolete in the wake of the new IT world of hybrid and multi-cloud environments, because VPNs were never designed with these newer transformations in mind, and trying to align VPNs with these environments to create a secure setup would be too complex to achieve. Moreover, a VPN gives each user a slice of the network, which easily creates a lateral network attack surface and a higher amount of risk. The need is very different now: enterprises require a new class of purpose-built security software if they are to do away with these risks.

    This new security software empowers enterprises to build lightweight dynamic micro-perimeters to secure application- and workload-centric connections between on-premises and cloud/hosted environments. This, in turn, ensures virtually no attack surface. As Don says,

    “In 2019, every hybrid cloud security strategy should be updated to replace VPNs with micro-perimeters.”

    “In 2019, every VPN used for a PCI application should/will be replaced with a micro-perimeter.”

    “In 2019, if a company’s hybrid cloud network security strategy relies on VPNs, the CEO should fire their head of network security.”


    December 9, 2018  9:20 PM

    Data Protection Automation Through An Innovative Breakthrough Solution

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Amazon EC2, AWS, AWS EC2, Data backup, Data protection, EC2, Hyper-V, Nakivo, VMware

    Data is definitely the new oil. Given its exponential growth and importance, it needs the utmost attention to protection, safety, and availability. Any organization that brings out an innovative breakthrough in any of these three segments will immediately get tremendous attention and response. NAKIVO brings a breakthrough solution for data protection automation, covering backup and replication; in fact, just a week ago it took the automation of core data protection tasks to the next level. The solution includes significant and unique functionality that empowers NAKIVO customers to put their VM data protection on auto-pilot. This, in turn, simplifies data protection management, providing the risk-free environment that any enterprise would need. NAKIVO Inc., a software company, is growing quite fast in the virtualization and cloud backup spectrum by bringing unique and valuable solutions to the enterprise world.

    Data Protection Automation

    Source: https://www.nakivo.com/customers/success-stories/#

    The latest data protection automation solution is part of NAKIVO’s Backup & Replication v8.1, released on the 3rd. It is a big step in helping businesses manage their data protection chores through automation, greatly reducing manual intervention. In addition, the release includes universal recovery of any application objects. The two key points to note about this release are (a) Policy-Based Data Protection and (b) Universal Object Recovery. As we all know, managing a large VMware, AWS, or Hyper-V infrastructure is a difficult task. NAKIVO Backup & Replication v8.1 therefore focuses on Policy-Based Data Protection: customers can now create backup, replication, and backup copy policies to fully automate data protection processes with easy configuration and a good amount of flexibility. The policy parameters can be VM name, location, size, tag, power state, or any combination of these, as sketched below.
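    As a hedged illustration of how such policy criteria might select VMs, here is a small hypothetical sketch. It is not NAKIVO's actual API or rule engine; the class, field, and policy names are made up purely to show the idea of criteria-driven selection.

```python
# Hypothetical sketch of policy-based VM selection -- for illustration only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class VM:
    name: str
    size_gb: int
    tags: List[str] = field(default_factory=list)
    powered_on: bool = True

def matches_policy(vm: VM) -> bool:
    """Example policy: protect powered-on VMs tagged 'production' or larger than 500 GB."""
    return vm.powered_on and ("production" in vm.tags or vm.size_gb > 500)

inventory = [
    VM("sql-prod-01", 800, ["production"]),
    VM("dev-sandbox", 120, ["dev"], powered_on=False),
    VM("web-prod-02", 200, ["production"]),
]

to_protect = [vm.name for vm in inventory if matches_policy(vm)]
print(to_protect)   # matching VMs would be added to the backup/replication jobs automatically
```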

    Data Protection Automation Meets An Ultimate Solution

    Once the policies are set up, the system scans the whole infrastructure for VMs that match the criteria and ensures complete protection of those VMs in an automated manner. This means all critical VMs and EC2 instances can be given complete protection with almost zero manual input, which is a remarkable achievement. Similarly, Universal Object Recovery lets customers recover any application objects back to the source, a pre-defined location, or a physical server, saving a lot of the valuable time and resources an enterprise spends on restoration. Going a step further, the customer can recover individual items from any application or file system by mounting VM disks from backups to a target recovery location. The best part is that it doesn’t require restoring the entire VM first.

    Bruce Talley, CEO of NAKIVO Inc., says, “We are expanding our product’s functionality to further improve reliability, flexibility, and ease-of-use. Policy-Based Data Protection in v8.1 is yet another significant step in this direction. By fully automating core data protection tasks, NAKIVO Backup & Replication minimizes the possibility of human error and helps customers gain more confidence in their data protection strategies.” NAKIVO is the winner of “Best of VMworld 2018” and the Gold Award for Data Protection, which itself speaks to its consistent growth and the path-breaking VM backup and site recovery solutions it brings to the table. And now a data protection automation solution makes it a consistent pioneer in this field.

    Data Protection Automation is the need of the hour

    Honda, China Airlines, and Coca-Cola are a few names in NAKIVO’s customer list, which includes a large number of enterprises worldwide. Name a global-standard storage system and it is fully supported by the NAKIVO solution. So whether you are using a storage system like Synology, Western Digital, QNAP, ASUSTOR, or NETGEAR, and/or a high-end deduplication appliance like Dell/EMC Data Domain or NEC HYDRAstor, you can be assured of a 2X performance and protection advantage with the NAKIVO solution. With the new data protection automation capabilities, this performance and protection advantage increases manifold, giving enterprises a risk-free environment. The trial download is available here. To read some of the great success stories, click here. A datasheet for reference is available here.


    November 19, 2018  12:03 AM

    Quobyte Storage Platform For Data Centers @Quobyte

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data Center, Data Center Storage

    If you want to measure the depth of technology, reliability, trust, and relationships in a solution provider, you need to look at the size and scale of its customers. Quobyte is a prime example in this regard, with wide global acceptance of its state-of-the-art storage solution for data centers of any size. Earlier this year, the UK Science and Technology Facilities Council deployed Quobyte to manage the JASMIN super-data-cluster, which scales to more than 40 petabytes of storage. The reasons are its massive scalability, operational ease, and unmatched performance. JASMIN is accessed globally by thousands of users to search, manipulate, and analyze data, with around 1-3 PB of data processed on a daily basis. Quobyte’s Data Center File System unifies JASMIN’s file, block, and object storage datasets in a centralized system of more than 11,500 cores on around 600 nodes.

    Quobyte

    Source – Quobyte.com

    Earlier this year, NEC Deutschland GmbH, a leading HPC solutions provider, partnered with Quobyte to develop and deploy a complete storage solution stack for HPC workloads based on NEC HPC hardware and Quobyte’s Data Center File System software. This solution gives enterprises and researchers a cost-effective, high-performance storage system that is easy to configure in terms of performance and capacity modules, ensuring excellent operational efficiency from hyperscale to HPC workloads. Few solutions match this massively scalable modern file system for managing diverse storage architectures. It is a highly configurable system that can easily scale to hundreds of petabytes without adding administrative cost, while managing large streaming workloads and handling millions of small-file workloads. That is the beauty of this remarkable system.

    Quobyte can manage hundreds of petabytes of data easily

    Earlier this month, Actapio Inc. selected Quobyte to provide the storage platform for its data centers. Actapio Inc. is a U.S. subsidiary of Yahoo Japan Corporation and is thus commonly associated with Yahoo! JAPAN. The internet giant was seeking a massively scalable and fault-tolerant storage infrastructure, looking for the best possible data center file system and storage platform, and ultimately zeroed in on Quobyte. On November 14, at the OpenStack Summit in Berlin, Yahoo! JAPAN presented its deployment of OpenStack with Quobyte.


    November 15, 2018  11:32 PM

    DirectSearch Tackles Enterprise File Search Issue @Cloudtenna – 2

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    File management

    This is the concluding post about Cloudtenna and its uniquely innovative tool DirectSearch. You can read the previous post here. The key features of DirectSearch by Cloudtenna include AI-powered accuracy, security, cross-silo file search, granular criteria, and the sheer speed with which it finds relevant files. The product was given to 30 beta customers, including online businesses, universities, and medium to large enterprises, which are currently using DirectSearch across six services: Dropbox Business, Google Drive, ShareFile, Box Business, Jira, and Confluence. The production version will add access to Microsoft OneDrive, Outlook mail (including Office 365), Gmail, and all network drives/NAS (Network Attached Storage). Bryan Pham is the founder of Cloudtenna.

    DirectSearch

    Source Cloudtenna.com

    While Pham is an expert in cloud storage infrastructure, Cloudtenna co-founder Aaron Ganek is an expert in user experience. Seed funding of $4 million from Blazar Ventures and a strategic investment from Citrix say a lot about how promising the solution is for the international market. Andy Cohen, VP Corporate Development, Citrix, says, “Citrix is pleased to be part of the Cloudtenna investment round. Cloudtenna has outstanding new technology in intelligent search and data analytics and we are excited to be part of this round with Cloudtenna.” The talent in the organization includes top Silicon Valley professionals from companies like Rhapsody Networks, Symantec, NetApp, Oxygen Cloud, Sun Microsystems, Fusion.io, VERITAS, and EMC, and the engineering team includes key contributors to the NetApp WAFL and VxFS code bases. Future developments for DirectSearch include file management, auditing, analytics, and e-governance. The tool is already serving many enterprises.

    DirectSearch Works On Outstanding New Technology

    Pham who also serves as Cloudtenna CTO says, “Today’s workers are using literally dozens of file repositories, which has become a critical problem not only for individual productivity but for corporate IT departments and for vendors of platforms that would benefit from improved search functions. There are incomplete point solutions that partially solve a piece of the puzzle, but DirectSearch will revolutionize the way people find and work with files day in and day out.”

    “Cloudtenna’s technology is viable today but also has broad implications for file and data management in a modern enterprise that combines local, cloud, BYOD, and SaaS assets,” says Robert Poulin, founding partner, Blazar Ventures. “We look forward to advising Cloudtenna on its enterprise, OEM, and direct market strategies so it realizes the full benefit of what it has to offer.”


    November 15, 2018  11:24 PM

    Cloudtenna Tackles Enterprise File Search Issue @Cloudtenna – 1

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    File management

    DirectSearch is a uniquely innovative product that aims to solve a big, universal problem faced by almost all enterprises. I remember how difficult it used to be to ascertain the latest version of a file when different versions lived in different locations, which could be a cloud, a centralized file server, or local machines. So many working hours would go to waste searching for an important file that required immediate action. This is still quite common in most enterprises. Cloudtenna, a California-based software startup, has launched DirectSearch to overcome this problem in a phenomenal manner. The tool searches files across multiple platforms at tremendous speed, covering clouds, local servers, remote servers, and local client machines, and the searches take a stunning 400-600 milliseconds. The growing issue of file sprawl in enterprise file search is also a result of enterprise data increasing at a fast pace.

    Cloudtenna

    Source Cloudtenna.com

    The new technology that Cloudtenna brings is already making waves in the news. Premium news and research companies like ZDNet, Forbes, The Register, Forrester, and many more are talking about this powerful, and probably most efficient, tool. It is a common scenario for enterprises to have files spread across on-premises repositories, cloud file storage services, and hosted web servers. DirectSearch leverages the power of machine learning and artificial intelligence, along with natural language processing (NLP) and rigorous automation, to create this innovative, all-new search engine, which finds files scattered across network drives, cloud storage, email apps, and various hosted collaboration suites. More storage locations obviously mean more chaos and inefficiency. As per a report by IDC, a person spends more than 2.5 hours per day searching for files, which comes out to around 30% of the working day.

    Cloudtenna Leverages New Age Technologies To Build DirectSearch

    This IDC report depicts that an enterprise with 1,000 knowledge workers would waste around $50,000 per week, accumulating to $2.5 million a year, just on locating and retrieving information. If an enterprise is able to save that stupendous cost, it could see a phenomenal return in productivity and revenue.
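    The arithmetic behind those figures checks out roughly as follows; note that the $50,000-per-week estimate itself comes from the cited IDC report, not from this calculation.

```python
# Rough arithmetic behind the time-share and annual-cost figures quoted above.

hours_searching_per_day = 2.5
working_day_hours = 8
print(f"{hours_searching_per_day / working_day_hours:.0%} of the day")  # ~31%, i.e. "around 30%"

weekly_waste = 50_000        # $ per week for 1,000 knowledge workers, per the cited report
weeks_per_year = 50          # approximate number of working weeks
print(f"${weekly_waste * weeks_per_year:,} per year")  # $2,500,000 -- the ~$2.5 million figure
```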

    The second part of the article concludes in the next post. Continued »

