Quality Assurance and Project Management

November 12, 2019  5:00 PM

How Does Cybersecurity Evolution Reshape the CISO/Security Vendor Relationship? @cynet360

Profile: Jaideep Khanduja
CISO, cybersecurity

Any kind of evolution calls for realignment, reexamination, a change from the existing, or discovery of the new. Why? Because change demands change. As the severity of threats and vulnerabilities grows, cybersecurity itself must evolve. The same old methods and concepts no longer hold good in today’s environment of new threats and risks. The evolution of cybersecurity, in fact, inevitably remodels the CISO/security vendor relationship.

As corporate dependency on technology and information systems has grown to the core of business processes, the security needs of IT infrastructure and business applications have taken a new shape. They demand a complete reassessment, fresh recognition, and a new perspective on the ongoing challenges of defending corporate infrastructure. Keeping all this in mind, Cynet is showcasing a new series of videos on the trials and tribulations of cybersecurity.

The IT department of an enterprise needs to understand the wide gap between what vendors promise and which corporate pains remain unattended. The key focus of this video series is educating an organization's technology experts on how vendor solutions fall short of actual deliverables in a crisis, despite carefully drawn service level agreements. A new approach, therefore, is essential: one that combines technology with a team of security experts to deliver a highly effective defensive strategy. The CISO, sitting at the top of the ladder when it comes to cybersecurity in an organization, must be equipped with powerful tools and solutions to defend the organization against breaches, because ultimately he or she will be held responsible for failed efforts. There is always something more to be done beyond what they are already doing.

Even after deploying the best solutions, these top IT security experts can't afford to sit idle. They must continuously strive to enhance their IT security defenses through actions such as educating the workforce, upskilling security teams, and carefully selecting protective products to insulate their enterprise against some of the most notorious attacks. It is, in practice, quite difficult for CISOs and their teams to vet IT security solutions against the threats and vulnerabilities they tackle regularly. Every solution has its own sphere of coverage across a variety of environments, applications, and risks, so no single product resolves cyber threats in a definitive manner. It is thus important to get security teams out of that vicious circle. To that end, Cynet has launched a new video series (https://www.cynet.com/ciso-vs-security-vendor/?utm_source=thn) to address the concerns of CISOs when managing cybersecurity vendors.

Many CISOs find themselves helpless against the high-pressure sales tactics of cybersecurity vendors, who come with over-hyped claims of protection, extremely complex operations, and a lack of automation. Each of these areas is an actual pain point for IT professionals working in the security industry. Through these videos, Cynet asks CISOs to re-evaluate their approach to cyber defense.

Dori Harpaz, VP, Marketing for Cynet says, “Our newest generation technology challenges the common misperception that cybersecurity solutions are ineffective or too complicated to leverage the benefits. Cynet’s radical approach simplifies and converges cyber defense so organizations can quickly and easily handle cyber-attacks and remain focused on what they do best – their business.”


Source: A clip from one of the Cynet videos on cybersecurity

Enterprises need a more holistic cybersecurity solution than what is currently available in the market. Cynet has come out with an Automated Breach Protection Platform to safeguard the complete IT environment of an organization through a simple yet powerful integrated control dashboard, backed by a top-tier Security Operations Center (SOC) team. The company has built a unique combination of advanced technology and a security support infrastructure. Its Cynet 360 software, empowered with Sensor Fusion technology, collects all activity signals and analyzes them together in real time to determine the true context of each activity, based on which it enforces precise breach-prevention actions.

Cynet 360's Sensor Fusion technology begins with a sensor array that monitors file, process, memory, network-traffic, and user-account activity for exceptional behavior or patterns, both individually and after fusing the results. Analyzing the fused findings lets it deliver automated, precise actions that provide more complete protection than comparable solutions or platforms. As an outcome, enterprises get precise monitoring and control, attack prevention and detection, and immediate response orchestration for the highest order of breach prevention.
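The fusion idea above can be sketched in a few lines. This is a purely illustrative toy, not Cynet's actual algorithm; the sensor names mirror the article, while the scores, the weights, and the 0.5 threshold are all assumptions.

```python
# Illustrative sketch (NOT Cynet's actual implementation): fuse
# per-sensor anomaly scores into one decision, instead of alerting
# on any single signal in isolation.

# Hypothetical per-sensor anomaly scores in [0, 1] for one activity.
signals = {
    "file": 0.2,       # file-activity monitor
    "process": 0.7,    # process-behavior monitor
    "memory": 0.1,     # memory-access monitor
    "network": 0.8,    # network-traffic monitor
    "user": 0.6,       # user-account monitor
}

# Assumed sensor weights; a real product would tune or learn these.
weights = {"file": 0.15, "process": 0.25, "memory": 0.15,
           "network": 0.25, "user": 0.20}

def fused_score(signals, weights):
    """Weighted combination of the individual sensor scores."""
    return sum(weights[name] * score for name, score in signals.items())

score = fused_score(signals, weights)
# No single sensor is conclusive here, but the fused context crosses
# the (assumed) threshold, so an automated action would fire.
if score >= 0.5:
    print(f"fused score {score:.2f}: trigger automated breach-prevention action")
else:
    print(f"fused score {score:.2f}: log and continue monitoring")
```

The point of the sketch is the shape of the logic: individually weak signals (process 0.7, network 0.8) combine into a confident verdict that isolated per-sensor thresholds would miss.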

The Cynet 360 technology platform is supported by the CyOps 24/7 cyber SWAT team at no additional cost. This gives customers the crucial cybersecurity staff required to keep up with a fast-moving threatscape. The CyOps team includes security analysts, researchers, and experts capable of providing incident response, forensics, threat hunting, and malware analysis.

November 10, 2019  9:49 PM

Why Is an Internal/In-House Proxy Operation No Longer Viable?

Profile: Jaideep Khanduja

There is a misconception among enterprises that operate their own server networks in the cloud. They believe that having access to global IP distribution helps them a lot. It doesn't. I am writing this so that the CEOs and CTOs/CIOs of such enterprises will understand why. Allow me to explain and clarify why you should select an external IP proxy network operator rather than operate your own proxy hub.

Why do many enterprises follow the rising trend of outsourcing? Here are a few basic reasons. First of all, it saves costs and creates an environment for faster business growth. It also helps improve overall results for business finances and business reputation. In fact, on the technology front, it creates an entirely new paradigm within an enterprise.

As technology changes at a very fast rate, competition among peer businesses becomes tougher day by day. In days gone by, it was not as hard for a business to survive, sustain itself, and grow. Back then, when organizations needed access points or static IPs to openly access the web from various locations across the globe, the easiest solution was to lease cloud servers from major cloud providers like AWS and Azure, and to lease a pool of IPs from their respective Internet service providers (ISPs). Similarly, when there was a need for mobile IPs, building a lab of devices and SIM cards was the solution. In a nutshell, each organization once built its own in-house proxy network.

in-house proxy

Source: luminati.io

Recent research from Luminati Networks (the largest IP proxy network operator) indicates that this was largely a waste of time, money, and energy, without achieving any useful results.

Today all organizations realize that the management of such an infrastructure is a costly affair in terms of capital expenditures and operative costs. In addition to the costs, the infrastructure consumes a lot of time for their human resources. Outsourcing is easier and it saves significantly on both fronts.

There are overall 4 key reasons for selecting an external IP proxy network operator versus an in-house one:

#1 Cost: I remember when, in my last organization, we needed a large pool of static IPs and our ISP was unable to meet the requirement because of a shortage of IPv4 addresses. At the same time, whatever static IPs we did have in our pool carried a humongous recurring cost per annum. Today the situation is no different. IPv4 addresses are very costly, as much as $20 per IP. A small business that wants to create an in-house proxy network can expect to shell out around $15k on a first-time IP purchase and then continue spending at a recurring rate of around $10k per month merely to cover the operating costs of the engineers and servers needed. The only feasible solution here is to outsource at a much lower cost.

Leasing a smaller number of IPs carries an even higher per-IP price tag, as much as four times the rate of a larger pool.

Another major hiccup is that leasing is typically done on an annual basis. What if you have a shorter requirement, or a dynamic requirement where the number of IPs you need keeps changing? What would you do then?
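The rough arithmetic above can be sanity-checked with a tiny cost model. The $20-per-IP and ~$10k-per-month figures come from the discussion above; the 750-IP pool size, chosen so the upfront purchase lands near the quoted $15k, is an assumption.

```python
# Rough, illustrative first-year cost model for an in-house
# static-IP proxy pool. All figures are ballpark assumptions.

IP_UNIT_COST = 20          # USD per IPv4 address (approximate market rate)
POOL_SIZE = 750            # hypothetical pool size -> ~$15k upfront
MONTHLY_OPEX = 10_000      # USD/month for engineers and servers (assumed)

def in_house_first_year_cost(pool_size=POOL_SIZE,
                             ip_cost=IP_UNIT_COST,
                             opex=MONTHLY_OPEX):
    """Upfront IP purchase plus twelve months of operating cost."""
    return pool_size * ip_cost + 12 * opex

print(f"first-year in-house cost: ${in_house_first_year_cost():,}")
# -> first-year in-house cost: $135,000
```

Even under these conservative assumptions, operating costs dwarf the one-time IP purchase within the first two months, which is the core of the cost argument for outsourcing.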

#2 Resources: Buying is only the first hurdle. Even if you cross it by convincing your head of finance and getting approvals from senior management, the recurring cost of managing an in-house global proxy network will be a killer in terms of time and resources. You will need to source servers from data centers in all required locations. Expertise will be required to set up those servers, update configurations, and install all the software an enterprise needs for its operations. Routing through a third-party upstream provider to manage IPs and servers is another large-scale task. The story doesn't end there: each IP's geolocation must be continuously updated or fixed in the various databases that your target domains reference.

All this will require daily monitoring of your in-house proxy network. Above all, you need to refresh IPs regularly by replacing IP subnets and setting them up again with the appropriate routing, geolocation, databases, and so on.

#3 Diversity: Most commonly, the smallest subnet leased is a ‘/24’ of 254 usable IPs. This means any small or medium in-house proxy network will be struggling with one of the following situations:

a. An unnecessary but obvious increase in the overall proxy expense.

b. A larger proxy network than was initially planned which is required for its diversity.

c. You may have a network of the right size but with little or no diversity at all. Low diversity creates a larger risk of getting blocked: when one or more of your subnets is blocked, a big chunk of your proxy network becomes unusable, whereas in a highly diversified network the same block would affect only a small portion of the subnets.
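The /24 arithmetic behind this point can be verified with Python's standard library: the block holds 256 addresses, of which 254 are usable hosts (the network and broadcast addresses are reserved).

```python
# Verify the subnet math with the stdlib ipaddress module.
import ipaddress

net = ipaddress.ip_network("203.0.113.0/24")  # documentation address range
print(net.num_addresses)        # 256 total addresses in a /24
print(len(list(net.hosts())))   # 254 usable host IPs
```

So the minimum leasable unit really is a ~254-IP chunk, which is why small pools are forced into the awkward size-versus-diversity trade-off described above.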

#4 Flexibility: You have to keep up with rapidly changing market dynamics to keep your business fit for survival. Yet in an in-house setup, requirements like changing the infrastructure, replacing blocked IPs, or testing new IPs can never be handled in a short period. It always takes weeks or months! No change is simple; every one demands expert resources and a large amount of time.

Looking at all the factors above, it is clear that outsourcing your web-access points or internal proxy network makes a lot of sense in terms of business continuity and risk mitigation, as well as operationally and commercially.

This article was based on recent research conducted by Luminati Networks, the world's largest proxy operator, dedicated to enabling open-source data collection for businesses. Luminati provides global businesses, companies, and brands with a transparent view of the internet, no matter where in the world they are based.

October 31, 2019  10:17 PM

Data Science for Business: A Must Read Book @Amazon

Profile: Jaideep Khanduja
Data Science

Data Science for Business: What You Need to Know about Data Mining and Data-Analytic Thinking

Excerpt from Amazon.com

Written by renowned data science experts Foster Provost and Tom Fawcett, Data Science for Business introduces the fundamental principles of data science and walks you through the “data-analytic thinking” necessary for extracting useful knowledge and business value from the data you collect. This guide also helps you understand the many data-mining techniques in use today.

Data Science

Source: Amazon.com

Based on an MBA course Provost has taught at New York University over the past ten years, Data Science for Business provides examples of real-world business problems to illustrate these principles. You’ll not only learn how to improve communication between business stakeholders and data scientists but also how to participate intelligently in your company’s data science projects. You’ll also discover how to think data-analytically, and fully appreciate how data science methods can support business decision-making.

Understand how data science fits in your organization—and how you can use it for competitive advantage
Treat data as a business asset that requires careful investment if you’re to gain real value
Approach business problems data-analytically, using the data-mining process to gather good data in the most appropriate way
Learn general concepts for actually extracting knowledge from data
Apply data science principles when interviewing data science job candidates

Editorial Reviews
“A must-read resource for anyone who is serious about embracing the opportunity of big data.”
— Craig Vaughan
Global Vice President at SAP


“This book goes beyond data analytics 101. It’s the essential guide for those of us (all of us?) whose businesses are built on the ubiquity of data opportunities and the new mandate for data-driven decision-making.”
— Tom Phillips
CEO of Media6Degrees and Former Head of Google Search and Analytics

“Data is the foundation of new waves of productivity growth, innovation, and richer customer insight. Only recently viewed broadly as a source of competitive advantage, dealing well with data is rapidly becoming table stakes to stay in the game. The authors’ deep applied experience makes this a must-read–a window into your competitor’s strategy.”
— Alan Murray
Serial Entrepreneur; Partner at Coriolis Ventures

October 30, 2019  1:13 PM

PASS Summit 2019 In Seattle, Washington November 5-8 @DH2i

Profile: Jaideep Khanduja
Clustering, Clustering/High availability, DH2i, High Availability, SDP, Software defined networks, Tunneling

If, by any chance, you are attending PASS Summit 2019, taking place November 5-8 in Seattle, Washington, don't forget to visit DH2i at Booth 118. There, DH2i officials are showcasing how you can ensure always-secure and always-on IT and business infrastructure across your entire enterprise. There are live demos, exhibits, conference sessions, and some swag to take away. Live demos by DH2i include DxOdyssey, DxEnterprise, Secure Network Micro-Tunneling, Multi-Platform Smart Availability, and DxAG availability group clustering software. There is plenty to learn at PASS Summit 2019. In fact, this is a golden chance to discuss your enterprise technology architecture and get key insights on how to optimize and improve your current design. DH2i is one of the top providers of multi-platform Software-Defined Perimeter (SDP) and Smart Availability software, available on Windows as well as Linux.

PASS Summit 2019

Source: DH2i

At PASS Summit 2019 you can get deep insights into DH2i's software products DxOdyssey and DxEnterprise. These products empower customers to build an IT infrastructure around a simple mission: ‘Always Secure and Always On’. DH2i is an exhibitor and sponsor at PASS Summit 2019. This summit, as you may already know, is the world's largest and most exhaustive technical training conference for Microsoft SQL Server and data professionals. Live demonstrations will run throughout the event, covering DH2i's industry-leading data security, high availability (HA), and disaster recovery (DR) software solutions. These include DxOdyssey, which provides secure network micro-tunneling. You can learn how to create lightweight, scalable, highly available, and discreet “secure-by-app” connectivity between your on-premises and/or cloud environments running on Windows and Linux. Interestingly, you can build these connections without a VPN or direct link.

PASS Summit 2019

This software builds an entirely safe technology environment in your organization without any compromise in quality, security, or performance. If you have any queries, you can easily get answers by visiting DH2i Booth 118 at PASS Summit 2019. DxConnect secures remote access to DxOdyssey tunnels: you can easily deploy a Software-Defined Perimeter (SDP) that secures network connectivity between your main sites of operation and remote users working anywhere in the world. DxEnterprise provides multi-platform Smart Availability: you can manage workloads at the instance level and as Docker containers, gaining instance mobility from any host to any host, anywhere, with just an application or container stop and restart. DxAG, the availability group clustering software, builds highly available SQL Server Availability Groups across Windows and Linux nodes, without WSFC or Pacemaker limitations.

You can claim $100 off your PASS Summit 2019 registration here: https://dh2i.com/webinars/

October 30, 2019  10:28 AM

SwiftStack 7: A New Paradigm In Data Storage and Management @SwiftStack

Profile: Jaideep Khanduja
Data Management, Data storage, Petabyte, SwiftStack

Any enterprise or organization thinking about or working towards being more data-driven should take a close look at SwiftStack 7. Just eight years ago, SwiftStack was founded by some of the frontrunner experts in cloud computing; by now, it is a widely acknowledged leader in data storage and management solutions and has already transformed many enterprises and organizations. Companies across industries have acknowledged SwiftStack as the best solution for managing petabytes of unstructured data from the edge to the core to the cloud. Whether the project is in deep learning, analytics, or scientific research, or is about leveraging large asset repositories, SwiftStack is a strong solution. SwiftStack 7 comes with unique features that set it apart from the competition, providing ultra-scale performance and capacity for data storage and management from edge to core to cloud.

The SwiftStack 7 platform is built for intelligent data, delivering petabytes to any AI framework, GPU compute complex, or deep learning system. This is a remarkable enhancement to its data platform for data utilization, performance, and services at ultra-scale. The new platform ingests and processes 4K/8K video effortlessly, along with other compute-intensive workloads. Most enterprises, industries, and businesses across the globe are unable to tackle emerging data-intensive applications that need a modern storage architecture and the ability to feed thousands of GPUs working in parallel, all from a single global namespace comprising edge, core, and cloud data clusters. SwiftStack 7 is designed to tackle these emerging data-intensive applications flawlessly. Its new features include ultra-scale performance, data immutability, workflow integration, and distributed flash-based caching, all with standards-based APIs.

SwiftStack 7

George Crump, Chief Steward at Storage Switzerland says, “Data scientists are constrained by inherited infrastructure, particularly in performance, scalability, cost, metadata enrichment, workflow integration, and portability to accommodate data at the edge, core, and cloud. Large enterprise customers need to architect a data pipeline and framework to deliver business outcomes and intelligence, and SwiftStack’s software is a strategic component supporting these modern applications.”

SwiftStack 7

Source: SwiftStack

Joe Arnold, SwiftStack founder and chief product officer, says, “Right now, data is changing the world, applications can exist anywhere from the edge to the core data center to the cloud, and data management and control have been decoupled from core infrastructure. Our customers are pushing the boundaries in demanding environments, such as deep learning, and SwiftStack 7 is the foundation for delivering performance, capacity, and services at scale.”

This video will help you gain more insights:

This would also make an interesting read: Anatomy of SwiftStack 1space

October 28, 2019  7:13 PM

IP Proxy Network To Strengthen Your Business @luminati_io

Profile: Jaideep Khanduja
Internet protocol, Open source, Proxy

In July 2019, Frost & Sullivan published a report, ‘Global IP Proxy Networks Market’. The crux of the report is that IP proxy networks are going to be a major determinant in the existence and growth of any online business. The global IP proxy network market, currently standing at $76 million, is set to grow at a CAGR of almost 17% and touch $260 million by 2025. The use of IP proxy networks is becoming essential for most businesses. All online retail companies, for instance, need to study their competitors' pricing for the commodities they are launching or selling, analyze and compare prices, and then position their items at a better price to gain an edge. This can happen only if their programs can capture this customized information the way a real local user would see it.

IP Proxy Network

Source: Luminati.io

Basically, with the help of an IP proxy network, the system builds an instance that behaves as a user in a particular geographic location. Companies not using IP proxy networks run a very high risk of collecting inaccurate data that will keep them lagging behind the pace, gaining them no business or customers. Luminati is the world leader in this technology. The company was launched in 2014, and in a short span it has become the world's largest proxy network operator, with the sole aim of providing open-source data collection. The technology it developed has the power to route internet traffic via different touchpoints on a global network, enabling brands to position their products as the best-value option.
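Mechanically, routing a request through such a proxy looks something like the sketch below. It uses only the Python standard library; the proxy endpoint is a placeholder, not a real Luminati address, and a real service would also require credentials.

```python
# Minimal sketch of a price-check request routed through a proxy, so the
# target site sees a visitor from the proxy's geographic location.
# The proxy address below is a placeholder, NOT a real endpoint.
import urllib.request

PROXY = "http://proxy.example.com:22225"  # placeholder proxy endpoint

def make_proxied_opener(proxy=PROXY):
    """Build an opener whose HTTP(S) traffic exits via the proxy."""
    handler = urllib.request.ProxyHandler({"http": proxy, "https": proxy})
    return urllib.request.build_opener(handler)

def fetch_local_price(url, opener=None, timeout=10):
    """Fetch a product page as if browsing from the proxy's geography."""
    opener = opener or make_proxied_opener()
    with opener.open(url, timeout=timeout) as resp:
        return resp.read().decode("utf-8", errors="replace")

# html = fetch_local_price("https://competitor.example.com/product/123")
# ...then parse the competitor's local price out of the returned HTML.
```

The value of a commercial proxy network over this bare sketch is the pool itself: thousands of rotating, geographically diverse exit IPs, so the responses reflect what a genuine local shopper would actually see.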

IP Proxy Network

Luminati's first-of-its-kind enterprise IP proxy network empowers its customers to collect the most realistic and accurate competitive intelligence, helping them manage their businesses most effectively. This technology is helping a large number of businesses, companies, enterprises, brands, and product aggregators get an open and transparent view of the internet, irrespective of their geographic location, devices, or service provider. There is no other technology that achieves such real-time and accurate results. With a staff of 120, Luminati is managing more than 10,000 customers across the globe. Enterprise proxy networks are increasingly important for creating network transparency. Use cases include data collection, price comparison, fraud protection, brand protection, application performance, ad verification, talent sourcing, account management, and cybersecurity, to name a few.

The power of an IP proxy network is huge. It all depends on how businesses choose to explore it, and how much they want to harness it to manage their businesses efficiently.

October 27, 2019  8:49 PM

Micron Technology Launches Deep Learning Solutions @MicronTech

Profile: Jaideep Khanduja
Artificial intelligence, Deep learning, Machine learning, Micron, Semiconductor

Micron Technology Inc. has clearly emerged as a global market leader in memory and storage solutions and is the fourth-largest semiconductor company in the world. Its recent acquisition of FWDNXT gives it a clear-cut edge in the artificial intelligence domain. FWDNXT is a software and hardware startup focused on artificial intelligence technology. With this significant acquisition, Micron gains a comprehensive AI development platform. FWDNXT's technologies enable Micron to create key building blocks for innovative memory and AI workloads. Soon after the acquisition, Micron came up with a unique and powerful high-performance combination of hardware and software tools for deep learning applications. It is the newly formed combination of advanced Micron memory with FWDNXT's AI technology that made it possible to explore deep learning solutions, making data analytics more meaningful, especially in IoT and edge computing. This is a major breakthrough.

Deep Learning

Source: Micron Technologies Inc.

Global trends clearly show that companies worldwide are developing, or intend to develop, more complex AI and machine learning systems. That, in turn, increases the demand for and importance of the hardware used to train and run those models. Micron brings memory and compute together with its Deep Learning Accelerator (DLA) technology. Efficient, high-performance hardware and software solutions based on deep learning and neural networks have made this more advantageous for businesses. The Micron DLA technology is powered by the AI inference engine developed by FWDNXT. This enabled Micron to launch tools to observe, assess, and develop innovation that brings memory and computing together, enhancing performance and lowering power consumption. Micron's DLA technology empowers enterprises with an easy-to-use, software-programmable platform that works with a wide range of machine learning frameworks and neural networks.

Deep Learning

Micron Executive Vice President and Chief Business Officer Sumit Sadana said, “FWDNXT is an architecture designed to create fast-time-to-market edge AI solutions through an extremely easy to use software framework with broad modeling support and flexibility. FWDNXT’s five generations of machine learning inference engine development and neural network algorithms, combined with Micron’s deep memory expertise, unlock new power and performance capabilities to enable innovation for the most complex and demanding edge applications.”

October 24, 2019  2:48 PM

PayGo @SIOSTech Ensures High Availability of SQL Server in AWS Cloud

Profile: Jaideep Khanduja
Failover Cluster Management, Failover Clustering, High Availability, SIOS, SQL Server

SIOS Technology Corp. has quickly established itself among the major players in the global market for high availability and disaster recovery solutions. SIOS solutions are among the top for a simple reason: they ensure availability and eliminate data loss for critical Windows and Linux applications running across heterogeneous enterprise environments, whether virtual, cloud, physical, or hybrid cloud. SIOS clustering software has become essential for enterprises whose applications need a high degree of resiliency and near-100% uptime without compromising performance or data, protecting their business ecosystem from local failures and regional outages, planned or unplanned. SIOS customer PayGo ensures the high availability of SQL Server in the AWS cloud with SIOS DataKeeper.

SIOS Technology Corp., established in 1999 and headquartered in San Mateo, California, has offices worldwide. Customers prefer SIOS for its dependable operation, high performance, and ease of use, which have made SIOS an industry pioneer in providing IT resilience through intelligent application availability, mitigating a large number of associated risks. PayGo uses SIOS DataKeeper on Amazon Web Services (AWS), running on Elastic Compute Cloud (EC2) virtual servers with solid-state drive (SSD)-only storage. The result is a rapid, automatic failover environment that ensures high availability (HA) for the enterprise's key business applications on SQL Server. PayGo is an integrated utility payment solution provider that handles the largest energy-company prepay ecosystem in the United States.


PayGo is currently running four production environments in the Amazon cloud, with a fifth expected soon. The environment comprises SQL Server 2017 Standard Edition running on Windows Server 2012 R2, with a migration to Windows Server 2019 planned once testing is complete. Chad Gates, Senior Director of Infrastructure and Security at PayGo, says, “Our backend SQL Servers hold terabytes of data that must be available 24×7. As a Windows shop, we prefer to use Windows Server Failover Clustering (WSFC) for data protection and continuous operation in case of any failures. But WSFC requires some form of shared storage, like a storage area network (SAN), and that isn’t natively available in AWS.”

Chad concludes, “SIOS DataKeeper Cluster Edition overcame the problem caused by the lack of shared storage. Its use of a mirrored drive looks like shared storage to the WSFC. It was exactly what we wanted.” Interestingly, SIOS DataKeeper also met PayGo’s other three criteria better than any other solution. On the results achieved, Chad says, “We have been using SIOS DataKeeper for several years now, and it has proven to be the most rock-solid piece of software we have.”


Source: SIOS

Frank Jablonski, VP of global marketing, SIOS Technology says, “Whether you need to protect applications on a physical server, a private cloud, a public cloud or a hybrid cloud, you need to meet the same SLAs for application availability regardless of location. Applications running in clouds also need to be protected against the inevitable cloud outage through the use of availability zones and regions with automated intelligent failover. PayGo is using SIOS to provide a fast, easy way to deploy applications in a high availability environment in the AWS cloud while continuing to use Windows Server Failover Clustering.”

October 23, 2019  4:38 PM

PathWave Test 2020 Software Suite @Keysight For Rapid Product Development

Profile: Jaideep Khanduja
5G, 5G and IoT, automotive, IIoT, product development

Keysight Technologies is among the global pioneers in technology. It empowers enterprises, governments, and service providers to innovate to connect and secure the world. The solutions provided by Keysight Technologies enhance network optimization and help customers launch electronic products faster and at a lower cost. All this is possible because of Keysight's design simulation, prototype validation, manufacturing test, and, as already stated, optimization of networks and cloud environments. Any enterprise engaged in such activities is well aware of the power of Keysight solutions. Its key customer verticals comprise communications, aerospace and defense, energy, automotive, general electronics, and semiconductors, with customers across all geographies. In fiscal year 2018, Keysight revenues were around $4B. To accelerate the time-to-market of 5G, IoT, and automotive electronics, Keysight Technologies has launched the PathWave Test 2020 software suite to enable rapid product development.

PathWave Test 2020 software suite

Source: Keysight Technologies

The PathWave Test 2020 software suite has been developed on the Keysight PathWave software platform. The suite promises to deliver an integrated experience that accelerates time-to-market for digital and wireless platforms and products, so it is definitely going to help electronics manufacturers in a big way. 5G, IoT, and automotive engineers and developers can leverage the power of the PathWave Test 2020 software suite to streamline test data processing and analysis. This further speeds product introductions and helps secure a competitive edge in local or global markets, as the case may be. The software enables data sharing and management between multiple platform software tools that include test automation, signal creation, advanced measurement, and signal generation. All this leads to a very useful ecosystem of data analytics.

PathWave Test 2020 Software Suite

The integrated PathWave Test 2020 software suite enables engineers to develop and deploy application-tailored solutions that substantially enhance electronic test workflows and product introductions. Jay Alexander, Chief Technology Officer, Keysight Technologies says,

“The digital transformation happening today in engineering enterprises relies on accelerating time-to-market using best in class software and hardware. Keysight’s PathWave Test 2020 software suite reflects our commitment to creating powerful software solutions that help our customers streamline their workflows.”

October 21, 2019  10:26 PM

Edge Computing: Are We Back To Square One? #EdgeComputing

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Edge computing

Around four decades ago we didn’t have networking. There were standalone personal computers, one each in HR, purchase, finance, and so on. There were no laptops, no mobility. If finance needed some data from the PC in HR, the data had to move with the help of floppy disks. Individual computers, individual computing and processing, individual reports. Then came the concept of networking and a centralized server for data and computing. Machines started talking to each other. Floppies were still needed, but more for moving data to a computer outside the network. With this, organizations started relying more on computers, and thus came business applications, CRM, ERP, and so on. The need for data security and network security came into existence. Hubs changed to switches, switches changed to intelligent switches, routers came in to connect different networks, floppies changed to CDs, CDs to DVDs, and so on.

Edge Computing

Photo credit: jurvetson on Visualhunt / CC BY

The whole game changed by then. The Internet was a revolution. So was the cloud. For the current generation, mainframes, magnetic drums, magnetic tapes, and punch cards are all history; none of them has seen these machines. Instead, they are into artificial intelligence, the internet of things, analytics, big data, virtual reality, machine learning, and the like. All of a sudden, with the need for edge computing, aren’t we rolling back? It’s edge data and edge computing that are providing more relevant information to businesses. ERPs and CRMs are taking back seats. They are not hot anymore; they are there by default. But business needs have changed. What about the security and safety of edge devices? Computing, processing, encryption: how are these being handled at the edge? What about data moving from the edge to data servers? What about the risks of so many things happening on edge devices?

Edge Computing

Aren’t we getting more exposed? More so with heterogeneous devices, mobility, BYOD, and the rest. What about disaster recovery plans for the edge, when edge devices and edge computing are becoming the core lifeline of businesses? Is our focus shifting to wasteful activities, or are we moving in the right direction? Is knowing the coordinates of a mobile device more important than its data? What about the change in the whole paradigm of coding and testing, with edge computing becoming prime? Are we landing in more troubled waters or heading to higher maturity levels? If the latter, we are making machines intelligent enough to perform proactive, predictive, and prescriptive analysis. And if all intelligent work is going to be performed by machines, what will humans do? Is edge computing an ultimate solution, or is it just a stopgap?

October 8, 2019  5:19 PM

Multi-Cloud Intelligent Data Protection Solution @aparavisoftware

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Data protection, data retention, File Backup, Multi-cloud, Ransomware

Aparavi delivers hybrid and multi-cloud file backup and long-term retention solutions. The new solution Aparavi has designed addresses the varied and fast-growing unstructured data loads that organizations are trying to tackle today. Organizations across the globe are not only grappling with this issue but also craving the best possible solution for backing up files from central storage devices to cloud endpoints. That is what Aparavi has come up with. The new solution also includes data awareness for intelligence and insight, with efficient global security, search, and access. All this ensures files are protected and available. It is, in essence, a multi-cloud intelligent data protection solution. As we all know, backup has a data dump problem, and enterprises have not been able to find a solution to it. Aparavi helps solve this issue in a smooth and proven way.

The new multi-cloud intelligent data protection solution from Aparavi includes a brilliant feature, Aparavi File Protect & Insight℠, which acts as a second line of defense against ransomware. It provides file-by-file data protection and archive for servers, endpoints, and storage devices, ensuring data classification, content-level search, and hybrid cloud retention and versioning. Data awareness covers data classification, metadata aggregation, and policy-driven workflows. Global security manages role-based permissions, encryption both in flight and at rest, and file versioning. Data search and access takes care of anywhere/anytime file access, seamless cloud integration, and full-content search. Until now, all of this happened only in bits and pieces across multiple vendors and products; no single product delivered all these capabilities and features.
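Policy-driven classification of the kind described above can be pictured as matching each file’s metadata against an ordered rule set and tagging it accordingly, so that different retention or security policies can follow the tag. The rules and tags below are invented for illustration; Aparavi’s actual classification engine is far richer.

```python
# Hedged sketch: tag files by matching simple rules against their names,
# first matching rule wins, with a catch-all fallback tag.

RULES = [
    ("financial", lambda f: f["name"].endswith((".xlsx", ".csv"))),
    ("contract",  lambda f: "contract" in f["name"].lower()),
    ("general",   lambda f: True),  # fallback tag
]

def classify(files):
    tagged = {}
    for f in files:
        for tag, matches in RULES:
            if matches(f):
                tagged[f["name"]] = tag
                break  # first matching rule wins
    return tagged

files = [{"name": "q3_revenue.csv"},
         {"name": "Contract_ACME.pdf"},
         {"name": "notes.txt"}]
print(classify(files))
# {'q3_revenue.csv': 'financial', 'Contract_ACME.pdf': 'contract', 'notes.txt': 'general'}
```

Once tagged, a policy engine can route "financial" or "contract" files to immutable, longer-retention archives while "general" data follows the default backup path.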

Aparavi multi-cloud intelligent data protection solution

That is what takes the Aparavi multi-cloud intelligent data protection solution to the next level and creates a new paradigm of data protection and availability. The benefits are phenomenal: a single view of all data and device status, near-instant data discovery, automated ransomware alerts and defense, multi- and cross-cloud support, and rapid granular recovery. Let’s understand very clearly what Aparavi File Protect & Insight does: in a nutshell, it protects data, it organizes data, and it makes data usable as and when required. Recall the recent Arizona Beverages ransomware attack, in which, even after six months, an estimated 40% of the company’s servers were still operating on old, out-of-date data. What happens to a business under such circumstances is very well understood. That’s why organizations across the globe need a rugged and proven solution such as Aparavi’s.

Intelligent Data Protection

Source: Aparavi

Coming back to the Arizona Beverages ransomware attack helps in understanding the importance of the Aparavi solution. Arizona Beverages has more than 1,000 employees. More than 200 servers and computers were affected in a targeted attack, and millions of dollars per day were lost in sales. Now, what is important? Close your eyes and wait for doomsday, or get the best possible solution available in the market, with a small investment that ensures no later compromise of the organization’s finances or reputation. In this particular case, the Arizona Beverages network was hacked and encrypted. If Arizona Beverages had had a second line of Aparavi protection, it would have provided them with a mountable archive to restore data quickly, either on a file-by-file level or as the entire contents of a protected location. Files under management could be restored based on need.

Aparavi multi-cloud intelligent data protection solution

The Aparavi multi-cloud intelligent data protection solution ensures files under management can be restored based on need and left in the archive if the data is not required for the resumption of daily operations. It provides a secondary immutable copy of data from servers and endpoints. Finally, it adds intelligent classification so that risk-sensitive data can be managed more securely, rather than sitting on potentially targeted local storage. Aparavi FPI certified clouds include Amazon Web Services (AWS), IBM Cloud, Google Cloud Platform, Scality, Wasabi, Oracle Cloud, Cloudian, Caringo, BackBlaze, and Azure. Aparavi FPI can solve problems like ransomware recovery, endpoint and ROBO protection, data retention and archive, compliance and governance, and storage optimization. Can you name another single solution available in the market with the same capabilities?

Jonathan Schwam, Principal Architect, Core82 Inc. says, “The business driver for selecting Aparavi was to absolutely, positively ensure that we had immutable data for the time required.”

October 8, 2019  1:13 PM

Micron Global Development Centre Launch in Hyderabad @microntech

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Memory, Micron, NAND, SSD, Storage

The grand opening of the Micron Global Development Centre in Hyderabad by Micron Technology Inc. clearly shows their confidence in the talent pool available locally in Hyderabad. Present were some prominent top officials: Dee Mooney, Executive Director, Micron Foundation; Sanjay Mehrotra, President and Chief Executive Officer, Micron Technology Inc.; April Arnzen, Senior Vice President of Human Resources, Micron Technology Inc.; and Jeff VerHeul, Senior Vice President, Nonvolatile Engineering, Micron Technology Inc. Their presence was more than enough to indicate Micron’s seriousness about such a large-scale launch in India, and that too in Hyderabad, a fast-emerging technology hub. This is Micron’s second footprint in India; they already have an operations site in Bengaluru, launched in April 2019. Expansion at such a fast pace in India is commendable for both Micron and India. There was a high level of enthusiasm and energy all around on the huge floor.

The Micron Global Development Centre in Hyderabad aims to create a talent pool in diversified fields to gain faster breakthroughs in the latest emerging technologies like artificial intelligence, machine learning, the Internet of Things (IoT), and emerging memory. Interestingly, Micron already has more than 40,000 patents to its credit. With around 700 employees currently at this newly launched centre, the aim is to increase the strength to 2,000 by next year or so. Micron is a well-established world leader, with global-standard manufacturing plants in six countries, including the United States, Japan, Malaysia, China, and Taiwan. In India, the focus is mainly on research and development, with concrete plans to play a key role in developing the technologies behind breakthroughs in a wide range of areas like machine learning and artificial intelligence.

Micron Global Development Centre

Guests of honor at the launch of Micron Global Development Centre included KT Rama Rao, Minister of IT, Industries and Municipal Administration & Urban Development, Government of Telangana; Amitabh Kant, CEO, National Institute of Transforming India (NITI) Aayog; and S K Joshi, Chief Secretary, Government of Telangana. On the launch, Sanjay Mehrotra said,

“We’re delighted to launch our Global Development Centre in Hyderabad and expand our team of engineers, researchers, developers, and IT specialists. Leveraging global talent, like our new Hyderabad team members, helps us drive innovation and stay at the forefront of emerging memory technologies.”

With the help of a highly educated and talented workforce recruited through premium technology institutes of India, Micron aims to set very high standards to cater to the needs of its operations across the globe. The hunt for the best talent is always a top priority for Micron.

Micron Global Development Centre

The skillset of employees at the Micron Global Development Centre Hyderabad spans information technology functions and engineering, with expertise in designing and developing high-quality, cutting-edge memory and storage solutions. To get the best talent, Micron has tied up with some of the best institutes in India and has also set up ultra-advanced labs in some of them to groom talent in the right direction. Ultimately, there has to be a great team of engineers, developers, and researchers innovating to cater to the complexity of the organization’s global operations network, which includes procurement, supply chain, manufacturing, packaging, test and assembly, quality, and information technology. Micron already has a number of global centers of excellence with innovative achievements in design and product engineering for next-generation memory and high-value storage solutions, and is a world leader in this field.

A podcast about the launch can be accessed here.

October 5, 2019  10:31 PM

Analytics on Data Lake Storage Goes 70x Faster With Dremio 4 @Dremio

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Analytics, Big Data, Data lake

What matters most in the case of analytics is speed of execution, to gain data insights in almost real time and take quick, accurate decisions that help the business excel. Otherwise, all those investments in data, storage, software, and hardware fetch no real value for the business. So the key component is speed. Now, how do you achieve that? For that, you need to understand the breakthroughs happening across the globe. One of the most significant in this regard is Dremio’s Data Lake Engine, Dremio 4.0. It enables breakthrough speed for analytics on data lake storage. Mind you, this is not a simple achievement; it is something that is going to change the whole paradigm of data lake engine performance. As a matter of fact, it is a new landmark in itself that makes Dremio stand apart from other vendors.

Dremio 4.0 significantly enhances performance and provides a self-service semantic layer for data in ADLS, S3, and other data sources. Dremio is a pioneer among data lake engine companies, and the recent announcement of its data lake engines for AWS, Azure, and hybrid cloud adds a new chapter to its achievements. Dremio 4.0, an open-source platform, includes advanced columnar caching, predictive pipelining, and an altogether new smart execution engine kernel that delivers up to a 70x increase in performance. That is, in fact, phenomenal. What it brings is flexibility and control for data architects and a self-service model for data consumers. Dremio empowers companies to operationalize data lake storage like ADLS and S3. Data becomes easier to consume while provisioning the interactive performance that users strive to achieve. The Dremio 4.0 data lake engine provides ANSI SQL capabilities, including complex joins, large aggregations, sub-selects, and more.
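The ANSI SQL surface mentioned above (joins, aggregations, sub-selects) can be illustrated with a generic query. SQLite stands in here purely to keep the example self-contained and runnable; against Dremio you would submit the same kind of SQL through its ODBC/JDBC or REST interfaces, and the tables and figures below are invented.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders(id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES (1,'acme',100.0),(2,'acme',50.0),(3,'globex',75.0);
""")

# An aggregation with a sub-select: customers whose total order value
# exceeds the average single-order amount.
rows = con.execute("""
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    HAVING SUM(amount) > (SELECT AVG(amount) FROM orders)
""").fetchall()
print(rows)  # [('acme', 150.0)]
```

The point of a data lake engine is that a query like this runs directly against files in ADLS or S3, with the columnar cache and predictive pipelining hiding the object-storage latency, instead of requiring the data to be copied into a warehouse first.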

Dremio Creates New History in Data Lake Engines

Ivan Alvarez, IT vice president, big data and analytics, NCR Corporation says, “We process hundreds of thousands of transactions on a daily basis and produce insights based on those transactions; this type of capability requires sophisticated and scalable data platforms. Dremio is working with NCR to solve the integration between traditional enterprise data warehouse and scalable distributed compute platforms for big data repositories. This integration allows NCR to also cross-pollinate data engineering knowledge among platforms and most importantly to deliver faster data insights to our internal and external customers.”

Mike Leone, senior analyst, Enterprise Strategy Group says, “Organizations recognize the value of being able to quickly leverage data and analytics services to further their data-driven initiatives. But it’s more important than ever to start with a strong data foundation, especially one that can simplify the usage of a data lake to enable organizations to maximize data availability, accessibility, and insights. Dremio is addressing this need by providing a self-sufficient way for organizations and personnel to do what they want with the data that matters, no matter where that data is, how big it is, how quickly it changes, or what structure it’s in.”

What makes Dremio special is its features like Columnar Cloud Cache (C3), Column-Aware Predictive Pipelining, Gandiva GA, Single Sign-on and Azure AD, and Advanced AWS Security. Tomer Shiran, co-founder and CEO, Dremio says, “Dremio’s Data Lake Engine makes queries on data lake storage extremely fast so that companies no longer have to move data into proprietary data warehouses or create cubes or extracts to get value from that data. We’re excited to announce new technologies – like our Columnar Cloud Cache (C3) and Predictive Pipelining – that work alongside Apache Arrow and the Dremio-developed Gandiva kernel to deliver big increases in performance.”

Latest release of Dremio

The latest release of Dremio can be downloaded from here. To get more insights, you can visit Dremio’s official blog.

October 3, 2019  11:32 AM

How to Improve and Lower the Cost of Distributed Enterprise Edge Data Management and Security

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Distributed applications, Distributed computing, Distributed storage, Distributed system, Edge computing, edge devices, edge security

The distributed enterprise model continues to grow exponentially, driven both by strategic planning and by default. That is, more and more organizations consist of a centralized headquarters datacenter with in-house IT expertise, together with remote, smaller-footprint sites where users and technology are located but which oftentimes do not have the same level of onsite IT know-how, if any at all.

This distributed enterprise model has various versions – ranging from a corporate office having subsidiary locations, such as a large retail brand with numerous retail stores distributed globally in malls and other shopping centers; to a large oil and gas or fast food conglomerate having numerous franchisee locations; to financial institutions with distributed ATM kiosks. Likewise, government agencies also rely upon a compute model in which there is a centralized command location, with intelligent mobile devices and internet of things (IoT) solutions in the field to help achieve their missions.

What all of these distributed enterprises have in common is that, whether they know it or not, their success has become unavoidably tied to their ability to successfully deploy, manage, and protect an edge computing infrastructure. Naturally, at the HQ level where IT expertise is typically based, this fact is for the most part well known. However, at the remote locations where IT is not a core competency – nor should it be – this fact may be less recognized.

I spoke recently with Cybera’s Andrew Lev, CEO; Bethany Allee, Executive Vice President of Marketing; and Paul Melton, Senior Vice President and General Manager, Petroleum for their perspective on this topic. Cybera delivers a secure software-defined wide area network (SD-WAN) edge for remote sites and IoT via the cloud. While Cybera stated they have over 90,000 customers in 23 countries, for the purposes of our discussion much of the focus was on its oil and gas customers (gas stations and convenience stores) in North America. This industry is a prime example of edge computing opportunities and challenges being faced.

Lev explained that EMV Chip Card Compliance is on the minds of most in this industry (anyone that takes payment via credit card should be concerned). Gas stations are among those that have an even greater challenge, as they must ensure compliance inside the stores and outside at the pumps, where payment is also accepted. While compliance will help to eliminate credit card fraud and protect the owners from liability, preparing for it requires additional technology and services, across applications and the network, which can be very complicated and expensive to properly deploy and manage, as well as secure and protect. It is estimated that over 700,000 pumps in North America alone need to go through this transformation.

Melton added that in addition to payments, other transactions are now taking place at the pump. Fuel pumps have become IoT devices that do a great deal more than just dispense fuel and monitor fuel levels. Loyalty and reward programs, media, and lottery, as well as other products and services, are all enabled at the pump. And while the bandwidth requirements are not significant, billions of transactions can take place on a monthly basis, at high frequency, and may include personal, financial, purchase, or other private information. Making it more complicated, these transactions may need to be processed locally by the gas station/convenience store, transmitted back to a corporate office, and/or sent to an outside third-party vendor in a different part of the country or world. Undoubtedly, even a moment of downtime or a breach in security can have disastrous consequences.

Melton explained that Cybera technology helps enable these transactions – data transmissions – to take place in a highly reliable and secure manner by incorporating edge and cloud systems. And while more and more customers are using broadband, not all broadband is created equal. Cybera delivers a certified appliance that ensures a failover network with wireless tech (LTE service) to maintain uptime even if the customer’s service is interrupted. The Cybera appliance has network segmentation and security built in, and it enables the network flow that provides partners and customers with better visibility and control to deliver the products and services intended.

In addition, Cybera’s cloud-based SD-WAN edge appliance has orchestration built into the system. It is able to orchestrate all of the applications and ensure they are performing well together. This is important for privacy and security: Cybera segments out each application and creates a virtual application network for each, fully segmented away so that individual application traffic never crosses. It is also critical for performance prioritization. For instance, payments must be processed immediately, whereas a carwash transaction can wait. Cybera leverages its in-depth industry experience while letting the individual gas station/convenience store operator set the priorities – critical over non-critical, those that have tolerance for delay versus those that have zero tolerance. Cybera can do all of this via one box with embedded security, backed by 24×7 support, delivering a mature multi-tenant environment (versus the five or six boxes with wires everywhere that were likely there previously). I found these points very interesting, and I believe these capabilities are quite unique to Cybera.
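The prioritization idea above (payments first, carwash can wait) is essentially a priority queue keyed by tolerance class. Here is a small sketch in Python; the class names and priority values are invented for illustration and are not Cybera’s actual traffic configuration.

```python
import heapq

# Lower number = more urgent; the insertion index breaks ties so that
# equal-priority transactions keep their arrival order.
PRIORITY = {"payment": 0, "loyalty": 1, "carwash": 2}

def drain(transactions):
    """Return transactions in the order the edge appliance would forward them."""
    heap = [(PRIORITY[kind], i, kind) for i, kind in enumerate(transactions)]
    heapq.heapify(heap)
    return [kind for _, _, kind in (heapq.heappop(heap) for _ in range(len(heap)))]

print(drain(["carwash", "payment", "loyalty", "payment"]))
# payments drain first: ['payment', 'payment', 'loyalty', 'carwash']
```

In a real SD-WAN appliance this logic lives in the packet scheduler per virtual application network, so a burst of low-priority traffic can never starve the zero-tolerance payment path.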

Allee also discussed the important fact that one of the biggest challenges Cybera, as well as other vendors, faces is the seamless integration of the numerous technologies, products, and services in the fuel industry ecosystem. However, given Cybera’s years in the industry, it has the benefit of knowing, and having forged relationships with, the majority of them. And for those they do not already know, the operator/customer makes introductions. Collaboration is key between what can be hundreds of technologies, applications, and services. These other vendors recognize they are not in the networking business. Via these relationships, the conflict that might otherwise easily be created is eliminated, and customers can accelerate the safe and seamless deployment of new technologies and digital transformation.

Lev concluded by explaining that Cybera was delivering SD-WAN before the industry even had a name for it. There are about 100 companies that call themselves SD-WAN vendors, but almost all of the others are focused on the large enterprise (and at a much higher price point), which has very different requirements and challenges than the highly distributed customers Cybera serves. Whether in oil and gas, retail, quick-service restaurants (QSR), healthcare, finance, or kiosks, for the distributed enterprise – brick and mortar, with smaller remote locations at the edge – deploying, managing, and protecting the technology and applications that deliver a better customer experience, increased revenues, and enhanced profit margins has historically been highly complex. With Cybera, it appears it doesn’t have to be.

Cybera’s website can be found at https://www.cybera.com/ should you wish to learn more.

September 30, 2019  10:52 PM

Building a Columnar Database on RAMCloud @Amazon

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Database Management Systems, Shared storage

Building a Columnar Database on RAMCloud: Database Design for the Low-Latency Enabled Data Center (In-Memory Data Management Research) by Christian Tinnefeld

Excerpt as on Amazon.com

This book examines the field of parallel database management systems and illustrates the great variety of solutions based on a shared-storage or a shared-nothing architecture. Constantly dropping memory prices and the desire to operate with low-latency responses on large sets of data paved the way for main memory-based parallel database management systems. However, this area is currently dominated by the shared-nothing approach in order to preserve the in-memory performance advantage by processing data locally on each server. The main argument this book makes is that such a unilateral development will cease due to the combination of the following three trends: a) Today’s network technology features remote direct memory access (RDMA) and narrows the performance gap between accessing main memory on a server and of a remote server to and even below a single order of magnitude. b) Modern storage systems scale gracefully, are elastic and provide high-availability. c) A modern storage system such as Stanford’s RAM Cloud even keeps all data resident in the main memory. Exploiting these characteristics in the context of a main memory-based parallel database management system is desirable. The book demonstrates that the advent of RDMA-enabled network technology makes the creation of a parallel main memory DBMS based on a shared-storage approach feasible.

About the Author
Dr. Christian Tinnefeld received his B.Sc. and M.Sc. degrees from the Hasso Plattner Institute at the University of Potsdam, Germany, where he also pursued his doctoral studies. His main research interests are In-Memory Databases and Cloud Computing. In the former area, he has been collaborating with SAP for six years and contributed to initial concepts of the SAP HANA database. In the latter area, Christian has been collaborating with the RAM Cloud Project at Stanford University, California, the USA for three years.

September 30, 2019  10:33 PM

Forsa For Developers With Optane by @FormulusBlack @packethost

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
DRAM, Intel

Forsa is a memory-based storage and virtualization system created by Formulus Black. The solution is powered by the latest in memory technology, Intel’s Optane DC persistent memory, and runs on Packet’s cloud platform. That means three leaders in their respective fields – Intel, Formulus Black, and Packet – have jointly created a complete solution for developers to test, validate, and optimize their data-intensive and real-time application workloads on Forsa, running with best-in-class memory on the cloud. This collaboration delivers the persistent-memory-powered infrastructure that application developers across the globe have been craving for a long time: a proven system on which to test, validate, and optimize data-intensive and real-time workloads. This memory-based storage and virtualization solution is going to change the whole paradigm.


Today’s developers and IT leaders are surrounded by data-intensive and latency-sensitive applications. Until now, they didn’t have a platform as strong as Forsa to match their talent and business demand. Business applications today need access to near-limitless data. While data volumes are increasing exponentially, businesses cannot spare a moment without full access to their applications and data; results are required at jet speed. That is how the need arose, and venture-backed startup Formulus Black and Intel decided to collaborate on high-performance infrastructure solutions powered by persistent memory. Forsa is server-based software designed by Formulus Black that delivers Intel Optane DC persistent memory with improved performance, availability, usability, and capacity. It is the first environment of its kind designed for developers.


Intel Optane DC persistent memory has the power to enhance the performance of data-intensive applications relative to NAND-based SSDs. How it does that is quite interesting: it shortens the traditional path of data between memory and peripheral storage. That’s not all. Forsa goes several steps further, boosting the performance of Optane DC persistent memory through algorithms that optimize I/O between CPU and memory. In a nutshell, it does some fantastic jobs: decreasing CPU usage, delivering more TPS/IOPS, and lowering latency under maximum load conditions.
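The shortened data path described above can be loosely illustrated with memory-mapped I/O: the application reads and writes bytes directly in its address space instead of going through explicit read()/write() storage calls, and the data survives the mapping. This is only an analogy for persistent-memory programming (which typically uses libraries like PMDK in C), not Forsa’s actual mechanism; the file name here is invented.

```python
import mmap
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "pmem-demo.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)  # pre-size the backing file

with open(path, "r+b") as f, mmap.mmap(f.fileno(), 4096) as mem:
    mem[0:5] = b"hello"  # store via memory addresses, no write() syscall
    mem.flush()          # persist; loosely analogous to flushing caches to pmem

with open(path, "rb") as f:
    print(f.read(5))  # b'hello' -- the data outlived the mapping
```

With real persistent memory the same load/store pattern applies to the physical media itself, which is why the CPU-to-memory path matters so much more than the traditional block-storage stack.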

Jing Xie, Chief Operating Officer at Formulus Black says, “We believe Intel Optane is the leading persistent memory technology in the market today and we are excited to provide software that enhances its usability, performance, and TCO for private and hybrid cloud platforms like Packet. By combining our respective capabilities, we are making it easier for developers and IT leaders to adopt persistent memory solutions and supercharge the performance of databases, ERP systems, and custom-built data-intensive applications.”

Optane DC persistent memory is compelling in comparison to traditional DDR4 DRAM. It offers higher system capacity at a lower total cost per gigabyte, and it supports data persistence. That means data can be kept in memory in a very cost-effective manner and then utilized for faster analytics processing and query performance as and when required. This enables results in a real-time environment, so that business gets the best out of technology.


Alper Ilkbahar, vice president of Non-volatile Memory and Storage Solutions at Intel says, “As Intel Optane DC Persistent Memory scales in the market, the industry is embracing the technology as truly transformational. The evolving software ecosystem using Optane DC persistent memory, such as Formulus Black’s Forsa, will only accelerate the pace of innovation for developers and end-users.”

Jacob Smith, Chief Marketing Officer and co-founder of Packet says, “Due to unmatched efficiency, the number of workloads that benefit from high-performance memory-based storage is expanding rapidly. The combination of Intel Optane and Formulus Black’s Forsa is exactly what many of our most demanding users are looking for.”

For a free trial of Forsa on Packet, please sign up at https://www.packet.com/about/formulus-black/

September 24, 2019  11:14 PM

Software-Defined Interconnect Platform Luxon @Stateless_Inc

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja

Think of a Software-Defined Interconnect platform that has infinite interconnection service chains: new interconnects are easily created using any combination of security services, routing, data encryption, and so on. What if it had increased connectivity possibilities? With the help of additional connection protocols, it becomes easy to integrate into existing networks. What if there were a new switch technology that opened more configuration possibilities? With the help of cutting-edge switching software, software control extends to replace customized hardware. Then there is multi-tenant support. Once you deploy that software-defined interconnect platform at an enterprise or a service provider, end users are empowered to manage their own interconnects, scaling up to any extent. Now, come out of that thinking mode and get into the experience mode: you can experience all of this with Luxon by Stateless.
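The idea of composing an interconnect from a chain of services can be sketched in a few lines. This is a hypothetical illustration only: the function names and packet fields below are invented for the example and are not the Luxon API.

```python
# A service chain modeled as an ordered list of packet-transforming functions.
def firewall(pkt):
    # Drop anything on port 22; pass everything else through.
    return None if pkt.get("port") == 22 else pkt

def encrypt(pkt):
    return {**pkt, "encrypted": True}

def route(pkt):
    return {**pkt, "next_hop": "10.0.0.1"}

def apply_chain(pkt, chain):
    for service in chain:
        pkt = service(pkt)
        if pkt is None:  # a service dropped the packet
            return None
    return pkt

# Compose a chain from any combination of services, the way the platform
# composes interconnects from security, encryption, and routing services.
chain = [firewall, encrypt, route]
ok = apply_chain({"port": 443, "payload": "data"}, chain)
dropped = apply_chain({"port": 22, "payload": "ssh"}, chain)
print(ok)       # passes firewall, gains "encrypted" and "next_hop"
print(dropped)  # None
```

Because the chain is just an ordered list, adding, removing, or reordering services is a data change rather than a hardware change, which is the essence of the software-defined approach.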

With its Software-Defined Interconnect platform, Stateless reinvents connectivity and creates a new landmark in the field. This groundbreaking technology delivers simple yet powerful, composable, scalable connectivity for any cloud-based application environment. The future of connectivity is here: it shifts the whole paradigm of connectivity to software. The whole game has changed. The era of buying new hardware for every scale-up, every new application, and every new network is on the verge of extinction with the launch of the Software-Defined Interconnect platform Luxon by Stateless. Network connectivity is reinvented in a new avatar. Luxon, now generally available, is the industry’s first software-defined interconnect (SD-IX) platform. Luxon’s patent-pending technology makes interconnecting workloads simple, composable, and scalable with its expanded capabilities. That is a revolution. And it is available.

Software-Defined Interconnect Platform

Enterprises are going to save a huge amount on hardware that is no longer required because of the software-defined interconnect platform. Device-centric methods of constructing infrastructure are no longer required and can easily be discarded. This is possible with Luxon’s expanded platform, which empowers businesses and colocation providers to create any number of interconnections cost-effectively, simply, and quickly. With this new approach, customers can streamline their network operations without any capital expenditure.

software-defined interconnect

Source: Stateless

Craig Matsumoto, Senior Analyst, 451 Research says, “As IT distributes outward to the cloud, businesses are becoming even more dependent on connectivity. What used to be the enterprise network now reaches out to multiple destinations, serving applications that have differing requirements. Stateless is simplifying connectivity by providing a kind of flexibility and control that isn’t necessarily found in conventional physical and virtual devices.”

Murad Kablan, CEO and co-founder, Stateless says, “The accelerated movement of applications to the cloud is creating a new era of connectivity. To maintain a competitive advantage, businesses require fast, easy connections to the cloud, support for numerous interconnects and the ability to deploy a diverse array of new applications quickly. Today it often requires intricate engineering and new hardware and takes months for a business to deploy a single cloud application. Luxon simplifies a traditionally complex, multi-step process into a single step that takes only minutes.”

To learn how Luxon works, watch this video:

September 23, 2019  11:24 PM

Keysight World India 2019 A Highly Insightful Event @keysight

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
5G, Testing

This is the concluding post of the 2-post series covering Keysight World India 2019. The first post can be accessed here. The event was held in two major technology cities of India – Bangalore and New Delhi.

What Keysight is doing is phenomenal indeed. The organization has expertise in providing an excellent environment to industries and industry leaders to inspire, enable, and accelerate the realization of their innovations. They are, in fact, an enabler in the real sense. It’s like catalyzing the whole electronic and semiconductor ecosystem in a very synchronized and orchestrated manner. This is the only company in the world that manages the electronic and semiconductor value chain while addressing test and measurement challenges and enriching them with real-world solutions at such a large scale. It helps industries make their dreams come true by providing them with the potential to realize those dreams. The global trends have changed completely when it comes to success. The whole mechanism has become very dynamic, demanding, and challenging, thus asking for an agile and integrated approach to innovation.

Keysight World India

When there is a demand to deliver something really big, it needs a perfect combination of technology, partners, collaboration, inspiration, and solutions from around the world. That is the core strength of Keysight Technologies, and that is the driving idea of Keysight World India. Keysight Technologies is a world leader in electronic measurement, offering a complete stack of test and design solutions to manage the entire lifecycle. Think of any area, be it simulation, R&D, design verification and pre-conformance, conformance, manufacturing, deployment, or service assurance: Keysight ensures it is the first choice.

Keysight World India

Sudhir Tangri, VP & Country General Manager India at Keysight Technologies says,

“Since customers play an integral role in the success of Keysight, Keysight World helped in bringing our customers together and served as a platform for them to network and exchange ideas. At Keysight, we believe that these seminars add value to our customers and their businesses.”

Sandeep Kapoor, Director Marketing (Europe, MEA and India), Keysight Technologies says,

“The program was designed to provide experiences through real-world product demonstrations, technical presentations, and networking among industry experts. Keysight World was a great platform to learn from industry leaders and technical experts about the latest industry directions and technologies. We saw a lot of interest from our customers and industry experts in Keysight World. We had a strong panel of speakers who shed light on innovative solutions and topics that are key for our stakeholders.”

The key focus areas of Keysight World India are 5G technologies, aerospace and defense, automotive and energy, high-speed digital and data center infrastructure, and network and security test. There is a complete paradigm shift in all these industry segments, with a high demand to innovate and get to market faster. Session topics at Keysight World India included keeping up with network data speeds (400G and beyond), test considerations for next-generation WiFi, and automation, to name a few. To listen to the global experts who spoke at Keysight World India, you can switch to the podcast here. These experts include:
Sandeep Kapoor, Director Marketing (Europe, MEA, and India)
Donna Majcen, Vice President, Global Field Marketing
Vince Nguyen, General Manager – Aerospace & Defense Business
Michael Raser, Business Development Director, Automotive & Energy Solutions
Brigham (Brig) Assay, Director of Strategic Planning, Internal Infrastructure Solutions
Eric Taylor, Vice President, Managed Services Division

September 23, 2019  11:08 PM

Keysight World 2019 A Golden Chance To Meet The Leaders @keysight

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja

Technology is changing very fast. To embrace it and use it to its fullest, an enterprise needs to deploy it very fast. That deployment has to be not only fast but foolproof too. We all know that leakages discovered at a later stage can be killing for a business, leading to financial and reputational losses. To make any technology workable and deployable for businesses and enterprises, companies like Keysight Technologies play a significant role. They ensure your technologies deploy well and work well. For instance, 5G alone has no value unless you have the knowledge and technology to deploy it and ensure it works for your business the way you want. It was a golden chance to witness most of the business and technology leaders of Keysight Technologies, handling various verticals, in one place.

Keysight World

It was Keysight World 2019 in India that concluded recently, hosted by Keysight Technologies. Keysight Technologies is one of the technology companies most sought after by enterprises, service providers, and governments for only one reason: to accelerate innovation to connect and secure the world. Keysight World 2019 was hosted in two cities of India, Bangalore (now Bengaluru) and Delhi (the capital city of India). It was the right occasion to listen to most of the Keysight thought leaders and learn about their technological advances, breakthrough design/test/optimization strategies, and leading-edge solutions. Keysight Technologies provides solutions on a global scale. Most of their solutions are firsts in the market: unique, innovative, and trendsetting. Every solution is a landmark in itself. It seems as if they have only one competitor globally: themselves, striving for the next level of excellence beyond their existing and unbeatable levels of excellence.

Keysight World 2019

We will continue the discussion about Keysight World 2019 in the next post.

September 18, 2019  4:56 PM

TNQ InGage: A Born Leader In Augmented and Virtual Reality @tnqingage

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
AR, augmented reality, iot, mixed reality, MR, Virtual Reality

When two masters from their respective fields join hands, it means something unique is going to happen in the industry. That is what happened when Vijay Karunakaran from InGage and Abhigyan Arun from TNQ Technologies joined hands to create a new joint venture named TNQ InGage. While Vijay Karunakaran has ample experience in augmented and virtual reality, Abhigyan Arun has vast experience in content creation on the global front. Everybody in technology knows the opportunities lying in the AR and VR business in almost every industry segment. That is why TNQ Technologies decided to invest USD 2 million in TNQ InGage. InGage, on the other hand, has extensive expertise in immersive technologies and IP. The core focus of TNQ InGage lies in developing immersive technology solutions with the help of augmented reality, virtual reality, and mixed reality, that is, AR, VR, and MR.

Integration with IoT and analytics will bring a new color to these solutions for industries like construction, manufacturing, and healthcare. The key services include immersive training simulators using virtual reality and haptics, field service tools using augmented reality and IoT, and digital experience centers for Industry 4.0.

TNQ InGage

Abhigyan Arun, CEO, TNQ Technologies says, “TNQ Technologies is a 21-year-old, 2,500 people-strong company focussed on publishing technologies and services. We see the future of content consumption leveraging immersive AR and VR environments, and have made this investment in TNQ InGage as part of our long-term strategy. The extensive work InGage has done over the last 6 years under Vijay’s leadership, and the strong relationships with prestigious customers and partners make me confident that we have the right ingredients to establish ourselves as a leader in this technology and industry. TNQ Technologies will continue to focus on its existing publishing client base and will extend the AR/VR expertise of TNQ InGage to them as a new service line.”

TNQ InGage

The global AR and VR market is expected to touch $100 billion in the next 5 years. In India, the current focus of the government is on skilling and development, adoption of Industry 4.0, and improved infrastructure supporting the latest forms of content delivery. Keeping this in mind, TNQ InGage plans to focus its investments on product R&D, with a key focus on improving human interaction with virtual objects with the help of haptics and photorealistic rendering. The ultimate goal of the newly formed joint venture is to develop state-of-the-art, world-class immersive VR products and services for the healthcare and industrial verticals, with training and rehabilitation as top priorities.

TNQ InGage

Vijay Karunakaran, Founder & CEO, TNQ InGage says, “Over the last 6 years we have delivered over 500 projects, generated extensive IP, and built a great team. TNQ Technologies’ investment as a long-term strategic partnership will help us with growth and scalability. We are already investing in R&D to improve human interaction with virtual objects. Our plan is to develop products and services that will have a meaningful impact on how people train, work and live. We are also looking at expansion in the global market, including the book-publishing world, by leveraging TNQ’s existing capabilities and competencies.”

September 18, 2019  4:21 PM

Dr. Kurt Lauk, prominent business leader joins Tachyum @tachyum

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
"Angela Merkel", processor

Dr. Kurt Lauk is a well-known personality on the global front. He is a prominent business leader, former chairman of the Economic Council (Wirtschaftsrat) in Germany, and former advisor to Angela Merkel. He joins Tachyum’s Board of Advisors. Dr. Lauk has a tremendous record of serving on various management boards, such as those of Daimler and Audi. He has been a member of the EU Parliament. He was Chairman of the Economic Council for 15 years, and in that capacity he advised Angela Merkel. Since 2000, he has been the founder and President of Globe CP GmbH, a personally owned family office. Dr. Lauk joining Tachyum’s Board of Advisors will add huge value to the company. He comes with valuable insights regarding European industries and carries a deep understanding of European manufacturers and suppliers.

Also, Dr. Kurt Lauk is an expert in international strategy with exhaustive experience in bringing technology innovations to market. Dr. Lauk has deep-rooted experience in European business, for which the credit goes to his positions as Member of the Board of Management and Head of the Worldwide Commercial Vehicles Division of Daimler Chrysler from 1996 to 1999. Another contributor is his tenure as Deputy Chief Executive Officer (Dy CEO) and Chief Financial Officer (CFO) of Audi AG from 1989 to 1992. Dr. Lauk is currently serving as a director on the supervisory boards of Magna Inc. (Toronto) and Fortemedia (Cupertino). Nomura, Asia’s global investment bank, appointed him as its Germany and Austria Advisory Board Chairman in July 2018. Since 2000 he has been a Trustee of the IISS (International Institute for Strategic Studies, London).

Dr. Kurt Lauk

Previously he held the position of the chief executive officer at Zinser Textile Machinery GmbH. His other earlier positions include Vice President/Consultant/Partner at Boston Consulting Group and Chief Financial Officer & Controller for VEBA AG which is now known as E.ON SE. Dr. Lauk holds a Graduate degree from Ludwig-Maximilians-Universität München, an MBA from Stanford Graduate School of Business, and a doctorate in international politics from the University of Kiel. His rich experience includes serving as a Member of the European Parliament from 2004 to 2009, including as a member of the Economic and Monetary Affairs Committee, and as a Deputy Member of the Foreign and Security Affairs Committee.

Kurt Lauk

Dr. Radoslav Danilak, Tachyum’s Founder and CEO says, “Tachyum is honored to have an international business and technology luminary of Dr. Lauk’s caliber on our Board of Advisors. I am personally excited to work directly with Kurt, to be able to rely on his advice and counsel, especially regarding EU business strategies and how Tachyum can help facilitate EU technological sovereignty.”

Adrian Vycital representing Tachyum’s lead investor IPM says, “Today, Slovakia is the world’s largest per-capita car producer. I look forward to working with Tachyum and Dr. Lauk to identify Prodigy value-added insertion points within Industry 4.0 and the automotive sector. We believe Tachyum will play an essential role in creating an innovation ecosystem, which is a key factor in the transformation of the Slovak economy from manufacturing to knowledge-based.”

Dr. Kurt Lauk

Tachyum’s flagship product, codenamed Prodigy, is the world’s first Universal Processor, and its development is on track. With its high performance and ultra-low power consumption, Prodigy offers industry-leading performance for data center, AI, and HPC workloads. These exceptional properties gel well with 5G, which means Prodigy promises a paradigm shift for telecom industry players. It will bring AI from cell-tower-mounted data centers to power intelligent IoT devices and autonomous systems. Tachyum plans a 7nm tapeout next year, with volume production beginning in 2021. It is important to understand that Tachyum’s Prodigy Universal Processor Chip is the world’s smallest and fastest general-purpose 64-core processor developed to date. Amazingly, it requires 10x less processor power and reduces processor cost by 3x.

Tachyum’s immediate plans include enabling a 32-Tensor-Exaflop supercomputer as well as building machines more powerful than the human brain by 2021, a feat Tachyum aims to achieve years ahead of industry expectations. Prodigy promises to reduce data center TCO (annual total cost of ownership) by 4x with the help of its disruptive processor design and a smart compiler that makes many parts of the hardware found in a typical processor redundant. All this results in fewer transistors and fewer, shorter wires, because a smaller and simpler core translates into much higher speed and power efficiency for the Prodigy processor.

September 17, 2019  3:46 PM

Customer Statements About Next-Gen Zoho One @Zoho

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Business transformation

This is the last post in the series, covering what customers have to say about Next-Gen Zoho One, as well as its pricing. The first post in the series is Next-Gen Zoho One Can Transform Enterprise Journey @zoho. The second post is Business Transformation Is The Mantra of Next-Gen Zoho One @Zoho, and the penultimate one is Key Features of The Bold New Operating-System Services @Zoho. Now, let us first see what a few of the prestigious customers of Next-Gen Zoho One have to say about it and how they transform their businesses with it.

Customer Speaks:

Pranesh Padmanabhan, Founder & CEO, Studio 31 says,

“Zoho One as a concept has fulfilled the dream of Studio 31 in becoming a fully-fledged, technology-enabled wedding photography and film company. It has given us the confidence and strength that we can sustain in this highly competitive, unorganized industry by not worrying about manual administration work anymore and fearing about human errors; it’s a very sensitive industry and every error will cost a client long memory. Now that we have a SaaS product running our business, we’ve got brilliant ideas and we’ve got the platform and time to make high-level strategic decisions to grow bigger, better and truly be one of the most-organized businesses in this sector.”

Sonia Bhadoriya, Head of Business Development, Eurokids says,

“We switched from Salesforce to Zoho One because of the fluidity of data across apps that allowed us to connect our departments. We use Zoho CRM, Sign and Creator extensively. The analytics tool enables us to visualize big data with graphs and charts. We love it!”

Next-Gen Zoho One

Niki Kushe, Group Head – CRM, India Infoline Finance Ltd says

“IIFL is a leading financial services conglomerate serving over 4 million satisfied customers around the globe. Though an established organization, we are constantly on the look for ways to build our strength and to deliver excellent service to our ever-expanding customer base. At IIFL, we have started using Zoho One, which includes their super-powered CRM, Email, Campaign Management, Survey, Social Media Management, Sales IQ, Creator, Internal Chat and HR products across various entities. Honestly, there is no better value than Zoho One can offer, especially at this low a cost! Zoho One does change the way businesses operate by offering a whole suite of apps that are not only tightly integrated with each other but also play well with third-party applications. These vast varieties of solutions are easy to configure and customize which in itself paves for an efficient cross-selling platform.”


The story would remain unfinished if I didn’t mention the pricing for Zoho One. The pricing, in fact, is very simple and pocket-friendly for a business of any industry segment and any size, with no hidden costs: Rs. 1,500 per employee or Rs. 3,000 per user. The ROI in the case of Zoho One is multifold and quick, as all the new features mentioned above come free of cost with Zoho One.

September 17, 2019  3:34 PM

Key Features of The Bold New Operating-System Services @Zoho

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Business Process, provisioning, Single sign-on, Telephony

This is the third post in the series on Zoho One Next-Gen features. You can access the first post here – Next-Gen Zoho One Can Transform Enterprise Journey @zoho – and the second post here – Business Transformation Is The Mantra of Next-Gen Zoho One @Zoho. In this post, we shall discuss some of the key features offered by the Bold New Operating-System for Businesses, that is, Zoho One.

Now let us look at some of the key features offered by the Bold New Operating-System Services as below:

1. Communication: PhoneBridge – PhoneBridge is Zoho’s newest entry in the bouquet of applications in Zoho One. It is, in fact, Zoho’s telephony platform that enables telephony in Zoho apps. For instance, in Zoho CRM the PhoneBridge integration permits users to make calls from Zoho apps. Not only that, it also provides contextual information on incoming calls. That means, if PhoneBridge is enabled in Zoho CRM, it gives users context for all incoming calls not only from Zoho CRM but also from Zoho Recruit, Zoho Mail, and 20+ other apps.

2. Single Sign-On: Single Sign-On (SSO), another new service in Zoho One, enables customers to integrate any third-party application with their account. As a matter of fact, Zoho Single Sign-On currently supports more than 50 third-party applications, and the count is increasing on a regular basis. This third-party application integration can be done in two ways: per individual user or per group.

3. App Management and Provisioning: As of now, Zoho One allows provisioning for all of its 45-plus apps.

Bold New Operating-System

4. Business Workflow Management: Orchestly is the new, innovative, and intuitive drag-and-drop interface that helps managers and process owners define processes effortlessly, without any technical knowledge of coding. There are ample practical examples, like purchase approvals, content publishing, asset management, onboarding, and so on.

5. Zoho Sign: Zoho Sign builds an additional level of validation for customers with the help of blockchain-based timestamping through Ethereum. Ethereum, as we all know, is a globally accepted open-source platform. So when a document is signed using Zoho Sign, an Ethereum transaction happens in the background, and the hash of the signed document is added to the transaction notes of that Ethereum transaction.
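The hash-based timestamping idea can be sketched in a few lines. This is an illustrative sketch, not Zoho's actual implementation: the transaction here is a plain dictionary standing in for an Ethereum transaction, and the field names are invented. The key property it demonstrates is that only the SHA-256 hash of the signed document, never the document itself, needs to be recorded on-chain.

```python
import hashlib
import time

def timestamp_document(doc_bytes):
    # Hash the signed document; only the hash goes into the (simulated)
    # Ethereum transaction notes.
    doc_hash = hashlib.sha256(doc_bytes).hexdigest()
    return {"note": doc_hash, "timestamp": int(time.time())}

def verify_document(doc_bytes, tx):
    # Re-hash the document and compare against the recorded hash; any
    # tampering with the document changes the hash and fails verification.
    return hashlib.sha256(doc_bytes).hexdigest() == tx["note"]

tx = timestamp_document(b"signed contract v1")
print(verify_document(b"signed contract v1", tx))  # True
print(verify_document(b"tampered contract", tx))   # False
```

Because anyone can recompute the hash and compare it with the immutable on-chain record, the signature's existence at a point in time can be validated without trusting a single party.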

    Next post is the concluding post in this series. Continued »

    September 17, 2019  3:16 PM

    Business Transformation Is The Mantra of Next-Gen Zoho One @Zoho

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Business transformation

This is the second post in the series, where we are talking about Next-Gen Zoho One’s new features and how it empowers its customers in a multifaceted manner, focusing on multi-directional business growth. The first post can be accessed here. We are talking about the power of the Orchestly interface. This interface is so easy to manage that even non-tech business process owners of various departments can build their own processes without any help from the technology department. Within 2 years of its launch, Zoho One has achieved substantial growth and is now catering to more than 20,000 customers across the globe. The new next-gen features added to the operating system include a new application that connects every corner of business operations, a number of updates to the existing modules, and an overall performance boost across the complete business suite.

If we look at the statistics of Zoho One usage, around 25% of its customers use more than 25 applications, and more than 50% use 16 or more applications from the bouquet of applications on the Zoho One platform. This itself proves how badly businesses are craving an all-in-one business solution that reduces the complexity of managing a large number of vendors and the conflicts that consistently arise out of integrating multi-vendor, multi-platform business applications. The speed at which Zoho One is getting adopted across different business verticals is proof of a major shift in customer expectations: a rejection of complexity in favor of easy-to-use all-in-one platforms that deliver immense value.

    Business Transformation with Next-Gen Zoho One

    Rajendran Dandapani, Director of Technology, Zoho says,

    “Technology is supposed to help businesses. Instead, it has evolved into complex beast customers have to tame—from juggling apps from multiple vendors to trying to solve the multi-app integration puzzle to dealing with vendors forcing customers into expensive, lengthy contracts. The technology industry has gone too far down this path and this has to change. With Zoho One, we want to change all of that. It’s a technology platform to run your entire business with a trustworthy vendor that is easy to do business with. With Zoho One, you are not just licensing the technology. You are licensing peace of mind.”

    Don’t miss the next post in this series.

    September 17, 2019  2:29 PM

    Next-Gen Zoho One Can Transform Enterprise Journey @zoho

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Blockchain, Business applications, Business Process Automation, Single sign-on, Telephony

Zoho is truly living up to what it calls itself: ‘the operating system for businesses’. Zoho One is, in fact, a complete business solution with a high level of maturity and a focus on all business levels together. With the launch of Next-Gen Zoho One, the whole ecosystem has become much stronger, with the power to deliver more to the business across all user and management levels. The operating system for businesses has been empowered with a process automation app, telephony, single sign-on, and blockchain capabilities. These next-gen features and tools enable it to achieve greater heights in customer adoption and business acceptability.

While other business applications in the market focus on licensing money and make their systems so complex that customers are forced to rely on the vendor and stay at their mercy, paying hefty annual maintenance and licensing costs, every new level of Zoho One, on the other hand, gets more powerful, more flexible, and more business-friendly, delivering powerful new features at no additional cost to its customers. Every new feature added becomes an integral part of Zoho One, improving its quality and making it more powerful and stable from the business point of view. This is proven by the fact that the popularity of Zoho One is increasing exponentially, and so is its customer base, in India as well as globally.

    Next-Gen Zoho

The next generation of Zoho One is designed and built to take care of entire business operations, including sales and marketing, finance and HR, operations, business intelligence, and so on. All this runs on a unified technology platform. The new business workflow management application, named Orchestly, empowers Zoho customers to effortlessly create, manage, and optimize their business processes with the help of an intuitive drag-and-drop interface.

We shall continue in the next post with more features of Next-Gen Zoho One. Continued »

    September 16, 2019  9:26 AM

Intelligent Search Capabilities @cloudtenna and @nasuni

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Cloud search, Cloud storage, Nasuni, Object storage

Technology has matured to a level of collaboration and innovation, and high-end tech companies have understood that very well. A recent example in this regard is the partnership between Cloudtenna and Nasuni. As a result of this partnership, enterprises will be able to get intelligent search capabilities. In fact, this is a boon for existing Nasuni clients: now they can instantly find their files across the multiple cloud environments they use, and all this happens with a guarantee of compliance with enterprise file-access rules. Cloudtenna, one of the front runners in enterprise search technology, announced today a strategic partnership with Nasuni. This partnership results in an unmatched integration of Nasuni Cloud File Services with Cloudtenna’s DirectSearch, which in turn empowers enterprises to augment their enterprise-wide collaboration efforts with intelligent search capabilities.


DirectSearch is an outstanding recommendation engine empowered with machine learning and capable of finding results at lightning speed, with sub-second search queries that work on massive distributed datasets including millions of items. For enterprises in today’s data-intensive environment, file search has become a singular priority for increasing employee productivity, especially where the workforce is distributed. Enterprises have understood well that to increase the productivity of their employees, the level of collaboration between employees based in various offices around the globe is very important. As I mentioned in one of my previous posts, recent research from IDC says an organization with 1,000 knowledge workers on board wastes around $50,000 per week, which comes to roughly $2.5 million per annum.


All this happens due to employees’ failure to locate and retrieve electronic files in time. In a few cases, it could lead to higher losses in terms of financial transactions as well as business reputation.

    Will Hornkohl, vice president of alliances at Nasuni says,

    “For enterprise users, it can be a challenge to find the exact file they are looking for across multiple clouds, not to mention on-premises servers. Bringing Cloudtenna into our growing partner community will ensure that users at the organizations we serve can always find exactly what they are looking for quickly and easily within the entirety of their global file share – all while using a single login for all file sources and while conducting intelligent searches that reflect personalized contextual insights that are modeled on each individual user’s file activity, history, teams and relationships. And of course, Nasuni and Cloudtenna both are built with safeguards that ensure complete compliance with all file access rules and protocols.”


    Around 500 enterprises rely on Nasuni. With this significant collaboration, Nasuni has empowered enterprises to reap the full benefits of cloud object storage, including its unlimited capacity and inherent resiliency. As a matter of fact, this also changes the definition of ‘economy of the cloud’ for them, and it will definitely enhance their control over the performance of network-attached storage (NAS).


    Aaron Ganek, CEO at Cloudtenna says,

    “Nasuni empowers organizations of all kinds not only to use cloud object storage in all of its flavors for primary storage of their files but also makes an unprecedented degree of global collaboration possible. Search capabilities are even more important when you’re looking at organizations like those that rely on Nasuni, which in many cases not only have massive datasets and equally goliath global file shares but also have employees who need to access a specific file they worked on with a colleague who’s literally on the other side of the world.”

    Ganek adds,

    “The Cloudtenna DirectSearch platform is uniquely designed to tackle distributed datasets, making it the ideal solution for Nasuni’s hybrid cloud file services platform. File search infrastructure faces a unique set of requirements that go beyond the footprint of traditional search infrastructure used for log-search and site-search. It has to be smart enough to reflect accurate file permissions. It has to be smart enough to derive context to boost search results, and it has to do all this in a fraction of a second.”
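Ganek’s point about permission-aware, sub-second search can be sketched with a toy inverted index; everything here is a hypothetical illustration, not Cloudtenna’s actual implementation (the real engine also ranks results using per-user context such as history, teams, and relationships):

```python
from collections import defaultdict

# Minimal sketch of permission-aware file search: an inverted index maps
# terms to file IDs, and results are filtered by each user's access rights
# at query time, so results never leak files the user cannot open.
class FileSearchIndex:
    def __init__(self):
        self.index = defaultdict(set)   # term -> {file_id}
        self.acl = defaultdict(set)     # file_id -> {user}

    def add_file(self, file_id, words, allowed_users):
        for w in words:
            self.index[w.lower()].add(file_id)
        self.acl[file_id] = set(allowed_users)

    def search(self, user, term):
        return sorted(f for f in self.index.get(term.lower(), set())
                      if user in self.acl[f])

idx = FileSearchIndex()
idx.add_file("q3-report.docx", ["quarterly", "revenue"], {"alice"})
idx.add_file("roadmap.pptx", ["quarterly", "plan"], {"alice", "bob"})
print(idx.search("bob", "quarterly"))   # ['roadmap.pptx']
```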

    To conclude, Nasuni is now a tier-one supported data source for Cloudtenna, and Cloudtenna is certified as a third-party integration available to Nasuni customers.

    August 30, 2019  2:30 PM

    Predictive Analytics and Data Mining @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Amazon, Data Mining, Predictive Analytics

    Predictive Analytics and Data Mining: Concepts and Practice with RapidMiner by Vijay Kotu and Bala Deshpande

    Book Excerpt as on Amazon.com:

    Put Predictive Analytics into action: learn the basics of Predictive Analytics and Data Mining through an easy-to-understand conceptual framework and immediately practice the concepts learned using the open-source RapidMiner tool. Whether you are brand new to Data Mining or working on your tenth project, this book will show you how to analyze data and uncover hidden patterns and relationships to aid important decisions and predictions. Data Mining has become an essential tool for any enterprise that collects, stores and processes data as part of its operations. This book is ideal for business users, data analysts, business analysts, business intelligence and data warehousing professionals, and anyone who wants to learn Data Mining. You’ll be able to:

    1. Gain the necessary knowledge of different data mining techniques, so that you can select the right technique for a given data problem and create a general-purpose analytics process.
    2. Get up and running fast with more than two dozen commonly used, powerful algorithms for predictive analytics using practical use cases.
    3. Implement a simple, step-by-step process for predicting an outcome or discovering hidden relationships from the data using RapidMiner, an open-source GUI-based data mining tool.

    Predictive analytics and Data Mining techniques covered: Exploratory Data Analysis, Visualization, Decision trees, Rule induction, k-Nearest Neighbors, Naïve Bayesian, Artificial Neural Networks, Support Vector machines, Ensemble models, Bagging, Boosting, Random Forests, Linear regression, Logistic regression, Association analysis using Apriori and FP Growth, K-Means clustering, Density-based clustering, Self Organizing Maps, Text Mining, Time series forecasting, Anomaly detection and Feature selection. Implementation files can be downloaded from the book companion site at www.LearnPredictiveAnalytics.com

    Demystifies data mining concepts with easy to understand language
    Shows how to get up and running fast with 20 commonly used powerful techniques for predictive analysis
    Explains the process of using open source RapidMiner tools
    Discusses a simple 5 step process for implementing algorithms that can be used for performing predictive analytics
    Includes practical use cases and examples

    “If learning-by-doing is your mantra — as well it should be for predictive analytics — this book will jumpstart your practice. Covering a broad, foundational collection of techniques, authors Kotu and Deshpande deliver crystal-clear explanations of the analytical methods that empower organizations to learn from data. After each concept, screenshots make the ‘how to’ immediately concrete, revealing the steps needed to set things up and go; you’re guided through real hands-on execution.”

    –Eric Siegel, Ph.D., founder of Predictive Analytics World and author of Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die

    August 30, 2019  2:19 PM

    A Guide to Delivering Business Results with Big Data Fast @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Big Data

    Actionable Intelligence: A Guide to Delivering Business Results with Big Data Fast!
    by Keith B. Carter and Donald Farmer

    Excerpt as on Amazon.com

    Building an analysis ecosystem for a smarter approach to intelligence
    Keith Carter’s Actionable Intelligence: A Guide to Delivering Business Results with Big Data Fast! is the comprehensive guide to achieving the dream that business intelligence practitioners have been chasing since the concept itself came into being. Written by an IT visionary with extensive global supply chain experience and insight, this book describes what happens when team members have accurate, reliable, usable, and timely information at their fingertips. With a focus on leveraging big data, the book provides expert guidance on developing an analytical ecosystem to effectively manage and use internal and external information to deliver business results.

    This book is written by an author who’s been in the trenches for people who are in the trenches. It’s for practitioners in the real world, who know delivering results is easier said than done – fraught with failure, and difficult politics. A landscape where reason and passion are needed to make a real difference.

    This book lays out the appropriate way to establish a culture of fact-based decision making, innovation, forward-looking measurements, and appropriate high-speed governance. Readers will enable their organization to:

    Answer strategic questions faster
    Reduce data acquisition time and increase analysis time to improve outcomes
    Shift the focus to positive results rather than past failures
    Expand opportunities by more effectively and thoughtfully leveraging information
    Big data makes big promises, but it cannot deliver without the right recipe of people, processes and technology in place. It’s about choosing the right people, giving them the right tools, and taking a thoughtful, rather than formulaic, approach. Actionable Intelligence provides expert guidance toward envisioning, budgeting, implementing, and delivering real benefits.

    “Actionable Intelligence has the critical insights business leaders need to leverage Big Data to win in the emerging digital marketplace! Well done, Keith!”

    —Ed Hunter, Vice President, Product Supply-Asia, Procter & Gamble Europe SA – Singapore

    August 30, 2019  2:15 PM

    Executing Data Quality Projects : Book on @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data quality

    Executing Data Quality Projects: Ten Steps to Quality Data and Trusted Information (TM) 1st Edition by Danette McGilvray

    Excerpt as on Amazon.com

    Executing Data Quality Projects presents a systematic, proven approach to improving and creating data and information quality within the enterprise.

    Recent studies show that data quality problems are costing businesses billions of dollars each year, with poor data linked to waste and inefficiency, damaged credibility among customers and suppliers, and an organizational inability to make sound decisions.


    This book describes a Ten-Step approach that combines a conceptual framework for understanding information quality with the tools, techniques, and instructions for improving and creating information quality. It includes numerous templates, detailed examples, and practical advice for executing every step of the approach. It allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, and best practices.

    The author’s trademarked approach, in which she has trained Fortune 500 clients and hundreds of workshop attendees, applies to all types of data and all types of organizations.

    Includes numerous templates, detailed examples, and practical advice for executing every step of The Ten Steps approach.
    Allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, and best practices.
    A companion Web site includes links to numerous data quality resources, including many of the planning and information-gathering templates featured in the text, quick summaries of key ideas from The Ten Steps methodology, and other tools and information available online.

    “This book is a gem. Tested, validated and polished over a distinguished career as a practitioner and consultant, Danette’s Ten Steps methodology shines as a unique and much-needed contribution to the information quality discipline. This practical and insightful book will quickly become the reference of choice for all those leading or participating in information quality improvement projects. Experienced project managers will use it to update and deepen their knowledge, new ones will use it as a roadmap to quickly become effective. Managers in organizations that have embraced generic improvement methodologies such as six sigma, lean or have developed internal ones would be wise to hand this book to their Black Belts and other improvement leaders.”

    – C. Lwanga Yonke, Information Quality Practitioner.

    August 30, 2019  12:26 PM

    An Excellent Application and Infrastructure Monitoring Tool @Site24x7

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Application monitoring, application performance monitoring, IT Infrastructure Monitoring, Server monitoring

    I feel Site24x7 will soon be the top-layer application and infrastructure monitoring tool for all CIOs and CTOs, irrespective of the next layers of support mechanisms they have in place. Site24x7 has a lot of features that are unmatched by any of the existing global-standard monitoring tools. Firstly, none of them has all the features it has. Secondly, it beats all the others in one field or another. That is why I call it the most suitable and promising top-layer application and infrastructure monitoring tool. Let us learn a little more about Site24x7. It is very lightweight in comparison to its peers in the market, and its agent is non-intrusive: the Site24x7 monitoring agent runs almost silently, with minimal dependencies, and consumes very little memory on a system.

    application and infrastructure monitoring tool

    Data comes into the Site24x7 application and infrastructure monitoring tool through log files, APIs, and simple commands. The overall proposition is quite cost-effective: you can monitor and manage your infrastructure and servers on a wide range of parameters, from basic availability to server-cluster troubleshooting. The pricing has nothing to do with the physical configuration of your server. It is not based on the number of cores, a model often used in the market to inflate the costing, nor does it consider RAM size or the number of hours your server needs to run. It empowers you to monitor and control your complete server infrastructure holistically without worrying about the financial implications. That makes it a first choice for enterprises of any size and volume.
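The contrast between per-core and flat per-server pricing can be shown with toy numbers (all prices here are hypothetical, not Site24x7’s actual rates):

```python
# Illustrative comparison of server-monitoring cost models.
# Per-core pricing grows with hardware size; flat pricing depends only on
# the number of servers monitored. Prices are made-up examples.
def per_core_cost(servers, cores_per_server, price_per_core=5):
    return servers * cores_per_server * price_per_core

def flat_cost(servers, price_per_server=12):
    return servers * price_per_server

# A 50-server fleet of 16-core machines:
print(per_core_cost(50, 16))   # 4000 -> scales with core count
print(flat_cost(50))           # 600  -> unchanged if you add cores or RAM
```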


    Source: site24x7

    Tell me, how many other tools in the market are so powerful, with so much flexibility and such a lean cost model? Site24x7 is a modern application and infrastructure monitoring tool delivered as a SaaS service. It already serves thousands of businesses and millions of users across the globe. With its new launches of Site24x7 Signals, AI-based intelligent dashboards, and security features, this state-of-the-art application and infrastructure monitoring tool has created a new landmark in the field.

    August 29, 2019  9:42 AM

    How GOFRUGAL Is The Best ERP for Trade and Supply Chain @gofrugaltech

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    ERP, Supply chain, trade

    In the previous post, I covered the multifaceted achievements of Kumar Vembu. In this post, we shall learn more about GOFRUGAL ERP. GOFRUGAL ERP is a complete, comprehensive, and collaborative core business application that takes care of the entire trade and supply chain ecosystem. It delivers 100% transaction automation, connecting suppliers, customers, partners, and other stakeholders in the ecosystem. GOFRUGAL ERP has evolved intensively, with an in-depth understanding of the minutest customer needs and pain points. It is available in different flavors: on-premise, in the cloud, or on mobile. Before buying, a customer can take a free trial to play around with the solution and experience its unbeatable simplicity. Its key features and benefits include handling customers’ omnichannel businesses, extensive customizability, an ML-driven supply chain, and autopilot operation with AI-driven decisions. In addition, it integrates easily into a complex existing IT environment, supporting backward and forward integration as required.


    GOFRUGAL customers include businesses of various sizes: small or independent stores, large stores, regional or local chains, and national chains. During the last 18 years of its evolution, it has catered to more than 40 business formats, including supermarkets, pharmacies, salons, apparel stores, restaurants, bakeries, FMCG, and pharma. Let’s see what is latest at GOFRUGAL. Recently, GOFRUGAL launched GoSecure, a real-time backup-as-a-service for traders. More than 90% of retail and distribution businesses don’t have a data backup solution in place, which means they run a high risk of losing data at any time, and most of these businesses are not serious enough about the safety of their data. GoSecure’s real-time backup-as-a-service automatically backs up every transaction that takes place on the system to a cloud-based service in a secure manner.


    That means, in case of any contingency, the user can restore the data without any technical support, requiring just an OTP.
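The real-time, per-transaction backup idea can be sketched as a small producer/consumer pattern; the names and the in-memory “cloud” below are hypothetical stand-ins, not GoSecure’s actual service (which encrypts and ships each transaction to cloud storage):

```python
import json
import queue
import threading

# Sketch of real-time transaction backup: every committed transaction is
# queued immediately and a background thread ships it to remote storage,
# so the point-of-sale flow is never blocked by the upload.
class TransactionBackup:
    def __init__(self):
        self.remote = []                 # stand-in for cloud object storage
        self.q = queue.Queue()
        threading.Thread(target=self._uploader, daemon=True).start()

    def record(self, txn):
        # Called on every committed transaction; returns immediately.
        self.q.put(json.dumps(txn))

    def _uploader(self):
        while True:
            self.remote.append(self.q.get())
            self.q.task_done()

    def restore(self):
        self.q.join()                    # drain pending uploads first
        return [json.loads(t) for t in self.remote]

backup = TransactionBackup()
backup.record({"id": 1, "item": "soap", "qty": 2})
backup.record({"id": 2, "item": "rice", "qty": 1})
print(len(backup.restore()))             # 2
```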

    August 29, 2019  9:30 AM

    Kumar Vembu and His Entrepreneurial Journey @gofrugaltech

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Cloud ERP, ERP, retail

    Kumar Vembu is a veteran entrepreneur with deep expertise in technology. Kumar is the CEO and founder of GOFRUGAL. He wears multiple hats: entrepreneur, investor, farmer, yoga expert, and a lot more. He started his career with Qualcomm in San Diego. In 1995 he moved back to India to start his entrepreneurial journey. As a matter of fact, before starting his own venture, he worked with large telecom and networking organizations like IITM, HCL, HP, and TeNet. While working, he was always striving to launch a world-class software product company in India, and that is when he co-founded Zoho Corp with his brother Sridhar Vembu in 1995, and later another company, Delium, in 2016. In between these two companies, he launched GOFRUGAL in 2004 with a mission to make ERP easy and usable by businesses of any size.

    Kumar Vembu and GOFRUGAL

    As of today, GOFRUGAL’s solutions run successfully in more than 30,000 brands across more than 40 business formats, with the sole purpose of digitally transforming business operations to gain a better customer experience and higher sales, thereby acquiring more customers. Let us learn more about GOFRUGAL. GOFRUGAL is a digital-first company. It was founded in 2004 with the singular goal of empowering retail, restaurant, and distribution businesses with appropriate ERP tools that help them achieve greater profitability and efficiency. The key focus of all GOFRUGAL applications stays on customer experience and mobility, because these two are the new drivers of growth for any business of any capacity. That is how GOFRUGAL empowers businesses to go digital to stay ahead of their competition and sustain enough agility to grow and succeed continuously.


    We shall be continuing about GOFRUGAL in the concluding post. There are many more interesting facts about this wonderful venture initiated by Kumar Vembu.

    August 19, 2019  10:58 PM

    Care For Data Accuracy? Go For Golden-Record-As-A-Service @Naveego

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data, Master data management, MDM

    With the launch of its next-generation data accuracy platform, Naveego has marked another landmark in the field. This next-gen data accuracy platform, with self-service MDM and advanced security features, delivers a golden record across all enterprise data systems. In fact, this marks the launch of a powerful Golden-Record-as-a-Service offering, eliminating the need for costly IT resources. As a matter of fact, it results in 5x faster deployment and a humongous 80% cost saving over legacy solutions. In a short span of time, Naveego has emerged as a leader in cloud-first distributed data accuracy solutions, and with this launch it has set itself well above its counterparts in the market. The platform comes bundled with self-service Master Data Management (MDM) and Golden-Record-as-a-Service (GRaaS).

    It has become a boon for non-technical business users, letting them manage technology in a much easier and more accurate manner. This powerful offering empowers non-technical executives in an organization to acquire the data they need for advanced analytics without any intervention from the IT department or any professional services. GRaaS has many more powerful benefits that help a business rise to new heights. For instance, it ensures that a single version of the data exists for all business verticals in an organization, so they can all bank on the same data for their respective analytics and reporting. Interestingly, it results in an 80% reduction in cost, and implementation goes five times faster than with legacy solutions.

    Data Accuracy – How Critical?

    This next-generation platform includes an advanced, patent-pending security mechanism that ensures merging and consistency checks without decrypting data, and without the platform even requiring access to the encryption key. The best thing is that it does not require any customization or infrastructure change, which results in a low total cost of ownership (TCO). Moreover, as it is a complete solution, it eliminates the need for highly skilled individuals to implement and maintain the system, which is another major benefit for an enterprise. As we all know, data is expanding in every organization at a tremendous, exponential speed. This is because of the adoption of the latest technologies like artificial intelligence (AI), machine learning (ML), and the internet of things (IoT), plus heterogeneous devices including mobile devices, autonomous vehicles, and a large number of other sources in the ecosystem that sit outside traditional data centers.


    All this has created a pressing requirement for data cleansing, which is becoming quite cumbersome for enterprises, as well as highly expensive. According to Gartner, maintaining bad data costs organizations $15 million per year on average. In addition, there are other heavy costs that have become a big headache for enterprises, including the high price of legacy systems, customization of existing systems, and so on. Bad data imposes a burden of $3.1 trillion every year on the US economy. For an individual organization, that figure might look irrelevant, but it is not: at approximately $100 per incorrect record, a company with, for instance, 50,000 incorrect records will incur a cost of $5 million per year to maintain them (https://hbr.org/2017/09/only-3-of-companies-data-meets-basic-quality-standards).
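The per-record figure cited above multiplies out directly:

```python
# The HBR-cited arithmetic: incorrect records times annual maintenance
# cost per record.
incorrect_records = 50_000
cost_per_record = 100          # dollars per incorrect record per year

print(incorrect_records * cost_per_record)   # 5000000 -> $5 million/year
```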


    In the absence of a proper mechanism, the cost of maintaining incorrect data will keep rising exponentially every year, given the speed at which data is increasing. Another burning issue for organizations is the scrubbing and prepping of data, for which they have to hire or outsource high-wage data scientists. Reports indicate that 80% of these data scientists’ time goes into collecting and cleansing inaccurate digital data, without which the data cannot be used for analysis. Naveego terms this ‘data janitor work’: it doesn’t match the skills of data scientists, but unfortunately it eats up most of their time, even though they are hired to focus on the highly skilled job of data analysis.

    That, in fact, creates a vicious circle for organizations, one they will never escape, and to which they will sooner or later succumb, unless they adopt a powerful system like Naveego’s next-generation data accuracy platform, with self-service MDM and advanced security features, to ensure a golden record across all enterprise data systems. Now let us understand how Naveego explains the emergence and importance of the golden record. Naveego’s complete data accuracy platform supports hybrid and multi-cloud environments, providing a distributed data accuracy solution. It proactively manages, identifies, and eliminates customer data accuracy problems across all enterprise data sources, resulting in a single golden record and ensuring data consistency across the enterprise. In turn, it prevents data lakes from becoming data swamps.
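A toy illustration of golden-record consolidation: records for the same customer arrive from several systems, and the merged “golden” record keeps the most recently updated non-empty value for each field. The field names and the “latest non-empty value wins” rule are assumptions for illustration, not Naveego’s actual matching logic:

```python
# Merge duplicate customer records from multiple source systems into one
# golden record: process records oldest-first, so later, non-empty values
# overwrite earlier ones.
def golden_record(records):
    merged = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if field != "updated" and value:
                merged[field] = value
    return merged

crm = {"name": "ACME Corp", "phone": "", "city": "Detroit", "updated": 1}
erp = {"name": "ACME Corporation", "phone": "555-0100", "city": "", "updated": 2}
print(golden_record([crm, erp]))
# {'name': 'ACME Corporation', 'city': 'Detroit', 'phone': '555-0100'}
```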


    The solution talks to Kubernetes, Apache Kafka, and Apache Spark, thereby ensuring rapid deployment, distributed processing, and flawless data integration. The data may reside anywhere, in the cloud or on-premises, and the platform supports all kinds of hybrid and multi-cloud environments. Naveego ensures the accuracy of data of any volume, with real-time streaming from multiple data sources in any environment, irrespective of schema or structure. The key features of Naveego’s next-generation data accuracy platform include Self Service, Golden-Record-as-a-Service, Golden Record C, automated profiling of data sources at the edge (machine learning), automated profiling of any data source including IoT, automatic data quality checks driven by machine learning, and so on.

    Michael Ger, General Manager, Automotive and Manufacturing Solutions, Cloudera says, “Companies across all industries are reimagining themselves within a digitally transformed future. Central to that future is leveraging a data tsunami resulting from newly connected consumers, products and processes. Within this context, data quality has taken on critical new importance. The Naveego data accuracy platform is critical for enabling traditional approaches to business intelligence as well as modern-day big data analytics. The reason for this is clear – actionable insights start with clean data, and that’s exactly what the Naveego platform delivers.”


    Katie Horvath, CEO, Naveego says, “The ability to achieve golden record data has typically been available only by hiring a systems integrator or other specialist, at a high cost and TCO to the enterprise. The next generation of our Data Accuracy Platform is truly a game-changer, empowering business users to access trusted data across all data types for analytics purposes, entirely on their own with an easy to use, flow-oriented user interface – and at a significantly lower cost. This is sure to disrupt pricey legacy solutions that require vast amounts of professional resources and on average five times longer to deploy.”

    August 19, 2019  9:55 PM

    @Tachyum Joins PCI-SIG for Ultimate Performance of Data Center

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja

    Tachyum has joined PCI-SIG in order to optimize performance for data center AI and HPC workloads. This highly scalable, high-speed I/O solution applies to the numerous market applications empowered by Tachyum’s processor technology. Let us first understand what PCI and SIG stand for: PCI stands for Peripheral Component Interconnect, and SIG stands for Special Interest Group. PCI-SIG is an association of over 700 members committed to advancing non-proprietary PCI technology for high-speed I/O in various market applications. The PCI Express expansion bus is today the de facto interconnect between CPUs and peripherals. With increasing demand for higher-performance I/O, the scope and ecosystem reach of PCI keeps expanding, and the latest technology roadmaps, including PCI Express, focus on new form factors and lower-power applications.


    This is where association members come in: they collaborate in open communities to define, test, and refine specifications so that companies can bring PCI-compliant devices to market. As PCIe technology evolves, the bandwidth available to graphics cards, hard drives, Wi-Fi, Ethernet cards, SSDs, and so on keeps doubling. The fourth generation of the PCIe standard supports bandwidth of 64 GB per second, twice that of the PCIe 3.0 interface. PCIe 5.0 will double bandwidth again, to 128 GB per second. Tachyum is doing a great job of integrating PCIe, based on customer requirements, for storage, peer-to-peer clusters, AI, and as endpoints for accelerator applications.
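The doubling pattern in the bandwidth figures quoted above (for a full x16 link, counting both directions) can be expressed directly:

```python
# Each PCIe generation doubles the aggregate x16 bandwidth of the
# previous one; PCIe 3.0 at 32 GB/s is taken as the baseline, matching
# the figures quoted in the post.
def pcie_x16_bandwidth_gbps(gen, base_gen=3, base_gbps=32):
    return base_gbps * 2 ** (gen - base_gen)

for gen in (3, 4, 5, 6):
    print(f"PCIe {gen}.0: {pcie_x16_bandwidth_gbps(gen)} GB/s")
# PCIe 3.0: 32, 4.0: 64, 5.0: 128, 6.0: 256 GB/s
```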


    Beyond raw performance, these advances to PCIe bring improved support for memory coherency built into the standard, which protocols like CCIX and CXL extend to multi-core processors. This ensures that all copies of data stay coherent. Tachyum’s Prodigy Universal Processor chip delivers industry-leading advances in performance, energy consumption, space requirements, and data center server utilization, in fully coherent multiprocessor environments utilizing PCIe. Amazingly, Prodigy is the smallest and fastest general-purpose 64-core processor produced to date. It requires 10x less processor power and reduces processor cost by 3x. Prodigy directly enables a 32-tensor-exaflop supercomputer, thereby allowing the creation of systems more powerful than the human brain by 2021. That will be one of the biggest landmarks in technology.

    Prodigy reduces annual data center total cost of ownership (TCO) by 4x. All this happens because of Tachyum’s disruptive processor design, embedded with a smart compiler that makes a number of hardware parts found in today’s typical processors redundant. The result is fewer transistors, simpler cores, fewer and shorter wires, greater speed, and huge power efficiency for the Prodigy processor. Basically, it is now for businesses to decide when to adopt Prodigy: now, to make things simpler, or later, when things have become far more complicated.

    Dr. Radoslav Danilak, CEO of Tachyum says, “Much has been made about the death of Moore’s Law and how the possibility to improve density, power efficiency and cost benefits in the semiconductor industry into the future is problematic. But the advances seen among PCI shows that this simply isn’t true. PCI-SIG already announced PCIe 6.0, which would double bandwidth again within 3 years. We, at Tachyum, will also ensure that the processor will not be the limiting factor either; it just requires a more innovative approach. We are glad to join the efforts of our fellow members in PCI-SIG in improving the speed capabilities of the next PCIe standard to support the performance needs of the data center, AI and Big Data workloads.”

    August 16, 2019  10:36 PM

    Works with SwiftStack Simplifies Cloud Storage Deployments @SwiftStack

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Cloud storage, SwiftStack

    SwiftStack’s new technology partner program, along with ‘Works with SwiftStack’, is all about validating integrations with modern applications that use the Amazon S3 API. This seamless integration is about the flexibility, scalability, and simplicity of cloud storage, covering public cloud platforms like Amazon S3 and Google Cloud Storage as well as SwiftStack’s private and multi-cloud storage. So basically, SwiftStack has launched ‘Works with SwiftStack’ and its technology partner program in order to simplify cloud storage deployments for its customers. As we all know, SwiftStack is a leader in multi-cloud data storage and management. Its latest launch gives customers confidence that validated integrations have gone through exhaustive testing to ensure compatibility with SwiftStack’s object-based cloud storage solution. SwiftStack’s state-of-the-art storage and multi-cloud data management software works well with modern applications using the Amazon S3 API as well as the open-standard Swift API.

    As a matter of fact, many commercial software applications have been created or re-architected to take advantage of these APIs, so that a large number of customers can experience the flexibility, scalability, and simplicity of cloud storage, both on public cloud platforms like Amazon S3 and Google Cloud Storage and on SwiftStack’s private and multi-cloud storage. The APIs and their documentation are easily accessible and help software developers validate their implementations for enhanced functionality and the best user experience. Basically, the SwiftStack technology partner program and the Works with SwiftStack feature empower software developers and vendors with a unique method not only to validate their software’s functionality with SwiftStack but also to expand a business relationship for the mutual benefit of the vendor, SwiftStack, and end users of the joint solution.


    There are multiple benefits to joining the SwiftStack technology partner program, including expanded customer reach through integration with industries that need a multi-cloud storage and data management platform. Partners achieve validation with the help of an easy-to-follow self-test plan, gaining remarkable results. Partners also get access to a proprietary set of SwiftStack solutions and a dedicated engineer to extract the best value from the joint offering. The Works with SwiftStack program helps customers with multiple cloud vendors work jointly to resolve any unexpected issues, and it also helps customers developing their own in-house applications check whether their implementation functions correctly.
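A self-test plan of the kind described might boil down to a read-after-write round trip; the in-memory store below is a hypothetical stand-in for a real S3-compatible endpoint, not SwiftStack’s actual validation suite:

```python
# Sketch of a partner self-test for an S3-style object store: write an
# object, read it back, list it, delete it, and check each step.
class InMemoryStore:
    def __init__(self):
        self.objects = {}
    def put(self, key, data):
        self.objects[key] = data
    def get(self, key):
        return self.objects[key]
    def list(self, prefix=""):
        return sorted(k for k in self.objects if k.startswith(prefix))
    def delete(self, key):
        del self.objects[key]

def run_self_test(store):
    store.put("validate/hello.txt", b"hello")
    assert store.get("validate/hello.txt") == b"hello"       # read-after-write
    assert store.list("validate/") == ["validate/hello.txt"]  # listing
    store.delete("validate/hello.txt")
    assert store.list("validate/") == []                      # cleanup
    return "PASS"

print(run_self_test(InMemoryStore()))   # PASS
```

A real validation run would point the same checks at the actual storage endpoint instead of the in-memory fake.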

    Cloud Storage Works With SwiftStack

    Source: SwiftStack.com

    As a matter of fact, more than 30 applications have already been verified as part of the Works with SwiftStack program. The complete list of vetted partners can be accessed here, and technology partners interested in joining the SwiftStack Technology Partner program can get complete information here. SwiftStack was founded in 2011 by a group of experts in cloud computing and, in a short span of 8 years, has become a leading cloud storage provider. Organizations requiring universal access to petabytes of unstructured data in a single namespace find SwiftStack the best solution. SwiftStack software is in high demand across many business verticals, like global service providers, life sciences, web-based businesses, and media and entertainment.


    SwiftStack suits industries working in fields like artificial intelligence, machine learning, analytics, active archive, and scientific research, and those who need to manage data across multiple clouds. SwiftStack customers include industry leaders like eBay, Verizon, HudsonAlpha Institute for Biotechnology, and Pac-12 Networks.

    August 14, 2019  9:13 AM

    @EndemolShineUS Banks On @Nexsan Unity Unified Storage

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Nexsan, Unified storage

    This is what I call real-life validation of your technological advancements: a company as large as Endemol Shine North America deploying the Nexsan Unity unified storage solution. Nexsan recently announced that Nexsan Unity now serves as the backbone of Endemol Shine North America’s file-based workflow, empowering its complete editorial mechanism. Endemol Shine North America is one of the pioneers in delivering global-standard content and exemplary storytelling. It serves multiple platforms in the U.S. and across a large number of countries worldwide. Endemol Shine Group creates, produces, and distributes global content and has diversified into various industry verticals. It is, in fact, known for a number of hit television programs and series across the globe. So, when such a company with a top-level presence across the globe adopts a technology solution for its unified storage, it has to be best-in-class.

    If you notice, file-based editorial workflows and editing processes are the key elements for a business like Endemol Shine North America. And now, this is being taken care of by the Nexsan Unity unified storage solution. Nexsan®, in fact, is part of the StorCentric® group and, as I said above, a global leader in unified storage solutions, including archive and high-volume storage. As a matter of fact, Endemol Shine was already using Nexsan E-Series high-density storage arrays, themselves best-in-class products. To further enhance its capabilities, Endemol Shine has added the Nexsan Unity™ unified storage solution, which now takes care of Endemol Shine’s entire editorial process.

    Unified Storage mastered by Nexsan Unity

    Global hit shows like Big Brother (CBS), The Real Housewives of Atlanta (Bravo), MasterChef & MasterChef Junior (FOX), Extreme Makeover: Home Edition (HGTV), and The Biggest Loser (USA) are all products of the Endemol group.

    Unified Storage

    Source: Nexsan.com

    As Alex Palatnick, Vice President, Post Production, Endemol Shine North America says, “Just a few years ago, it was standard to shoot a conventional television show on videotape and work in proxy resolution because editing at high resolution was prohibitive. This offline and then online editing process to conform to each episode was labor-intensive and required multiple assistant editors over two shifts along with expensive broadcast decks. And that was just for one episode. We wanted to overcome these challenges, and instead design a workflow that would allow every edit client to be able to access the media to transcode it for editorial. At times, this would mean as many as 150 simultaneous editors. We designed a new file-based workflow that would require top-notch stable equipment to support it. That’s where Nexsan came in.”

    Palatnick continues, “Today, Nexsan houses our camera masters. When we go into production, and we are doing acquisition – shooting a television show, all of that content lives full-time, until the show wraps, on Nexsan. Our storage must always be in perfect operating condition, and it must be flexible. It must be able to expand and support continuous, planned and unplanned content creation. And, it must enable us to protect, find and retrieve any content, at any time – and that is exactly what Nexsan has provided.”

    Unified Storage has a new benchmark with Nexsan Unity

    It is not that it was a cakewalk for Nexsan. In fact, Endemol Shine North America considered most of the other solutions in the fray for the media and entertainment (M&E) vertical, but none other than Nexsan could meet all of its criteria. Mihir Shah, CEO of StorCentric, the parent company of Nexsan, says, “Digitization has become the paradigm in the M&E industry. As new techniques and technologies continue to enter the space, driven by leading-edge innovators such as Endemol Shine North America, data storage requirements will continue to skyrocket. Nexsan’s storage portfolio is able to uniquely support the rigorous requirements of the M&E industry, at a price point that ensures unparalleled ROI.”

    August 8, 2019  12:06 AM

    State of Critical Application Availability In Cloud @SIOSTech

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    High Availability

    As heads of technology of their respective organizations, how many CIOs or CTOs are actually aware of the state of critical business application availability in cloud and hybrid cloud environments? I doubt that more than 20% of them have a real-time alert mechanism for when the situation turns from green to orange or red. Otherwise, it is mostly the end user who acts as the alarm that gets the IT department into action in such cases. If that is the case, it is not at all a healthy situation. Many CIOs, and even businesses, live under the misconception that just because their application is in the cloud, they have high availability. As a matter of fact, I have seen many enterprises migrate to the cloud with the sole notion in mind that moving to the cloud means high availability for their business applications.

    Merely migrating to or residing in the cloud is not synonymous with high availability, because in most cases standard cloud offerings don’t include high availability. In fact, that is where your role comes in: deciding the best of the many options available depending on your business needs. But logically, when it is about mission-critical applications, it means effectively no downtime, which limits a CIO to far fewer choices. Jerry Melnick, president and CEO of SIOS Technology, gives some very valuable insights on how a business can ensure high availability for critical applications in the cloud. Basically, it is very important for a business to understand the promises and myths of High Availability (HA) in the cloud. While selecting your cloud vendor, as a CIO you must be very clear about the HA options and solutions available in the cloud.

    State of High Availability

    As Jerry Melnick says,

    “As IT looks to move their most business-critical applications from the data center to cloud and reap the benefits, they are confronted with a seemingly vast set of choices on how to assure availability and data protection during cloud outages. Sifting through the capabilities and promises the cloud providers market and understanding how they can use these with other technologies to achieve the required SLA’s is a daunting task for even the most seasoned IT veterans.”

    As a matter of fact, the technology arm of a business must be very clear about the capabilities and shortcomings of its cloud vendor. Foremost is to understand in what ways the cloud’s facilities are going to address your HA requirements. You must be aware of the gaps while formulating a comprehensive strategy that aims to cover critical component failures with timely and reliable recovery.

    Going a little deeper into the technology design of an enterprise-critical application, it must be very clear what application-specific solutions, such as SQL Server availability groups, and cloud vendors are actually providing. As Jerry says,

    “Availability is about reliably managing redundancy of all components of the application and infrastructure stack and recovering effectively.  Achieving reliable fault detection, reliable fault recovery, and covering the full scope of errors needed to be detected are essential for handling service outages. The cloud provides many facilities and resources that could theoretically be assembled to do all this. However, the practicality of designing, creating, testing, and maintaining a solution to accurately perform when you need it most is beyond most IT shops.”
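
    The fault-detection and recovery loop described in that quote can be sketched in a few lines. This is an illustrative skeleton only, not any vendor’s implementation; real HA clustering adds quorum, fencing, and data replication on top of it.

```python
# Minimal sketch of the detect-and-recover loop behind HA failover.
# Node names, probe logic, and thresholds are illustrative only.

def check(node: dict) -> bool:
    """Health probe; in practice an application-level check, not just a ping."""
    return node["healthy"]

def failover(nodes: dict, active: str, max_retries: int = 3) -> str:
    """Promote a standby when the active node fails repeated probes."""
    for _ in range(max_retries):          # retry to avoid flapping on one bad probe
        if check(nodes[active]):
            return active                 # active is fine, nothing to do
    for name, node in nodes.items():      # reliable recovery: first healthy standby
        if name != active and check(node):
            return name
    raise RuntimeError("no healthy node left: total outage, invoke DR plan")

nodes = {"primary": {"healthy": False}, "standby": {"healthy": True}}
print(failover(nodes, "primary"))  # → standby
```

    The hard parts Jerry points to are exactly what this sketch glosses over: making `check` detect the full scope of application, storage, and network faults, and making the promotion step safe when the old primary is only partially dead.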

    If you are clear about the challenges of creating a reliable availability and DR strategy from cloud components and homegrown designs, you are probably driving in the right direction.

    High Availability Strategy

    We must very clearly understand that the key requirements of a highly effective HA/DR strategy include the scope of coverage, a good provision for anticipating failures well in advance, and reliability in responding to them. As a matter of fact, the challenges of developing custom designs from cloud and on-premises environments are not easy to understand and tackle, given the complexity and scope of the problem and the need to support the objectives of a critical application with appropriate SLAs. The primary goal of an IT solution architect should be to reliably protect business-critical applications while assessing commercially developed and market-tested solutions for HA/DR. HA/DR solutions from SIOS have been developed and enhanced over the years by best-in-class industry experts who minutely understand the key issues needing immediate attention. Solutions like these are tested and verified, thus ensuring reliable operations in any situation.

    High Availability

    Basically, the key parameters for assessing a great HA/DR solution are its capabilities to manage operations across a wide range of environments, workloads, and configurations. It is important to restrict the variety of technologies in an enterprise environment in order to ascertain the right approaches to achieving HA/DR across the organization. Each approach will require expert training and support, something most organizations either fail to understand or ignore. In fact, multiple approaches would need the development and execution of a number of methodologies to test and validate under real-world conditions: a wide variety of environments, multiple operational variables, and different use cases. Obviously, the implementation effort, and the complexity, will increase with the number of different technologies and approaches employed for HA/DR. That is why certifications by AWS and Microsoft for the complete technology stack are essential.

    High Availability or HA and DR go hand in hand

    Looking at all these realities and possibilities, Jerry Melnick concludes saying,

    “Traditional, commercial solutions such as HA clustering, have evolved and matured over many years. Their design uniquely addresses downtime by covering the full stack detecting and recovering from errors in the application, storage, network, or any level of the infrastructure layer. These capabilities give IT the flexibility to configure HA and DR systems using the cloud, cloud regions and zones, as well as physical and virtual data center resources, to reliably deliver the SLAs they need – so customers get their services and IT can sleep well at night.”

    August 6, 2019  11:37 PM

    PCIe Gen4 Storage Empowered with a Portfolio of Products From Phison

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    NVMe, PCIe

    Phison has become synonymous with industry-first products that prove its leadership in the global market. This time it is about hitting the PCIe Gen4 storage market with a portfolio of products. As a matter of fact, Phison is the first and only company shipping PCIe® Gen4x4 NVMe SSD solutions. That makes it the industry leader enabling high-performance computing for high-speed, high-volume, bandwidth-hungry applications catering to millions of transactions and large volumes of data movement. Phison has become a landmark in itself and a benchmark for others in developing these solutions. The solutions it provides are raising application expectations to a new level, meeting the requirements of faster and higher-definition digital transactions. In fact, these expectations are increasing exponentially with the rapid adoption of newer technologies like big data, the Internet of Things, machine learning, artificial intelligence, and virtual/augmented reality.

    Very few of us know that the PCIe 4.0 standard doubles the data transfer rate of its predecessor, PCIe 3.0. This doubled data transfer rate, along with the signal reliability and integrity of PCIe Gen4, empowers technology solution providers to deliver higher performance, enhanced flexibility, and decreased latency to a huge range of applications, including PC, mobile, gaming, networking, and storage.
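
    For the curious, the doubling is easy to verify from the raw signaling rates: PCIe 3.0 runs at 8 GT/s per lane and PCIe 4.0 at 16 GT/s, both using 128b/130b encoding (130 bits on the wire carry 128 bits of data).

```python
# Back-of-envelope check of the "double the transfer rate" claim.
# 8 GT/s (Gen3) vs 16 GT/s (Gen4) per lane, 128b/130b encoding.

def pcie_gbps(gt_per_s: float, lanes: int) -> float:
    """Usable bandwidth in GB/s for a PCIe 3.0/4.0 link."""
    return gt_per_s * (128 / 130) * lanes / 8  # bits -> bytes

gen3_x4 = pcie_gbps(8, 4)    # ≈ 3.94 GB/s
gen4_x4 = pcie_gbps(16, 4)   # ≈ 7.88 GB/s
print(f"Gen3 x4: {gen3_x4:.2f} GB/s, Gen4 x4: {gen4_x4:.2f} GB/s")
```

    So a Gen4x4 NVMe SSD has roughly 7.9 GB/s of link bandwidth to work with, against roughly 3.9 GB/s on Gen3x4, which is what lets Gen4 drives push past the ~3.5 GB/s ceiling of Gen3 designs.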

    Chris Kilburn, corporate vice president and general manager, Client Channel, AMD, says, “There is continued pressure in the industry to improve the performance of computing systems to support the applications that end users are most interested in. PCIe 4.0 offers manufacturers a way to meet these consumer demands. AMD is delighted to work with Phison to raise the bar by introducing first-to-market solutions. Through sound engineering and design, we are working together to deliver the experiences our customers demand.”

    PCIe Gen4 Storage

    Sumit Puri, CEO and Co-Founder of Liqid, says, “PCIe Gen4 will unleash the performance capabilities required for next-generation data-centric applications, including artificial intelligence and 5G edge computing. The LQD4500 provides 32TB of capacity and a PCIe Gen4x16 interface that enables over 24GB/s of throughput and 4 million IOPS. This impressive performance is only possible by aggregating multiple Phison E16 NVMe controllers into a single device. The Phison E16 provides industry-leading performance, capacity and NVMe features required to build the PCIe Gen4 enabled data center of the future. Liqid is excited that the Phison E16 is now powering the fastest storage in the world, the LQD4500.”


    Phison’s stronghold in memory technology, innovation in flash memory products, and excellence in engineering are the key factors establishing it as a market leader known for its first-to-market expertise. While most of its peers in the market are yet to debut Gen4 solutions, Phison has developed a package of products to cater to multiple sockets within the consumer space. The new portfolio of PCIe Gen4x4 NVMe products will be released within a year’s timeframe and includes the PS5016-E16, PS5019-E19T, and PS5018-E18, which will become available in that order.


    K.S. Pua, CEO of Phison Electronics says, “After several years since the announcement of the standard, the era of PCIe 4.0 solutions is upon us and Phison is at the forefront of this movement with our portfolio of Gen4x4 solutions. We pride ourselves with our long history of innovation supporting emerging technologies. From doubling transfer rates to improving power consumption to increasing performance, Phison-based SSD solutions allow our integration partners to deliver the next-generation PC, gaming and storage systems needed to satisfy increasing consumer demand.”

    Phison is showcasing its products at the Flash Memory Summit (FMS), August 6-8 in Booth No. 219 at the Santa Clara Convention Center in Santa Clara, California.

    July 31, 2019  11:49 PM

    Designing Data-Intensive Applications @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja

    Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems 1st Edition, Kindle Edition by Martin Kleppmann

    Excerpt from Amazon.com

    Data is at the center of many challenges in system design today. Difficult issues need to be figured out, such as scalability, consistency, reliability, efficiency, and maintainability. In addition, we have an overwhelming variety of tools, including relational databases, NoSQL datastores, stream or batch processors, and message brokers. What are the right choices for your application? How do you make sense of all these buzzwords?

    In this practical and comprehensive guide, author Martin Kleppmann helps you navigate this diverse landscape by examining the pros and cons of various technologies for processing and storing data. The software keeps changing, but the fundamental principles remain the same. With this book, software engineers and architects will learn how to apply those ideas in practice, and how to make full use of data in modern applications.

    Peer under the hood of the systems you already use and learn how to use and operate them more effectively
    Make informed decisions by identifying the strengths and weaknesses of different tools
    Navigate the trade-offs around consistency, scalability, fault tolerance, and complexity
    Understand the distributed systems research upon which modern databases are built
    Peek behind the scenes of major online services, and learn from their architectures

    From the Publisher:

    Who Should Read This Book?
    If you develop applications that have some kind of server/backend for storing or processing data, and your applications use the internet (e.g., web applications, mobile apps, or internet-connected sensors), then this book is for you.

    This book is for software engineers, software architects, and technical managers who love to code. It is especially relevant if you need to make decisions about the architecture of the systems you work on—for example if you need to choose tools for solving a given problem and figure out how best to apply them. But even if you have no choice over your tools, this book will help you better understand their strengths and weaknesses.

    You should have some experience building web-based applications or network services, and you should be familiar with relational databases and SQL. Any non-relational databases and other data-related tools you know are a bonus, but not required. A general understanding of common network protocols like TCP and HTTP is helpful. Your choice of programming language or framework makes no difference for this book.

    If any of the following are true for you, you’ll find this book valuable:
    You want to learn how to make data systems scalable, for example, to support web or mobile apps with millions of users.
    You need to make applications highly available (minimizing downtime) and operationally robust.
    You are looking for ways of making systems easier to maintain in the long run, even as they grow and as requirements and technologies change.
    You have a natural curiosity for the way things work and want to know what goes on inside major websites and online services. This book breaks down the internals of various databases and data processing systems, and it’s great fun to explore the bright thinking that went into their design.

    Sometimes, when discussing scalable data systems, people make comments along the lines of, ‘You’re not Google or Amazon. Stop worrying about scale and just use a relational database’. There is truth in that statement: building for scale that you don’t need is wasted effort and may lock you into an inflexible design. In effect, it is a form of premature optimization. However, it’s also important to choose the right tool for the job, and different technologies each have their own strengths and weaknesses. As we shall see, relational databases are important but not the final word on dealing with data.

    Scope of This Book
    This book does not attempt to give detailed instructions on how to install or use specific software packages or APIs since there is already plenty of documentation for those things. Instead, we discuss the various principles and trade-offs that are fundamental to data systems, and we explore the different design decisions taken by different products.

    We look primarily at the architecture of data systems and the ways they are integrated into data-intensive applications. This book doesn’t have space to cover deployment, operations, security, management, and other areas—those are complex and important topics, and we wouldn’t do them justice by making them superficial side notes in this book. They deserve books of their own.

    Many of the technologies described in this book fall within the realm of the Big Data buzzword. However, the term ‘Big Data’ is so overused and underdefined that it is not useful in a serious engineering discussion. This book uses less ambiguous terms, such as single-node versus distributed systems, or online/interactive versus offline/batch processing systems.

    This book has a bias toward free and open-source software (FOSS) because reading, modifying, and executing source code is a great way to understand how something works in detail. Open platforms also reduce the risk of vendor lock-in. However, where appropriate, we also discuss proprietary software (closed-source software, software as a service, or companies’ in-house software that is only described in the literature but not released publicly).

    July 31, 2019  11:41 PM

    NAND Flash Memory Technologies @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja

    NAND Flash Memory Technologies (IEEE Press Series on Microelectronic Systems) 1st Edition, Kindle Edition by Seiichi Aritome (Author)

    Offers a comprehensive overview of NAND flash memories, with insights into NAND history, technology, challenges, evolutions, and perspectives

    Describes new program disturb issues, data retention, power consumption, and possible solutions for the challenges of 3D NAND flash memory

    Written by an authority in NAND flash memory technology, with over 25 years’ experience

    From the Back Cover
    Examines the history, basic structure, and processes of NAND flash memory

    This book discusses basic and advanced NAND flash memory technologies, including the principle of NAND flash, memory cell technologies, multi-bits cell technologies, scaling challenges of the memory cell, reliability, and 3-dimensional cell as the future technology. Chapter 1 describes the background and early history of NAND flash. The basic device structures and operations are described in Chapter 2. Next, the author discusses the memory cell technologies focused on scaling in Chapter 3 and introduces the advanced operations for multi-level cells in Chapter 4. The physical limitations for scaling are examined in Chapter 5, and Chapter 6 describes the reliability of NAND flash memory. Chapter 7 examines 3-dimensional (3D) NAND flash memory cells and discusses the pros and cons in structure, process, operations, scalability, and performance. In Chapter 8, challenges of 3D NAND flash memory are discussed. Finally, in Chapter 9, the author summarizes and describes the prospect of technologies and market for the future NAND flash memory.

    NAND Flash Memory Technologies is a reference for engineers, researchers, and designers who are engaged in the development of NAND flash memory or SSD (Solid State Disk) and flash memory systems.

    About the Author
    Seiichi Aritome was a Senior Research Fellow at SK Hynix Inc. in Icheon, Korea from 2009 to 2014. He has contributed to NAND flash memory technologies for over 27 years in several companies and nations. Aritome was a Program Director at Powerchip Semiconductor Corp. in Hsinchu, Taiwan, a Senior Process Reliability Engineer at Micron Technology Inc. in Idaho, USA, and a Chief Specialist at Toshiba Corporation in Kawasaki, Japan. He received his Ph.D. from Graduate School of Advanced Sciences of Matter, Hiroshima University, Japan. Aritome is an IEEE Fellow and a member of the IEEE Electron Device Society.

    July 31, 2019  11:01 PM

    Phison Showcases Technology Innovation Leadership at FMS2019

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Flash memory, NVMe, PCIe, PCIe SSD, SSD

    If you are at the Flash Memory Summit, which runs from August 6 to August 8 at the Santa Clara Convention Center in Santa Clara, California, then don’t forget to visit booth number 219, because you are going to witness some of the most innovative technology there. Phison is one of the pioneering companies delivering best-in-class SSD solutions. In fact, Phison is the only company with PCIe Gen4x4 NVMe SSD solutions at Flash Memory Summit. There will be a lot of partner demos and interesting panel participation at the summit. Phison brings its technology innovation leadership on full display at Flash Memory Summit 2019. Phison Electronics is the industry leader in flash controller and NAND solutions. It will showcase its lineup of PCIe Gen4 SSD solutions, including the public debut of its power-conscious PS5019-E19T controller, at booth number 219.


    As a matter of fact, it is the first and only company ready with PCIe Gen4x4 NVMe SSD solutions. There will be a demonstration of how its controllers push the boundaries of low power consumption and high performance for storage. In fact, it is being shown publicly for the first time how Phison’s E19T controller offers low power consumption for mainstream drives while promising best-in-class power savings and reduced cooling needs in data centers. That is, of course, a phenomenal achievement. In addition, Phison is offering a preview of the company’s next-generation Gen4x4 PS5018-E18 controller, an optimized design whose PCIe 4.0 interface gives it high-performance advantages. This enables the company to mark a new achievement in performance leadership in Gen4 SSDs.

    Phison Elevates Scale in SSD Solutions

    That is not all. The company will also showcase, for the first time at FMS, its PS5013-E13T 1113 BGA SSD. With this, customers get all the advantages of flash technology in an ultra-thin, ultra-compact 1113 BGA form factor. The E13T BGA SSD can perform up to 1.7 GB per second sequential read and 1.1 GB per second sequential write while consuming only 1.5 watts, which helps prolong the battery life of any embedded solution. At the same booth, there will be more demonstrations from Phison’s technology partners Liqid and Cigent Technology Inc. Liqid will showcase its ultra-high-performance Gen4 NVMe full-length, full-height add-in card, the LQD4500, powered by Phison’s E16 controller. It is capable of 5 million IOPS and 24 GB per second throughput, and the card is available in capacities up to 32 TB.

    On the other hand, Cigent will demonstrate its Dynamic Data Defense Engine (D3E™) for Windows 10. D3E, when paired with a Phison E12-based SSD, helps prevent the exfiltration of sensitive data as soon as a system is compromised. As a matter of fact, the Phison E12 allows D3E to support “on the fly” firmware-based folder locking, so that the moment the threat level is elevated, the locked folders can only be accessed with higher-level authentication.

    Phison Electronics

    K.S. Pua, CEO, Phison Electronics says, “Whether in the audience at one of our speaker presentations or stopping by our booth for a demonstration of our next-generation technologies, FMS attendees will have an excellent opportunity to learn how Phison is leading the way in delivering high-performance solutions that meet the ever-increasing needs of the data storage market. FMS is the ideal setting for us to demonstrate this leadership, as well as the perfect venue to publicly show our E19 for the first time.  We look forward to a great show.”

    July 30, 2019  11:33 PM

    Storage Economics Sets A New Benchmark with S1aaS @StorOne_Inc

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Backup storage, Data storage software, Dell, FPGA, Mellanox, Storage

    Storage economics gets a new paradigm shift with the announcement of S1-as-a-Service (S1aaS) from StorOne. What it means is that a comprehensive enterprise storage solution is now feasible on a very cost-effective subscription model that never existed before, nor had it been imagined or initiated by any other venture so far. StorOne’s innovation behind the S1 storage software platform brings S1aaS. S1aaS, in simple terms, can be explained as a usage-based solution integrating the enterprise-class S1 storage service with Dell Technologies and Mellanox hardware. This integrated solution showcases the next level of storage economics, where the industry can define its price point. The transformation delivers a perfect balance between enterprise-level performance and data protection capabilities. It also provides the best answer to a long-pending question that has hovered around without a solution: proper resource utilization, which StorOne has addressed by beautifully balancing security risks and performance impacts.

    Storage Economics

    These security risks and performance impacts are usually deeply connected with the cloud, which on one hand provides reliability and capabilities as good as those of an on-premises model and on the other hand guarantees you only pay for what you need to support your business requirements. And, in fact, all this happens flawlessly and in a very transparent manner. This is one of the best propositions for the customer, who gets the best of both worlds: cloud-like simplicity on one hand, and flexible pricing on the other. That is, the environment remains that of a cloud-based model, while the performance and control are like those of on-premises infrastructure.

    Storage Economics Achieves a New Landmark

    The pricing of this S1aaS model starts at $999 per month for an 18 terabyte (TB) all-flash array that performs up to 150,000 IOPS. This, as a matter of fact, is the most flexible model, with customer-definable pricing along with the best capacity and performance capabilities available in the market.
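
    A quick back-of-envelope on that entry price (list figures as quoted; actual quotes will vary) shows the unit economics behind the claim:

```python
# Unit economics of the quoted S1aaS entry point:
# $999/month for 18 TB of all-flash capacity at up to 150,000 IOPS.

monthly = 999
capacity_tb = 18
iops = 150_000

print(f"${monthly / capacity_tb:.2f} per TB per month")
print(f"${monthly / iops * 1000:.2f} per 1,000 IOPS per month")
```

    That works out to roughly $55.50 per TB per month of all-flash capacity, which is the number to compare against both public cloud block storage pricing and the amortized cost of an owned array.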

    Gal Naor, CEO and co-founder of StorONE, says, “S1aaS is going to change the economics not only of storage but of the entire data center. S1aaS makes enterprise-class all-flash array performance and data protection and control available for only $999 per month. No other vendor can offer a complete storage solution – whether on-premises or in the cloud – for this low of a monthly cost.”

    George Crump, Founder, and Lead Analyst, Storage Switzerland says, “As high-performance, SAS and NVMe flash drives become commonplace in the data center, storage media is no longer the bottleneck to performance. The storage management layer is a problem. Vendors try to compensate by using more powerful processors, more RAM, custom FPGAs, and ASICs, as well as spreading I/O across dozens of flash drives, whose capacity is not needed. StorONE’s focus on efficiency – 150K IOPS from four conventional drives, an industry-defining capability – is the foundational component of S1aaS. It enables the democratization of storage performance previously unavailable to the data center.”

    Motti Beck, Senior Director Enterprise Market Development, Mellanox says, “Advanced storage solutions like this require high-performance, programmable and intelligent networks. The combination of StorONE’s S1 software and Mellanox Ethernet Storage Fabric solutions eliminate the traditional bottlenecks that have been associated with the server to storage communication and supports critical storage features, which improves data center efficiency and ensures the best user experience available.”

    Further information about S1aaS is available at http://bit.ly/2JWBb56

    July 24, 2019  11:44 PM

    NVIDIA Partner Network Strengthens With the entry of @SwiftStack @NVIDIA

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Artificial intelligence, Cloud data storage, Machine learning, NVIDIA, Public Cloud, SwiftStack

    According to the latest release, SwiftStack joins the NVIDIA Partner Network. This means a lot to major industries such as automobile, healthcare, and telecom. Autonomous vehicle, telecom, and healthcare organizations can now leverage large-scale artificial intelligence and machine learning data pipelines from edge to core to cloud, covering the complete spectrum. SwiftStack is a market leader in multi-cloud data storage and management. The company announced its entry into the NVIDIA Partner Network (NPN) program as a key solution provider for artificial intelligence and machine learning use cases. SwiftStack uses NVIDIA DGX-1 systems and the NGC container registry of GPU-optimized software in its latest state-of-the-art storage solution. The solution covers large-scale Artificial Intelligence (AI), Machine Learning (ML), and deep learning workflows that span edge-to-core-to-cloud data pipelines for the use cases mentioned above.

    NVIDIA Partner Network

    The NPN Solution Advisor Program gives NVIDIA customers full access to world-class solution experts with deep knowledge of enterprise-wide integration with NVIDIA DGX-1 clusters. Adding further value, SwiftStack’s AI/ML solution delivers massive storage parallelism and throughput to NVIDIA GPU compute and NGC. The use cases cover a wide range, including data ingest, training and inferencing, and data services, to support any kind of AI/ML workflow. On top of that, the SwiftStack 1space solution extends to the public cloud so that customers can benefit from cloud bursting and economies of scale, while the data stays secure on-premises.


    Amita Potnis, Research Director at IDC’s Infrastructure Systems, Platforms and Technologies Group says, “Infrastructure challenges are the primary inhibitor for broader adoption of AI/ML workflows. SwiftStack’s multi-cloud data management solution is the first of its kind in the industry and effectively handles storage I/O challenges faced by the edge to core to cloud, large-scale AI/ML data pipelines.”


    Shailesh Manjrekar, Head of AI/ML Solutions Marketing and Corporate Development at SwiftStack says, “The SwiftStack solution accelerates data pipelines, eliminates storage silos, and enables multi-cloud workflows, thus delivering faster business outcomes. Joining NVIDIA’s Partner Network program builds upon the success we are seeing with large-scale AI/ML data pipeline customers and endorses our value to these environments.”

    Craig Weinstein, Vice President, Americas Partner Organization at NVIDIA says, “NVIDIA AI solutions are used across transportation, healthcare and telecommunication industries. Our high-performance computing platform needs fast storage and SwiftStack brings on-premises, scale-out, and geographically distributed storage that makes them a good fit for our NPN Solution Advisor Program.”

    July 21, 2019  11:33 PM

    Protect Your Physical, Virtual, and Cloud With NAKIVO v9 @Nakivo

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data Recovery, Deduplication, Nakivo, VM backup

    With the release of NAKIVO v9, support for physical Windows Server backup is here. NAKIVO Backup & Replication v9 now provides complete protection for physical, virtual, and cloud environments. NAKIVO Inc. has proven its mettle in a very short span. It is one of the fastest-growing software companies, with the sole aim of providing enterprise solutions to protect virtual and cloud environments. With this announcement, NAKIVO creates a new landmark across physical, virtual, and cloud environments. The new version, v9, adds support for Microsoft Windows Server backup, empowering customers to safeguard physical, virtual, and cloud environments from a single point. That is an extremely useful feature for an enterprise from a system-upkeep point of view. The new release has several key features; for instance, it supports application-consistent backup.

    By application-consistent backup we mean that NAKIVO Backup & Replication v9 can take incremental, application-aware backups of physical Microsoft Windows Servers. The solution now provides application-consistent backups of business-critical applications, including databases, running on physical Windows Servers. This covers Microsoft Exchange, SQL Server, Active Directory, and SharePoint, as well as Oracle. Another key feature is global data deduplication. Backups of physical servers can now be stored in a regular backup repository alongside backups of VMs and AWS EC2 instances. All backups stored in a repository are automatically deduplicated irrespective of platform, so only unique data blocks are saved. This results in a tremendous saving of the storage space used by physical machine backups.
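    The idea behind block-level deduplication can be sketched in a few lines of content-hashing code (an illustrative toy, not NAKIVO's actual implementation; block size and naming are assumptions):

```python
import hashlib

def dedup_store(streams, block_size=4096):
    """Store backup streams from multiple platforms in one repository,
    keeping only one copy of each unique block (content-addressed)."""
    repository = {}   # block hash -> block bytes (unique blocks only)
    manifests = []    # per-stream list of block hashes, used for restore
    for data in streams:
        manifest = []
        for i in range(0, len(data), block_size):
            block = data[i:i + block_size]
            digest = hashlib.sha256(block).hexdigest()
            repository.setdefault(digest, block)  # skip blocks already stored
            manifest.append(digest)
        manifests.append(manifest)
    return repository, manifests

# Two "backups" sharing most of their data: the shared blocks are stored
# once, so the second backup adds almost nothing to the repository.
vm_backup = b"A" * 8192 + b"unique-to-vm"
physical_backup = b"A" * 8192 + b"unique-to-server"
repo, manifests = dedup_store([vm_backup, physical_backup])
print(len(repo), "unique blocks for", sum(len(m) for m in manifests), "referenced blocks")
```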

    NAKIVO Backup & Replication v9 also means instant granular recovery: you can instantly recover files, folders, or Microsoft application objects directly from previously created, deduplicated physical machine backups. In addition, customers can use the Universal Application Object Recovery feature when instant recovery of objects from other applications is required. The new version supports physical-to-virtual (P2V) recovery as well. Besides instantly recovering files, folders, and objects from physical server backups, enterprises can restore physical Windows Server backups to VMware and Hyper-V VMs. While the new version tackles many complexities, the pricing model is quite simple. It starts at $17 per machine per year, which is probably the most cost-effective per-machine subscription model. A single per-machine license covers a VMware VM, Hyper-V VM, Nutanix AHV VM, AWS EC2 instance, or physical machine.

    This gives customers a high level of flexibility and minimal dependence on particular vendors. Customers can now escape vendor lock-in and move their workloads between platforms without changing their data protection licensing. Bruce Talley, CEO, NAKIVO Inc. says, “NAKIVO Backup & Replication v9 enables our customers to not only protect their business-critical workloads across virtual and cloud environments but now also physical Windows Server systems. Now our customers who have physical or mixed environments can protect their critical business data from a single pane of glass.”

    Trial Download: www.nakivo.com/resources/download/trial-download/
    Success Stories: www.nakivo.com/customers/success-stories/
    Datasheet: www.nakivo.com/res/files/nakivo-backup-replication-datasheet.pdf

    July 14, 2019  12:10 AM

    1st Data Orchestration Platform with Multi-cloud Analytics AI @Alluxio

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Cloud analytics, data orchestration

    Alluxio is a well-known name among the world’s top internet companies; in fact, 7 out of the top 10 use open-source data orchestration technology developed by Alluxio. The launch of Alluxio 2.0 at the recent AWS Summit in New York brings a lot more power to it. Available in open-source and enterprise editions, Alluxio 2.0 simplifies and accelerates the adoption and deployment of multi-cloud, data-hungry workloads. As a result, Alluxio 2.0 brings breakthrough innovations for data engineers who are responsible for managing and deploying analytical and AI workloads in the cloud.

    The solution works equally well in hybrid and multi-cloud environments. Demand for compute workloads is growing tremendously across the globe, and cloud adoption has multiplied this requirement. Organizations are adopting a decoupled architecture for modern workloads, in which compute scales independently from storage. That brings in new data engineering problems.

    Data Orchestration

    The new paradigm definitely enables elastic scaling, but at the same time it creates new data engineering problems, which makes an abstraction layer an immediate need. Just as compute and containers are orchestrated with Kubernetes, data badly needs orchestration as data silos multiply. Data orchestration brings data locality and enables data accessibility and data elasticity for compute across silos, including different zones, regions, and even clouds. That is where the Alluxio 2.0 Community Edition and Enterprise Edition come into the picture. The two editions deliver new capabilities across the critical segments where today’s cloud data engineering market has gaps. Alluxio 2.0 is a true example of breakthrough data orchestration innovation for multi-cloud, ensuring policy-driven data management, improved administration, efficient cross-cloud data services, focused compute, and integration.
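    The core idea, one namespace over many silos with data cached near compute, can be sketched in a toy model (the `DataOrchestrator` class, silo names, and paths below are all hypothetical illustrations, not Alluxio's API):

```python
# A toy data-orchestration layer: a single namespace over several storage
# silos, with a local cache so repeated reads are served close to compute
# instead of from a remote region or cloud.

class DataOrchestrator:
    def __init__(self, silos):
        self.silos = silos   # silo name -> {path: bytes}
        self.cache = {}      # path -> bytes, co-located with compute

    def read(self, path):
        if path in self.cache:            # data locality: served locally
            return self.cache[path]
        for objects in self.silos.values():
            if path in objects:           # accessibility: any silo, one namespace
                self.cache[path] = objects[path]
                return objects[path]
        raise FileNotFoundError(path)

silos = {
    "aws-us-east": {"/sales/2019.csv": b"q1,q2"},
    "on-prem": {"/logs/app.log": b"ok"},
}
orch = DataOrchestrator(silos)
print(orch.read("/sales/2019.csv"))  # first read: fetched from a remote silo
print(orch.read("/sales/2019.csv"))  # second read: served from the local cache
```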

    Data Orchestration

    Haoyuan Li, Founder and CTO, Alluxio says, “With a data orchestration platform in place, a data analyst or scientist can work under the assumption that the data will be readily accessible regardless of where the data resides or the characteristics of the storage. They can focus on building data-driven analytical and AI applications to create values, without worrying about the environment and vendor lock-in. These new advancements to Alluxio’s data orchestration platform further cement our commitment to a cloud-native, open-source approach to enabling applications to be compute, storage and cloud agnostic.”

    Mike Leone, Analyst, ESG says, “Data is only as useful as the insights derived from it and with organizations trying to analyze as much data as possible to gain a competitive edge, it’s challenging to find useful data that’s spread across globally-distributed silos. This data is being requested by various compute frameworks, as well as different types of users hoping to gain actionable insight. These multiple layers of complexity are driving the need for a solution to improve the process of making the most valuable data accessible to compute at the speed of innovation. Alluxio has identified an important missing piece that makes data more local and easily accessible to data-powered compute frameworks regardless of where the data resides or the characteristics of the underlying storage systems and clouds.”

    Data Orchestration

    Steven Mih, CEO, Alluxio says, “Whether by design or by departmental necessity, companies are facing an explosion of data that is spread across hybrid and multi-cloud environments. To maintain a competitive advantage, speed and depth of insight have become the requirement. Data-driven analytics that was once run over many hours, now need to be done in seconds. AI/ML models need to be trained against larger-and-larger datasets. This all points to the necessity of a data tier which orchestrates the movement and policy-driven access of a companies’ data, wherever it may be stored. Alluxio abstracts the storage and enables a self-service culture within today’s data-driven company.”

    Both Alluxio 2.0 Community and Enterprise Edition are now generally available for download via tarball, docker, brew, etc.


    Alluxio 2.0 release page – https://www.alluxio.io/
    Download Alluxio 2.0 – https://www.alluxio.io/
    Founder blog – https://www.alluxio.io/blog/
    Product blog – https://www.alluxio.io/blog/2-

    July 13, 2019  11:12 PM

    Tachyum Inc’s 64-core processor cuts processor power by 10x @Tachyum

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Artificial intelligence, Data Center, processor

    Tachyum is a combination of two Greek words collectively symbolizing ‘an element of speed’, and the company fits the name well. It continuously strives to beat its own records by bringing a better product every time, each a world first. The top management of Tachyum seems firm on keeping the company creating new landmarks for others to follow, and the recent news is a good example: Tachyum is bringing in $25 million in Series A financing. Rado Danilak, CEO of Tachyum, has significant laurels to his credit. His last two companies were bought by Western Digital (WD) and LSI/SanDisk, and he holds more than 100 patents that are already in production. Rado has deep knowledge of the semiconductor market in general.


    Anybody dealing in chips and processors will be conversant with the Prodigy Universal Processor Chip from Tachyum. It is the smallest and fastest general-purpose 64-core processor developed to date. The funding mentioned above is being used for further enhancement of the Prodigy chip. It requires 10x less processor power than the nearest competitive products on the market from Intel, NVIDIA, and AMD. Another huge disruption it brings is a 3x cost reduction. These two factors are more than enough to shake up the existing markets and redefine the leadership chart in these segments. The development matches well with the AI revolution, which demands machines more powerful than the human brain; the ultimate goal is to deliver AI for Good and AI for All. Prodigy has enormous strength: it reduces data center annual total cost of ownership (TCO) by 4x.
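    The exascale claim quoted later in this piece implies some straightforward arithmetic about per-chip performance:

```python
# An exaflop supercomputer built from ~250,000 Prodigy processors implies
# each chip must sustain about 4 TFLOPS:
exaflop = 1e18              # floating-point operations per second
processors = 250_000
per_chip_tflops = exaflop / processors / 1e12
print(per_chip_tflops, "TFLOPS per processor")
```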

    Tachyum brings a new revolution in AI with Prodigy

    Prodigy from Tachyum is a sheer example of disruptive hardware architecture paired with a smart compiler. This new design makes many parts of the hardware in a typical processor redundant: the core becomes simpler and smaller, and the wires fewer and shorter. All this results in greater speed and power efficiency. The Prodigy Universal Processor Chip is an ultra-low-power processor capable of enabling an exaflop supercomputer using around 250,000 Prodigy processors.


    Adrian Vycital, Managing Partner at IPM Group, Tachyum’s lead investor based in London and Bratislava says, “The work that Tachyum is doing is highly disruptive and will lead to dramatic improvements in burgeoning markets of artificial intelligence and high-performance computing that require extreme processing speeds and power efficiencies. Supporting Tachyum at this stage of their development provides cascading opportunities for unprecedented success, helping them to establish themselves as the leader in what truly is the future of computing.”

    Dr. Radoslav Danilak, Co-founder and CEO of Tachyum says, “We are extremely pleased to announce another infusion of working capital into Tachyum, which not only enables us to complete our mission of delivering disruptive products to market but also represents well-reasoned confidence in our approach to overcoming challenges faced by the industry. The ability to change the world takes more than one man’s vision. Having an investment community backing Tachyum allows us to properly build a world-class organization with the best and brightest talent available. We look forward to growing the company and the industry atop the foundation that we’ve already built.”

    You can visit the official website here: http://www.tachyum.com

    July 10, 2019  9:58 PM

    How Safe Is Your Enterprise Backup Data from Malware Attack? @asigra

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Asigra, Backup and restore, Backup Recovery and Media Services, Cloud Backup, Enterprise Backup, malware

    How many CIOs and CTOs can claim with certainty that the enterprise backups they take regularly are not contaminated with any kind of malware? A recent report published by DCIG features cybersecurity approaches from legends in the field like Asigra, Rubrik, and Dell. Detecting and preventing malware in your enterprise backup environments is as critical as in production environments; unless you have sufficient knowledge and tools to detect it, it is impossible to respond and keep your enterprise data within safe limits. Asigra Inc. has been a pioneer in cloud backup, recovery, and restore solutions since 1986. It has just announced that the Data Center Infrastructure Group (DCIG) has come out with an important report titled “Creating a Secondary Perimeter to Detect Malware in Your Enterprise Backup Environment.” I think the report is important reading for all CIOs and CTOs.


    CTOs and CIOs must read this report to understand the threats and vulnerabilities they are living with in this regard. The report gives quite a number of useful insights. It presents a comparison of three approaches that any enterprise can use to detect and prevent malware attacks on backup data, and it analyzes which approach may be the most effective for enterprise backup environments. It is the purity of your backup sets that creates confidence in recovering lost data when the need arises, so understanding this is of foremost importance. Enterprises are now recognizing the threat that malware poses to backup data in today’s high-risk environments. A successful recovery depends entirely on having a confirmed, reproducible set of enterprise backups that is completely free from malware.

    Enterprise Backup Data

    The DCIG report discusses three methodologies for creating a golden copy of enterprise backup data. The first is the inline scan, in which all incoming and restored backup data is actively scanned for malware in real time. The second recommends a sandbox approach: no scan happens while creating a backup set, but a separate IT sandbox is set up to recover the data and test it thoroughly for malware. The third is snapshot analysis, in which snapshots of production data are captured and analyzed thoroughly; the result of the analysis determines which set is infected with malware. Of these three, the report finds the inline scan of backup and recovery data the most appropriate.
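    The inline-scan approach can be sketched as follows (an illustrative signature check only; the `KNOWN_MALWARE_HASHES` set and sample blocks are hypothetical, and real scanners match far richer patterns than whole-block hashes):

```python
import hashlib

# Known-malware signatures, keyed by SHA-256 of the malicious payload.
KNOWN_MALWARE_HASHES = {
    hashlib.sha256(b"EVIL-PAYLOAD").hexdigest(),
}

def inline_scan(backup_blocks):
    """Scan blocks as they flow into the repository; quarantine any match
    instead of writing it into the backup set."""
    clean, quarantined = [], []
    for block in backup_blocks:
        if hashlib.sha256(block).hexdigest() in KNOWN_MALWARE_HASHES:
            quarantined.append(block)
        else:
            clean.append(block)
    return clean, quarantined

blocks = [b"payroll.db", b"EVIL-PAYLOAD", b"photos.tar"]
clean, bad = inline_scan(blocks)
print(len(clean), "clean blocks,", len(bad), "quarantined")
```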


    As DCIG states, “Inline scans represent the easiest and fastest way for a company to scan its backup data for the presence of known strains of malware as well as position the company to scan recovered data for yet unknown malware signatures.” That is where the Asigra enterprise backup solution comes into the picture as a top contender. The report suggests Asigra Cloud Backup V14 as an optimal solution for inline scanning of malware.

    Jerome Wendt, Founder, and President, DCIG says, “The products that Asigra, Dell EMC, and Rubrik offer, and the respective techniques they use to detect the presence of malware in backup repositories, represent the primary methodologies that backup software employs. Of these three, only Asigra and Rubrik provide a company with the means to automate and simplify the process to detect malware in backups. Of those two, only Asigra currently makes cybersecurity software available as an optional feature that a company can turn on.”


    Eran Farajun, Executive Vice President, Asigra says, “Asigra Cloud Backup V14 converges enterprise data protection and cybersecurity, embedding malware engines in the backup and recovery streams to prevent ransomware from impacting the business. Asigra identifies any infecting malware strains, quarantines them, then notifies the customer. It is a very comprehensive data protection solution, built from the ground up for distributed IT environments.”

    You can download the free DCIG report here: http://library.asigra.com/dcig-report

    July 10, 2019  5:17 PM

    @SwiftStack Enables @dcBLOXinc To Deliver Multi-Region Cloud Storage

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Cloud storage, SwiftStack

    DC BLOX is a multi-tenant data center provider in the Southeastern U.S. The company designs and manages highly secure and reliable data centers for almost all client segments, viz. government and education, enterprise, healthcare, content providers, life sciences, and managed service providers. That is a huge spectrum to serve. You will find many of their state-of-the-art data centers in traditionally underserved markets. This way they are able to provide affordable business-class cloud storage and colocation services, along with a private high-performance network in the Southeast, with the aim of guaranteeing business continuity. The company also supports hybrid IT environments with minimal upfront capital investment and without compromising an iota of quality. Recently, DC BLOX selected SwiftStack to deliver a large-scale, multi-region cloud storage service for the Southeastern U.S. That is a huge volume to cater to with a seamless service.

    SwiftStack is a market leader in multi-cloud storage and its management. DC BLOX decided to deploy SwiftStack software to power its own multi-region, hyperscale cloud storage service for its large customer base’s business continuity (BC) and disaster recovery (DR) needs. As the business grows, customer expectations rise, and to meet them the service provider needs a foolproof, stable system in place. DC BLOX was facing tremendous demand for affordable secondary storage services from its existing customers while under pressure to expand into more regions. That created an immediate requirement for a new storage platform that could seamlessly perform, scale up, and serve multiple geographic regions, along with an intense requirement to support both traditional and cloud-native applications.

    Seamless Multi-Region Cloud Storage

    Among the many object and file storage options from leading vendors, DC BLOX found SwiftStack the most suitable after a comprehensive evaluation process. SwiftStack was thus given the nod to provide a turnkey platform for DC BLOX Cloud Storage. With SwiftStack’s help, DC BLOX has created a multi-region cluster that currently spans three locations, with a fourth coming shortly and 15 more planned within a stipulated timeframe.
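    Multi-region protection of this kind can be illustrated with a minimal write-through replication sketch (the region names and `put` helper below are hypothetical, not SwiftStack's or DC BLOX's implementation):

```python
# Write-through replication: every object written to the cluster is copied
# to all regions, so a regional outage leaves the data recoverable elsewhere.

REGIONS = ["region-a", "region-b", "region-c"]  # hypothetical sites

def put(stores, key, value):
    """Store the object in every region's store."""
    for region in REGIONS:
        stores.setdefault(region, {})[key] = value

stores = {}
put(stores, "backup-2019-07-10.tar", b"payload")
# Any single region can fail and the object survives in the others.
print(all(stores[r]["backup-2019-07-10.tar"] == b"payload" for r in REGIONS))
```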

    Chris Gatch, CTO at DC BLOX says, “SwiftStack helped us reduce the cost of storing and utilizing data, based on a comparison with other choices we considered, including the ability to manage more data with a smaller headcount. Along with savings at scale, we are able to offer innovative data services for a more compelling, more competitive solution.”

    Erik Pounds, Vice President of Marketing at SwiftStack says, “DC BLOX offers a cloud solution that addresses the needs of the business communities they serve, and also has unique differentiators to let them compete with global public cloud providers. Giving its customers both object and file access to data ensures cloud storage is compatible with their users’ modern and legacy applications, which is a fairly unique feature compared to what is available from big cloud vendors.”

    July 8, 2019  7:27 PM

    Content Guru Creates A New Landmark at CCW Las Vegas @cgchirp

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Customer engagement, customer experience, Las Vegas, Rakuten

    Customer Contact Week (CCW) in Las Vegas this week became a major collaboration moment for two experts in their respective fields: Content Guru, a global frontrunner in large-volume cloud-based contact center technology, and Rakuten Inc., the Japanese electronic commerce company. The two global leaders joined forces at CCW to demonstrate how effective intelligent automation is when it comes to delivering exemplary customer engagement and experience. Of course, excelling in customer engagement is a matter of concern for any organization across the globe. As a matter of fact, this year marks the 20th anniversary of CCW. Customer Contact Week, held from June 24 to June 28, 2019 at The Mirage Hotel in Las Vegas, is the world’s most significant conference and expo for Customer Experience (CX), contact center, and customer care professionals.


    Content Guru was at stand #1102 at the event, where its theme was to showcase major achievements for its customers in terms of customer experience. Rakuten is also known as the ‘Amazon of Japan’. The story of the relationship between Content Guru and Rakuten is quite interesting: their journey together began at CCW Las Vegas in 2017. Rakuten highlighted how it successfully used Content Guru’s cloud-based storm® platform to transform its customers’ experience. Since this deployment, Rakuten’s customers enjoy a far better experience when they contact the company or the thousands of sellers that use the Rakuten platform to sell their products. Content Guru also hosted a workshop on Tuesday, June 25, titled “Next Generation Omni-Channel Contact Center: AI, NLP, Web Chat & Chatbots”, led by Martin Taylor, Deputy CEO, Content Guru.

    Content Guru Creates A New Landmark in Customer Experience

    Through this workshop, Content Guru showcased how it uses intelligent automation and Artificial Intelligence via its state-of-the-art storm® platform to help customers deliver high-quality, ultimate customer service. Martin Taylor says, “Content Guru’s partnership with Rakuten originally began from conversations at CCW Las Vegas, so this event will always have a special place in our hearts. The quality of customer service has become a crucial differentiator between businesses. The CCW conference allows us to put front-and-center Content Guru’s vision to place organizations head-and-shoulders above their competition by providing the best Customer Engagement and Experience.”

    You might like to see how storm works. Have a look at the insightful video below:

    July 2, 2019  4:01 PM

    Why You Should Replace or Enhance Your Legacy VPN with a Software-Defined Perimeter (SDP) Solution

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    DH2i, SDP, VPN

    As has been the case for many decades, innovative new applications enter the marketplace regularly to support the ever-changing way we do business, interact with customers, and interact with each other. While developments in areas such as cloud, AI, machine learning, IoT, edge, mobile, and big data, to name just a few, bring undeniable and highly desirable benefits, they can also introduce problems. Certainly, one of the biggest pain points for many organizations is how to ensure the protection and security of data. Failing to do so can mean not only serious detriment to the long-term success of your business but also serious legal and regulatory compliance ramifications.

    Adding to the problem is that many IT professionals have come to rely upon and trust virtual private networks (VPNs) to deliver the level of security they require. And this makes sense: for a very long time, they did indeed deliver the required protection. Unfortunately, as they stand today, VPNs have not evolved to support modern application protection and security requirements. At least, not by themselves.

    I recently spoke with Don Boxley, CEO, and Co-Founder of DH2i on this subject. He describes VPNs as taking a “castle and moat” approach to security, where the VPN serves as the drawbridge. This painted a very understandable picture as to why VPNs are unable to meet today’s new business and IT realities. He explained that via this approach, organizations are more vulnerable to compromised devices and networks, excessive network access by non-privileged users, credential theft and other security issues. From a non-security specific standpoint, the VPN introduces complex manual set-up and maintenance, slow and unreliable connections, and an inability to scale efficiently and cost-effectively.

    We then talked about a relatively new approach that does not necessarily replace a VPN (although I would argue it could) but dramatically enhances it: a software-defined perimeter (SDP) solution. SDPs offer an ideal new approach to connectivity security, tackling legacy VPN, cloud-native, and privileged-user access security issues. Designed specifically to support today’s DevOps, IoT, container, edge, and other workloads, with the inherent flexibility to be tailored to future, yet-to-be-introduced application and workload requirements, SDP delivers not only considerably improved security but increased performance as well.
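    The contrast with the "castle and moat" model can be sketched in a few lines (a conceptual toy, not DH2i's implementation; the user and application names are hypothetical): a VPN authorizes at the network edge, after which many hosts are reachable, while an SDP authorizes each user for each application before any connection exists.

```python
# Per-user, per-application entitlements: everything not listed stays dark,
# with no reachable port for an attacker to scan.
AUTHORIZED = {
    ("alice", "postgres-prod"),   # Alice may reach only this one service
    ("bob", "grafana"),
}

def sdp_connect(user, application):
    """Grant a connection only if this (user, application) pair is entitled."""
    if (user, application) not in AUTHORIZED:
        raise PermissionError(f"{user} has no entitlement for {application}")
    return f"encrypted tunnel: {user} -> {application}"

print(sdp_connect("alice", "postgres-prod"))   # allowed
try:
    sdp_connect("alice", "grafana")            # denied: not entitled
except PermissionError as e:
    print("denied:", e)
```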

    DH2i has announced a new SDP solution called DxConnect. DxConnect is network security software designed to enable developers and network admins to build an integrated zero-trust (ZT) connectivity security infrastructure for cloud-native applications, hybrid/multi-cloud connectivity, and privileged user access without using a VPN. If you are interested in securing your organization’s data and wish to replace or enhance the capabilities of your VPN, you can learn more about DH2i’s new software here: http://dh2i.com/dxconnect/.
