Quality Assurance and Project Management


December 13, 2019  9:10 PM

Market Study on Call Centers – 2019 by Ozonetel @Ozonetel_KOOKOO

Jaideep Khanduja
Call Center

A study and analysis conducted by Ozonetel on call centers presents some very interesting facts. More than 250 million calls were studied to build an accurate picture of call centers in 2019. The fact of the matter is that call centers this year worked substantially harder with a single aim: customer satisfaction. Any organization would strive for that, for sure, but when a company puts it on top of everything else on its list of priorities, the whole game takes a new direction. Ozonetel is among India’s top providers of on-demand cloud communication and telephony, and its study unveils very interesting insights about the call center industry. The sample set for the analysis comprised over 250 million calls made on Ozonetel’s CloudAgent platform by more than 60,000 active agents in 2019.

The sample set includes both inbound and outbound calls made on Ozonetel’s platform, spanning business verticals such as real estate, education, ecommerce, pharma, travel, finance, banking, and food & restaurants. The interesting facts from the analysis are below:

Fact 1: Lowest Wait Times for Customers

For many years, call centers have adhered to a generic service-level formula: the 80/20 rule, which says that 80% of customer calls should be answered within 20 seconds. This has remained the benchmark for call center service levels across almost all industry verticals. Ozonetel’s report reflects a serious shift in this rule. According to the latest analysis, industries now tend to exceed the benchmark: the new average service level is 93%, meaning that call center agents answered 93% of their customer calls within 20 seconds. The credit for this significant enhancement goes to two factors: better call routing and the adoption of advanced call distribution tools.
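
To make the metric concrete, here is a minimal sketch of how a service level like the one above can be computed from raw answer times; the sample data and the 20-second threshold are illustrative assumptions, not Ozonetel’s actual pipeline.

```python
# A minimal service-level calculation, assuming `answer_times` holds
# the seconds each answered call waited before an agent picked up.

def service_level(answer_times, threshold=20):
    """Percentage of calls answered within `threshold` seconds."""
    within = sum(1 for t in answer_times if t <= threshold)
    return 100.0 * within / len(answer_times)

answer_times = [2, 5, 18, 40, 3, 12, 7, 25, 1, 9]  # hypothetical sample
print(f"Service level: {service_level(answer_times):.0f}% within 20s")
```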

Fact 2: Faster Response from Agents

The average speed of answer improved tremendously, from 6 seconds in 2018 to 3.5 seconds in 2019. That is a phenomenal achievement. This metric is the average time a call center agent takes to respond to an inbound call from a customer; interestingly, it excludes the time the customer spends in the IVR or waiting in a queue. A faster response means less waiting for the customer, which improves customer experience in a big way. Many call centers are using auto-answering features to speed up agents’ response times.

Fact 3: Enhancement in Agent Efficiency

Wrap time, or After Call Work, is the time a call center agent spends completing work from one call before becoming available for the next. It has a direct impact on queue wait time, efficiency, and productivity. On this front, After Call Work improved from 29 seconds in 2018 to 25 seconds in 2019. International call centers have done even better, reducing average wrap time to 15 seconds.

Fact 4: Agents Gaining Longer Breaks

On average, an agent logs in for 7.5 hours per day and takes breaks averaging 67 minutes. That means agents are working harder, with marginally longer breaks: of the 450 minutes logged in, roughly 383 minutes, or almost 85% of the workday, go to answering or wrapping calls.

Fact 5: Outbound Dialling Switches from Manual Dialling to Power and Predictive Dialling

Manual dialling is declining fast, with power and predictive dialers taking its place. On average, an agent in an outbound call center dials 90 calls and speaks to 42 contacts per day. The switch from manual to power and predictive dialling has significantly boosted outbound performance. Efforts are underway to improve answer rates further through better data quality and some innovative experimentation with call timings.
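
For readers unfamiliar with the distinction, the heart of a predictive dialer is its pacing logic: it dials ahead of the agents based on how often calls are answered. The sketch below is a toy version of such pacing; the function, the over-dial cap, and the 47% answer rate derived from the figures above (42 contacts from 90 dials) are illustrative assumptions, not any vendor’s actual algorithm.

```python
# A toy pacing rule for a predictive dialer: dial enough numbers that,
# given the historical answer rate, roughly one live contact arrives
# per idle agent, while capping how far ahead of the agents we dial.

def calls_to_place(idle_agents, answer_rate, overdial_cap=3.0):
    if idle_agents <= 0 or answer_rate <= 0:
        return 0
    # dials needed to yield about one live contact per idle agent
    expected_dials = idle_agents / answer_rate
    return min(round(expected_dials), round(idle_agents * overdial_cap))

# Example: 5 idle agents and a 47% answer rate (42 contacts / 90 dials)
print(calls_to_place(5, 42 / 90))  # -> dial about 11 numbers
```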

Chaitanya Chokkareddy, Chief Innovation Officer, Ozonetel says, “We have analysed various metrics to determine and understand trends in customer experience as well as agent efficiency. We believe this report has value as a benchmarking index for the industry: call center agents, managers, and businesses. As customer support becomes an increasingly omni-channel play, right tech integrations can help contact centers add more value to businesses. For instance, integrating WhatsApp into your contact centre platform can be a game-changer in your customer support efforts. Ozonetel is constantly developing holistic solutions to provide seamless experience for both the customer and the call center agent.”

December 12, 2019  10:43 PM

WebNMS Delivers End-to-End IoT Enabled Solutions @WebNMSIoT

Jaideep Khanduja
IIoT, Industrial IoT, iot

Almost two and a half years ago, WebNMS and Energy Monitoring Ltd. announced a partnership to deliver end-to-end IoT-enabled energy and industrial solutions in East Africa. That was a major breakthrough for WebNMS, a division of Zoho Corporation with a very successful track record. Its IoT solution offers a customizable, enterprise-scale platform for building edge-to-cloud solutions that connect and manage energy and assets across remote infrastructure. And it is not limited to this: EdgeX, the platform’s multi-vendor software agent, can seamlessly integrate multiple protocols while enabling real-time data collection, bringing local intelligence to edge devices. WebNMS has successfully completed plenty of similar large projects, which has established it among the top system integrators in the enterprise IoT spectrum. It is tapping the right potential to resolve major business challenges.

Product innovation is one of WebNMS’s key strongholds. Its open platform approach empowers third-party system integrators and development teams to build innovative applications. It also offers a number of ready-to-launch IoT applications for specific industry verticals such as renewable energy, remote sites, telecommunications, commercial infrastructure, and transportation. In July 2019, WebNMS partnered with HMS Networks, a Swedish supplier of industrial IoT and communication solutions. WebNMS integrates seamlessly with Ewon Flexy, remote access and industrial data management hardware designed by HMS Networks. The integration enables multi-protocol data acquisition; the data is then processed and analyzed for greater business insights. Use cases emerging from this include process monitoring, remote monitoring, optimization, quality traceability, predictive maintenance, and asset lifecycle management. WebNMS’s exhibit at GITEX Technology Week in Dubai, October 6-10, 2019, was another landmark.

WebNMS

Source: WebNMS.com

Karen Ravindranath, Director, WebNMS says, “With our extensive portfolio of IoT AEP platform and vertical solutions, we are focused on expanding our partner ecosystem which consists of SIs, device manufacturers, solution providers, Managed Engineering Service Providers, facilities and engineering service providers from the region to maximize the potential of IoT solution and services offered to enterprise and industrial customers from the region”.


December 11, 2019  8:34 PM

2020 Predictions On 7 Major Technology Developments @alluxio

Jaideep Khanduja
ai, cloud, Data Analytics, DevOps, Hadoop, HDFS, Hybrid cloud, Kubernetes, Machine learning, Spark, Storage, Technology

Here are the 2020 technology predictions of Haoyuan Li, founder and CTO of Alluxio, covering seven major developments he sees in cloud, AI, DevOps, data analytics, and storage.

Prediction 1: Rise of the hybrid cloud (really)

We’ve been hearing people talk about the hybrid cloud for the past three years now. And for the most part, that’s all it’s been – talk. 2020 is the year it gets real. We are seeing large enterprises refusing to add on-prem capacity to their Hadoop deployments and instead investing in the public cloud. But they are still not willing to move their core enterprise data to the cloud. Data will stay on-prem and compute will burst to the cloud, particularly for peak demands and unpredictable workloads. Technologies that provide optimal approaches to achieving this will drive the rise of the hybrid cloud.

Prediction 2: One Machine Learning framework to rule them all

Machine learning with models has reached a turning point, with companies of all sizes and at all stages moving toward operationalizing their model training efforts. While there are several popular frameworks for model training, a leading technology hasn’t yet emerged. Just as Apache Spark is considered the leader for data transformation jobs and Presto is emerging as the leading tech for interactive querying, 2020 will be the year we see a frontrunner dominate the broader model training space, with PyTorch and TensorFlow as the leading contenders.

Prediction 3: “Kubernetifying” the analytics stack

Technology Developments

Photo credit: Marc_Smith on Visualhunt.com / CC BY

While containers and Kubernetes work exceptionally well for stateless applications like web servers and self-contained databases, we haven’t seen much container usage when it comes to advanced analytics and AI. In 2020, we’ll see a shift, with AI and analytic workloads becoming more mainstream in Kubernetes land. “Kubernetifying” the analytics stack will mean solving for data sharing and elasticity by moving data from remote silos into K8s clusters for tighter data locality.

Prediction 4: Hadoop storage (HDFS) is dead. Hadoop compute (Spark) lives strong.

There is a lot of talk about Hadoop being dead… but the Hadoop ecosystem has rising stars. Compute frameworks like Spark and Presto extract more value from data and have been adopted into the broader compute ecosystem. Hadoop storage (HDFS) is dead because of its complexity and cost, and because compute fundamentally cannot scale elastically while it stays tied to HDFS. For real-time insights, users need immediate and elastic compute capacity, which is available in the cloud. Data in HDFS will move to the most optimal and cost-efficient system, be it cloud storage or on-prem object storage. HDFS will die, but Hadoop compute will live on and live strong.

Prediction 5: AI & analytics teams will merge into one as the new foundation of the data organization

Yesterday’s Hadoop platform teams are today’s AI/analytics teams. Over time, a multitude of ways to get insights from data has emerged. AI is the next step beyond structured data analytics: what used to be statistical modeling has converged with computer science to become AI and ML. So data, analytics, and AI teams need to collaborate to derive value from the same data they all use. This will be done by building the right data stack – storage silos and compute, deployed on-prem, in the cloud, or in both, will be the norm. In 2020 we’ll see more organizations building dedicated teams around this data stack.

Prediction 6: Talent gap will inhibit data technology adoption

Building the stacks that put data technology into practice is hard, and this will only become more obvious in 2020. As companies discuss the importance of data in their organizations, they’ll need to hire data, AI, and cloud engineers to architect it. But there aren’t enough engineers with expertise in these technologies to do that. The “super-power” skill is the ability to understand data, structured and unstructured, and pick the right approach to analyze it. Until the knowledge gap closes, we’ll continue to see a shortage of these engineers, and many companies will come up short on their promises of ‘data everywhere’.

Prediction 7: China is moving to the cloud on a scale much larger than the US and will leapfrog from on-prem to massive cloud deployments for advanced workloads

Over the past five years, while enterprises in the US have been moving in leaps and bounds to public clouds, enterprises in China have invested mostly in on-prem, data-driven platform infrastructure. 2020 will be the inflection point where this changes. China will leapfrog into the cloud at a scale much larger than the US, adopting the public cloud for new use cases, bursting to the cloud for peak loads, and, over time, moving existing workloads. Public cloud leaders in China will see dramatic growth that might outpace the growth of the current cloud giants.


December 11, 2019  10:55 AM

2020 Technology Predictions of SIOS Technology @SIOSTech

Jaideep Khanduja
Artificial intelligence, DBaaS, DevOps, enterprise applications, Machine learning, Resellers, SIOS, System integration

In this post, let me share the 2020 technology predictions of SIOS Technology’s Frank Jablonski, VP of Global Marketing, spanning cloud, AI/ML, DevOps, HA/DR, databases, and the channel:

Frank predicts that Machine learning and artificial intelligence will deliver cost savings through greater cloud efficiencies.

Enterprises are looking to application and cloud service providers to help them operate more efficiently through the use of machine learning (ML) and artificial intelligence (AI) for more effective resource management. Achieving this will require the environment or application to understand when it needs more resources and then automatically scale those resources up to meet the increased demand. Conversely, the technology will need to understand when specific resources are no longer needed and safely turn them off to minimize costs. Today such dynamic resource allocation can be unreliable or must employ an inefficient manual process, forcing cloud customers either to spend more than necessary or to fall short of meeting service levels during periods of peak demand.
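
As a rough illustration of the scale-up/scale-down decision described above, here is a minimal sketch of an autoscaling loop; the utilization thresholds, the one-step-at-a-time policy, and the sample load figures are assumptions for illustration, not any provider’s actual mechanism.

```python
# A toy autoscaler: add capacity when utilization runs hot, shed it
# when utilization falls, and hold steady inside the comfort band.

def desired_replicas(current, utilization, low=0.30, high=0.75):
    if utilization > high:
        return current + 1   # demand rising: scale up
    if utilization < low and current > 1:
        return current - 1   # resources idle: scale down to cut cost
    return current           # within band: no change

replicas = 4
for load in (0.82, 0.90, 0.60, 0.25, 0.20):  # hypothetical utilization samples
    replicas = desired_replicas(replicas, load)
    print(f"utilization={load:.0%} -> replicas={replicas}")
```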

On DevOps, Frank says it will transition companies to cloud-native implementations.

Enterprises will seek to take full advantage of the cloud’s agility by re-architecting their application/technology stacks to optimize them specifically for the cloud environment. IT departments regularly use a “lift and shift” approach to migrating applications to the cloud, but the effort still requires some changes to meet desired service levels, owing to differences between private and public infrastructures. After the initial wave of migration to the cloud is optimized, DevOps will drive the re-architecting of application/technology stacks to cloud-native implementations that take further advantage of the cloud’s greater efficiency, reliability, scalability, and affordability.

Frank opines that Application vendors will architect HA and DR into their core solutions.

Application vendors will endeavor to deliver greater value and higher reliability by integrating core high availability (HA) and disaster recovery (DR) features into their solutions. Most applications today require the customer to provide these protections separately, and most organizations do this for all their applications with a general-purpose HA/DR solution. With HA and/or DR built into an application as a standard feature, customers will be able to simply deploy it on any platform in a private, purely public or hybrid cloud environment. This will be especially beneficial for smaller organizations that normally lack the expertise or resources needed to implement and operate configurations capable of eliminating all single points of failure. For cloud-native implementations, the application vendor will want to take full advantage of the resiliency afforded by the cloud’s multiple availability zones and regions.

On Database-as-a-Service and the cloud, Frank predicts that DBaaS and cloud will become the preferred platform for database deployments.

IT organizations have traditionally chosen to implement critical databases and applications in their own datacenters, where the staff retains full control over the environment. As the platforms offered by cloud service providers (CSPs) have matured, the cloud has become commercially viable for hosting critical applications, as well as Database-as-a-Service (DBaaS). This is true even for complete suites, such as SAP, that span virtually all of an organization’s departments and business functions. This change will put greater focus on the reliability, availability, and performance of applications, and make the cloud more strategically important to companies. For CSPs that deliver greater resilience through availability zones and geographic diversity, it will be a way to secure long-term engagements with customers.

Finally, he predicts a major shift in the roles played by resellers and system integrators: they will play an increasingly vital role as critical applications move to the cloud.

As the migration of enterprise applications to the cloud accelerates and matures, the need to ensure mission-critical high availability (HA) will create opportunities for resellers and system integrators. This window of opportunity is forming as enterprises seek more robust HA solutions that have yet to be fully integrated into the application and system software. Some system integrators may have the expertise and resources needed to leverage open source software in their Linux offerings. But an increasing percentage will choose to integrate solutions purpose-built to provide HA and disaster recovery protections, as these have proven to be more dependable for the customer, while also being just as (if not more) profitable for the integrator.


December 4, 2019  10:50 PM

2020 Customer Support and Experience (CX) Predictions

Jaideep Khanduja
customer experience, Customer support, CX

UJET’s Anand Janefalkar, Founder and CEO, has some very interesting 2020 customer support and experience (CX) predictions. UJET is the provider of the world’s leading cloud contact center platform. Here are his predictions, in his own words.

According to Anand, Messaging Will Surpass Voice. He says,

“While voice will always remain an important channel for support, especially for urgent issues, in 2020, we will see messaging (SMS and chat) overtake voice as the most critical support channel. Woe to customer service organizations that cannot provide an omnichannel support experience that includes messaging, as this will most surely equal the success or demise of the overall customer experience (CX).”

According to Anand, Multichannel Will Expand to Multimedia. He says,

“In 2020, expect to see customer service organizations turn their attention to optimizing each support pathway to meet the tech-savvy needs of many of their customers. Chief among enhanced capabilities will be multimedia. The ability to share screenshots, photos and even video between the customer and support professional will become commonplace during support interactions.”

Data Will Break Down Silos Between Customer Support and Other Teams, Anand opines, saying,

“In 2020, the ‘digital transformation’ conversation that has become commonplace across IT, will extend into the customer service center. We will begin to see the impact and value of support data being shared across the enterprise. Customer feedback, sentiment, profile data and more will be securely shared across organizations helping teams such as marketing, sales and product development to make more strategic decisions. And as a result, the importance and value of customer support will be elevated as a whole.”

Anand feels that Agent Specialization Will Be a Key Focus. He says,

“In 2020, as the presence of technologies such as AI and Machine Learning within the contact center continue to grow, and more customers are directed towards bots and self-service options, support agents will become hyper-specialized. Agent specialization will not only be geared towards channels, but also centered around specific issues, situations and the urgency of incoming support interactions.”

Finally, he feels, AI Will Improve the Customer Support Employee Experience (EX), as well as the Customer Experience (CX). On this, he says,

“In 2020, AI will dramatically improve the employee experience (EX). The ability to automatically and instantly collect data from across multiple channels, analyze it and provide actionable insight will enable support agents to more quickly, easily and accurately address customer inquiries and come to highly satisfactory issue resolution.”


December 4, 2019  10:37 PM

Raspberry Pi, Software Defined Perimeters and cloud-based DR

Jaideep Khanduja
Cloud based DR, DH2i, Raspberry Pi, SDP, Software defined networks

Don Boxley, CEO and Co-Founder of DH2i, has shared some very interesting remarks about two key developments he foresees in 2020 involving Raspberry Pi, software defined perimeters, and cloud-based DR. Let’s look at them one by one.

According to Don, Enterprises will combine Raspberry Pi (RasPi) and software defined perimeters (SDP) to create secure low-cost IoT networks. He says,

“All over the world, people are using Raspberry Pis (RasPi) to learn about and build Internet of Things (IoT) devices. Raspberry Pi is a great platform for IoT – it’s a very cheap computer that runs Linux and provides a set of open GPIO (general purpose input/output) pins that allow you to control electronic components. Software defined perimeter (SDP) software improves the security of data flows between devices by removing an IoT device’s network presence, eliminating any potential attack surfaces created by using a traditional virtual private network (VPN). In 2020, enterprises will take advantage of the ubiquity of RasPi and the security of SDP software to enhance product differentiation with high value IoT networks.”
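
For readers new to the GPIO pins Don mentions, here is a minimal sketch of controlling one from Python with the widely used RPi.GPIO library; the pin number and the LED wiring are assumptions for illustration, and the script only runs on an actual Raspberry Pi.

```python
# Blink an LED wired to BCM pin 18 on a Raspberry Pi.

import time
import RPi.GPIO as GPIO

GPIO.setmode(GPIO.BCM)    # use Broadcom pin numbering
GPIO.setup(18, GPIO.OUT)  # configure pin 18 as an output

try:
    for _ in range(5):    # blink five times
        GPIO.output(18, GPIO.HIGH)
        time.sleep(0.5)
        GPIO.output(18, GPIO.LOW)
        time.sleep(0.5)
finally:
    GPIO.cleanup()        # always release the pins on exit
```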

Don emphasized that Smart endpoints and software defined perimeters (SDP) will transform cloud-based disaster recovery (DR).

He presents his views saying,

“Many organizations are pursuing a cloud-based Disaster Recovery (DR) strategy to achieve the business objectives of: 1. Getting replicas off-site and 2. Eliminating the cost and complexity of building and maintaining a DR site. But these DR strategies typically depend on a VPN to connect the on-premises source to the cloud-based target. That’s a problem, because traditional VPN software solutions are obsolete for the new IT reality of hybrid and multi-cloud. They weren’t designed for them. They’re complex to configure, and they expose “slices of the network,” creating a lateral network attack surface. In 2020, a new class of DR software with integrated SDP security will emerge to eliminate these issues and disrupt the cloud DR market. This new SDP-enhanced DR software will enable organizations to build smart endpoint DR environments that can seamlessly span on-premises and the cloud without the added costs and complexities of a VPN, and with virtually no attack surface.”


December 2, 2019  1:21 PM

How an IP proxy network enables brands to view the internet thru customers’ eyes

Jaideep Khanduja
Ecommerce, Proxy

If you are in the e-retail or e-commerce business at any scale, this article is for you. In today’s business culture and market dynamics, almost no e-commerce business can survive without using an IP proxy network. In simple terms, what can an IP proxy network do for you? It enables you, as a business, to see what your consumers see when they browse the internet. It allows you to view the internet openly and transparently – basically seeing a truthful web reality. Most global retail brands are probably already using one.

Why do you need a transparent view of the internet?

For the most straightforward of needs: to gather information that is openly available to your customers, and to learn and understand how customers and peers interact online with your offering, your competitors’ offerings, and your product. This is crucial business intelligence that cannot be ignored.

Every organization knows and understands the value of data. Today data is the most critical element to help you gain a competitive edge as well as continuous market traction and profits.

For example, for e-commerce brands the most important thing is to understand how competitors price products similar to yours and how they sell them in every geographic location: country, city, and so on.

IP Proxy

Source: Luminati.io

All brands collect data, but it is worth asking how many rely on timely, accurate decisions based on the right data. No brand can survive today without making this part of its everyday data gathering routine. Most of today’s data is available online. But it is a fact of life: if you are not collecting data accurately, leveraging the ability to see it from the customer’s point of view, your data may be meaningless.

Many brands feel they are ‘seeing what their customers are seeing’ but actually they are not. In fact, they can’t, because they are often using irrelevant or outdated data that will never give them a truthful picture. To achieve that with 100% accuracy, you need an IP proxy network.

Let’s go a little deeper. Most studies predict that e-commerce will grow at a pace of at least 20%, touching $3.5 trillion by the end of this year. As you know, the sector faces the roughest, toughest competition. To meet this fierce and growing competition, brands have no choice but to deliver more than their customers expect; without that, they won’t survive and thrive in this competitive ecosystem. And to exceed their customers’ expectations, they need online personalization and localization.

Without a special offer, a personalized deal, or a localized choice, it is difficult to win over a customer. This is why every business needs a continuous stream of data. For instance, a global supermarket chain with a presence in almost every country has to go local while maintaining its superior global qualities. It needs to check, for example, the pricing of a product on its main competitor’s portal when both businesses sell in the same town. It is important for the business to access its competitor’s website to gain insights on pricing, offers, and deals, and then reset its own pricing and catalogue of offerings accordingly.

To keep attracting customers to your website, this is a must. Yet if a business – this supermarket chain, for example – attempts to check this kind of data online without any help, it will most likely be blocked or served misleading content. That is what usually happens when a business tries to access its competitor’s website. The typical consumer, by contrast, regardless of location, can browse both websites without any hitch.

A customer can always browse different online retailers to compare prices and get the best deals. That is where a dire need arises for a brand to step into the shoes of its consumers or prospective consumers and see what they can openly see. Businesses that are not aware of the benefits of an IP proxy network make various investments with little to no result. Some brands, for example, buy IP addresses in bulk just for the sake of exploring competitors’ websites. In the past this served brands well, but today it is no longer productive: a competitor’s website is usually smart enough to identify data-center IP addresses and block them in one go. The same happens to a competitor when it views another brand’s website with the help of such a pool of IPs.

This recent development has prompted many global brands to start using an IP proxy network to collect this much-needed competitive data.

Let us understand it more clearly. As a quick reminder, when a brand uses an IP proxy network, it gains access to competitors’ websites through open pathways and can view each site just as its consumers would. An IP proxy network can also be tuned to view the internet from different locations, devices, or ISPs, depending, of course, on business requirements. With its help, a brand can route traffic through a small district in India, for example, and check its competitors’ prices in that particular district in an open and transparent manner.
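
Mechanically, such a check is an ordinary HTTP request routed through the proxy network’s gateway. The sketch below uses Python’s requests library; the gateway address, credentials, and target URL are placeholders, since each proxy vendor documents its own endpoint format.

```python
# Fetch a competitor's product page through a residential proxy so the
# response is what a local consumer would actually be served.

import requests

PROXY = "http://USERNAME:PASSWORD@gateway.example.com:22225"  # hypothetical gateway

response = requests.get(
    "https://competitor.example.com/product/123",  # hypothetical target page
    proxies={"http": PROXY, "https": PROXY},
    timeout=10,
)
print(response.status_code)
print(response.text[:200])  # first bytes of the page, as a shopper sees it
```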

It is important to note that to use an IP proxy network, a brand must choose a suitable vendor whose consumer IP addresses have been obtained through voluntary, global opt-in. These millions of consumers have willingly opted in in return for benefits such as ad-free applications, and they can opt out at any time.

What other types of businesses can benefit from an IP proxy network? The answer is any and all. When we talk about brands, we mean brands of many types – travel, hospitality, social media, food, advertising, and more – all of which can and do benefit from IP proxy networks.

An IP proxy network enables an open web environment. It serves multiple business needs beyond comparing prices: it can help verify advertising campaigns, protect consumers against fraud, test a website’s responsiveness globally, and much more. It may, in fact, be the ultimate technology for the best consumer experience. A recent study reveals that more than 90% of consumers switch to, and stick with, brands providing them with relevant offers, deals, and recommendations. Similarly, in the travel industry, more than 80% of consumers prefer to stick with a brand giving them a better personalized experience. In retail, it is a scenario of minute-to-minute best offers, deals, consumer packages, and products. When a business or brand is able to see what its consumers see, business decisions become faster, more precise, and more accurate.

This article is based on a recent study conducted by Luminati Networks, the leading IP proxy network operator. To date, Luminati provides over 10,000 customers with a transparent view of the internet.


November 29, 2019  8:40 PM

How to Measure Anything in Cybersecurity Risk @Amazon

Jaideep Khanduja
cybersecurity, risk

How to Measure Anything in Cybersecurity Risk by Douglas W. Hubbard, Richard Seiersen, et al.

Excerpt from Amazon.com

A ground-shaking exposé on the failure of popular cyber risk management methods

How to Measure Anything in Cybersecurity Risk exposes the shortcomings of current “risk management” practices, and offers a series of improvement techniques that help you fill the holes and ramp up security. In his bestselling book How to Measure Anything, author Douglas W. Hubbard opened the business world’s eyes to the critical need for better measurement. This book expands upon that premise and draws from The Failure of Risk Management to sound the alarm in the cybersecurity realm. Some of the field’s premier risk management approaches actually create more risk than they mitigate, and questionable methods have been duplicated across industries and embedded in the products accepted as gospel. This book sheds light on these blatant risks, and provides alternate techniques that can help improve your current situation. You’ll also learn which approaches are too risky to save, and are actually more damaging than a total lack of any security.

Cybersecurity Risk

Dangerous risk management methods abound; there is no industry more critically in need of solutions than cybersecurity. This book provides solutions where they exist, and advises when to change tracks entirely.

Discover the shortcomings of cybersecurity’s “best practices”
Learn which risk management approaches actually create risk
Improve your current practices with practical alterations
Learn which methods are beyond saving, and worse than doing nothing

Insightful and enlightening, this book will inspire a closer examination of your company’s own risk management practices in the context of cybersecurity. The end goal is airtight data protection, so finding cracks in the vault is a positive thing—as long as you get there before the bad guys do. How to Measure Anything in Cybersecurity Risk is your guide to more robust protection through better quantitative processes, approaches, and techniques.


November 29, 2019  8:36 PM

Cybersecurity – Attack and Defense Strategies @Amazon

Jaideep Khanduja
cybersecurity, defense

Cybersecurity – Attack and Defense Strategies: Infrastructure security with Red Team and Blue Team tactics by Yuri Diogenes and Erdal Ozkaya

Excerpt from Amazon.com

https://www.amazon.com/Cybersecurity-Defense-Strategies-Infrastructure-security/dp/1788475291/

Enhance your organization’s secure posture by improving your attack and defense strategies
Key Features

Gain a clear understanding of attack methods and patterns to recognize abnormal behavior within your organization with Blue Team tactics.
Learn unique techniques to gather exploitation intelligence, identify risk, and demonstrate impact with Red Team and Blue Team strategies.
A practical guide that will give you hands-on experience to mitigate risks and prevent attackers from infiltrating your system.

defense strategies

Book Description

The book starts with the security posture before moving to Red Team tactics, where you will learn the basic syntax for the Windows and Linux tools that are commonly used to perform the necessary operations. You will also gain hands-on experience using new Red Team techniques with powerful tools such as Python and PowerShell, which will enable you to discover vulnerabilities in your system and learn how to exploit them. Moving on, you will learn how a system is usually compromised by adversaries, how they hack a user’s identity, and the various tools the Red Team uses to find vulnerabilities in a system.

In the next section, you will learn about the defense strategies followed by the Blue Team to enhance the overall security of a system. You will also learn about an in-depth strategy to ensure that there are security controls in each network layer, and how you can carry out the recovery process of a compromised system. Finally, you will learn how to create a vulnerability management strategy and the different techniques for manual log analysis.

By the end of this book, you will be well-versed with Red Team and Blue Team techniques and will have learned the techniques used nowadays to attack and defend systems.
What you will learn

Learn the importance of having a solid foundation for your security posture
Understand the attack strategy using cyber security kill chain
Learn how to enhance your defense strategy by improving your security policies, hardening your network, implementing active sensors, and leveraging threat intelligence
Learn how to perform an incident investigation
Get an in-depth understanding of the recovery process
Understand continuous security monitoring and how to implement a vulnerability management strategy
Learn how to perform log analysis to identify suspicious activities

Who This Book Is For

This book is aimed at IT professionals who want to venture into the IT security domain. Pentesters, security consultants, and ethical hackers will also find this course useful. Prior knowledge of penetration testing would be beneficial.
Table of Contents

Secure Posture
Incident Response Process
Understanding the Cybersecurity Kill Chain
Reconnaissance
Compromising the system
Chasing User’s Identity
Lateral Movement
Privilege Escalation
Security Policy
Network segmentation
Active sensors
Threat Intelligence
Investigating an Incident
Recovery Process
Vulnerability management
Log Analysis


November 29, 2019  8:31 PM

The Secret to Cybersecurity – A must read book @Amazon

Jaideep Khanduja
Cybercrime, cybersecurity

The Secret to Cybersecurity: A Simple Plan to Protect Your Family and Business from Cybercrime by Scott Augenbaum

Excerpt from Amazon.com

Cybercrimes are a threat and as dangerous as an armed intruder—yet millions of Americans are complacent or simply uninformed of how to protect themselves. The Secret to Cybersecurity closes that knowledge gap by using real-life examples to educate readers.

Cybersecurity

It’s 2 a.m.—do you know who your child is online with?

According to author Scott Augenbaum, between 80 and 90 percent of students say they do whatever they want on their smartphones – and their parents don’t have a clue. Is that you? What about your online banking passwords: are they safe? Has your email account or bank/debit card ever been compromised? In 2018, there were data breaches at several major companies; if those companies have your credit or debit information, that affects you.

There are bad people in the world, and they are on the internet. They want to hurt you. They are based all over the world, so they’re hard at “work” even when you’re sleeping. They use automated programs to probe for weaknesses in your internet security programs. And they never stop.

Cybercrime is on the increase internationally, and it’s up to you to protect yourself. But how?

The Secret to Cybersecurity is the simple and straightforward plan to keep you, your family, and your business safe. Written by Scott Augenbaum, a 29-year veteran of the FBI who specialized in cybercrimes, it uses real-life examples to educate and inform readers, explaining who/why/how so you’ll have a specific takeaway to put into action for your family. Learn about the scams, methods, and ways that cyber criminals operate—and learn how to avoid being the next cyber victim.


November 24, 2019  6:34 PM

How Much Is Your Organization Losing to Online Ad Fraud? @luminati_io

Jaideep Khanduja
click fraud, Cyberfraud, cyberfraudsters

Marketing and advertising have complemented each other for ages. Advertising plays a vital role in achieving marketing targets, and accurate ad targeting aimed at the right audience is what makes it happen. Since the internet came into existence a couple of decades ago, the whole dynamic of marketing and advertising has changed significantly. Marketers’ reliance on advertising has increased exponentially, and buying ad placements in the physical and virtual worlds has become an everyday event. Marketers buy ad placements with the sole purpose of reaching their specific target audience. But often those ads reach the wrong audience or, worse, are hijacked for malicious purposes. As a result, advertising budgets are wasted on fraudsters despite all efforts to avoid that outcome. According to several recent studies, there is an alarming increase in digital advertising-related cybercrime. It has become a major point of concern for professionals worldwide.

The Pay Per Click (PPC) model has become the most popular reward system that ad networks offer globally. But there is a big catch: many of these clicks go to waste, costing organizations a hefty portion of their advertising budgets. In many cases the loss runs as high as 40%. This is a serious threat that companies cannot ignore, and so far there has been no single, effective, consolidated solution to mitigate the risk. A continuous war on ad fraud is being waged across the globe. The concept is not new; ad fraud has been around for the last three decades. But the war has now reached multiple levels, branching out across an advertising supply chain that has expanded from a linear chain to one involving a large number of intermediaries.

This, in fact, has given fraudsters ample scope to reach new heights in ad fraud. To grasp the magnitude, it is essential to understand how click fraud works. Click fraud refers to phony or disguised clicks that masquerade as clicks driven by genuine interest in the targeted ad link. These false clicks become a goldmine for shady publishers taking advantage of Pay-Per-Click agreements: they charge rock-bottom prices for clicks and conversions, spoiling the market for genuine publishers. This severely affects advertising costs and revenue potential. A recent Bot Baseline report released by White Ops revealed that fraud attempts amounted to 20-35% of all ad impressions throughout the year.
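
As a rough illustration of what click-fraud detection looks for, here is a minimal sketch of one common heuristic: flagging IPs whose click counts on an ad are anomalously high. The log format and threshold are assumptions for illustration, not an industry-standard detector.

```python
# Flag IPs that click a single ad far more often than organic behavior
# would explain -- the simplest signal in a click-fraud pipeline.

from collections import Counter

clicks = [  # (ip, ad_id) pairs from a hypothetical click log
    ("203.0.113.7", "ad-1"), ("203.0.113.7", "ad-1"), ("203.0.113.7", "ad-1"),
    ("203.0.113.7", "ad-1"), ("198.51.100.2", "ad-1"), ("192.0.2.9", "ad-1"),
]

MAX_CLICKS_PER_IP = 3  # assumed ceiling for organic clicking

counts = Counter(ip for ip, _ in clicks)
suspicious = [ip for ip, n in counts.items() if n > MAX_CLICKS_PER_IP]
print("Suspicious IPs:", suspicious)  # -> ['203.0.113.7']
```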

Ad Fraud. Photo credit: Visual Content on Visualhunt / CC BY

Despite its low-risk profile, ad fraud is an increasingly sophisticated, high-profit crime with recurring revenue. According to the World Federation of Advertisers, ad fraud is bound to become the second most alluring form of organized crime (next to drug trafficking) within the next 8-10 years. Losses to ad fraud in 2019 are estimated to be as high as $42 billion.

Luminati Networks has looked into fighting ad fraud and has discovered repeated patterns. While some suspicious traffic peaks can be explained by accidental clicks, traditional detection tools were never equipped with tracking capabilities that match the speed of fraudsters. To date, fraudsters have held the upper hand over the system.

Tracking your digital advertising journey as a genuine advertiser is your fundamental right. An advertiser wants to ensure that whatever site, country, or city it targets, its advertisement reaches its destination. That is why it is very important for an advertiser to step into its consumers’ shoes and test how its ads are viewed by a regular consumer, wherever that consumer is located geographically. Given the growing expertise and skill of hackers in today’s digital world, this is the only way an advertiser can ensure that its allocated budget is correctly spent on its brand and that its brand reputation is not harmed by hackers in any manner. The best option available is open-source data collection technology.

IP proxy networks, when applied by businesses, let them experience the web as their real consumers do. With the help of these networks, a business or advertiser can track its digital advertising campaign through every step of its journey, including customer behavior, testing links, placements, and much more.

It becomes important, then, to understand what an IP proxy network is. Organizations face the uphill task of viewing or browsing the internet openly without getting blocked or served irrelevant content. When an organization deploys an IP proxy network, it leverages real IPs from real consumers who have willingly opted in through an explicitly clear landing page. These consumers do so in order to gain access to an ad-free application, enhancing their user experience, and they have the option to opt out at any time.

These millions of real consumer IP addresses (each a unique set of numbers) are used as a gateway for businesses to browse the internet without getting blocked. This is the only way a business can view the internet as an individual consumer would. Most importantly, it can do so openly, transparently, and freely. Whether it is Jack in Ohio, USA or Catherine in Nice, France, a business can navigate that individual’s web journey and browse the internet the same way that consumer would, independent of device, browser, network, or ISP (internet service provider).

The question, then, is how a business can use an IP proxy network to fight ad fraud, which is the work of experts with a criminal bent of mind.

How can an IP Proxy Network assist in fighting Ad Fraud?

A fraudster knows quite easily when he is being noticed or watched, and can just as easily cover his tracks, hide, or change identity. When you use an IP proxy network, you are simply looking at ads as if you were a consumer, alarming no one; you get directed by an ad in the same manner a consumer would. The next question is one of scale. A business is not dealing with one or two consumers but with millions, residing in thousands of cities across hundreds of countries. So you might well ask: how does an IP proxy network tackle ad fraud at scale?

Digital advertising technology is a double-edged sword. On one hand, it permits automated distribution and targeting of ads on a huge scale. On the other, it presents the advertiser with the tough challenge of checking so many ads for veracity. These ads are entirely unique in what they serve, where they serve it, and when they serve it to each person, and the variables dictating all this change all the time. An IP proxy network is needed to test the various ads being served to a user at a particular instant and to determine which of them draw his or her attention.
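
At scale, that testing amounts to fetching the same ad through proxy exits in many locations and comparing what each “local consumer” is served. Here is a minimal sketch of the idea; the per-country gateways and the ad URL are placeholders, and a real verification pipeline would compare far more than the final landing URL.

```python
# Check one ad link from several countries and watch for unexpected
# redirects -- a common red flag for hijacked or misdirected ads.

import requests

AD_URL = "https://ads.example.com/campaign/42"  # hypothetical ad link
GATEWAYS = {                                    # hypothetical per-country proxies
    "us": "http://user:pass@gateway-us.example.com:22225",
    "in": "http://user:pass@gateway-in.example.com:22225",
    "fr": "http://user:pass@gateway-fr.example.com:22225",
}

for country, proxy in GATEWAYS.items():
    r = requests.get(AD_URL, proxies={"http": proxy, "https": proxy}, timeout=10)
    # r.url is the final URL after redirects; it should match the
    # legitimate landing page in every region.
    print(country, r.status_code, r.url)
```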

For instance, take the example of a social media site. With the help of an IP proxy network, the social media company can instantly bifurcate a user session and simultaneously check all the ads being served to that user at that particular moment.

Technically, ad fraudsters have become quite advanced, developing various methods to find an advertisement’s ‘soft spot’ and exploit its revenue chain in a very organized manner. Meanwhile, companies are pumping huge amounts into digital advertising.

An IP proxy network, as mentioned earlier, helps businesses leverage the IP addresses of real consumers who have opted in to the network in return for ad-free applications. Using the same network, a business can ensure its ad journey is protected, with no broken links or hackers involved, so the brand image remains intact. After all, it is a question not only of money but also of market reputation.

This article was based on recent research conducted by Luminati Networks, the world’s largest proxy operator, dedicated to enabling open-source data collection for businesses. Luminati provides global businesses and brands with a transparent view of the internet: no matter where they are based in the world, they can view the internet as a typical consumer would.


November 24, 2019  6:17 PM

Network Security Hacks: Tips & Tools for Protecting Your Privacy

Jaideep Khanduja
Hack, Network security

Network Security Hacks: Tips & Tools for Protecting Your Privacy by Andrew Lockhart

Excerpt from Amazon.com

In the fast-moving world of computers, things are always changing. Since the first edition of this strong-selling book appeared two years ago, network security techniques and tools have evolved rapidly to meet new and more sophisticated threats that pop up with alarming regularity. The second edition offers both new and thoroughly updated hacks for Linux, Windows, OpenBSD, and Mac OS X servers that not only enable readers to secure TCP/IP-based services, but helps them implement a good deal of clever host-based security techniques as well.

Network Security

This second edition of Network Security Hacks offers 125 concise and practical hacks, including more information for Windows administrators, hacks for wireless networking (such as setting up a captive portal and securing against rogue hotspots), and techniques to ensure privacy and anonymity, including ways to evade network traffic analysis, encrypt email and files, and protect against phishing attacks. System administrators looking for reliable answers will also find concise examples of applied encryption, intrusion detection, logging, trending and incident response.

In fact, this “roll up your sleeves and get busy” security book features updated tips, tricks & techniques across the board to ensure that it provides the most current information for all of the major server software packages. These hacks are quick, clever, and devilishly effective.


November 24, 2019  6:10 PM

Seven Deadliest Unified Communications Attacks @Amazon

Jaideep Khanduja
Unified Communication

Seven Deadliest Unified Communications Attacks (The Seven Deadliest Attacks) by Dan York

Excerpt from Amazon.com

Seven Deadliest Unified Communications Attacks provides comprehensive coverage of the seven most dangerous hacks and exploits specific to Unified Communications (UC) and lays out the anatomy of these attacks, including how to make your system more secure. You will discover the best ways to defend against these vicious hacks with step-by-step instruction and learn techniques to make your computer and network impenetrable.

unified communications

The book describes the intersection of the various communication technologies that make up UC, including Voice over IP (VoIP), instant messaging (IM), and other collaboration technologies. There are seven chapters that focus on the following: attacks against the UC ecosystem and UC endpoints; eavesdropping and modification attacks; control channel attacks; attacks on Session Initiation Protocol (SIP) trunks and public switched telephone network (PSTN) interconnection; attacks on identity; and attacks against distributed systems. Each chapter begins with an introduction to the threat along with some examples of the problem. This is followed by discussions of the anatomy, dangers, and future outlook of the threat as well as specific strategies on how to defend systems against the threat. The discussions of each threat are also organized around the themes of confidentiality, integrity, and availability.

This book will be of interest to information security professionals of all levels as well as recreational hackers.

Knowledge is power, find out about the most dominant attacks currently waging war on computers and networks globally
Discover the best ways to defend against these vicious attacks; step-by-step instruction shows you how
Institute countermeasures, don’t be caught defenseless again, and learn techniques to make your computer and network impenetrable


November 24, 2019  6:06 PM

Cybercrime: Key Issues and Debates @Amazon

Jaideep Khanduja
Cybercrime

Cybercrime: Key Issues and Debates by Alisdair A. Gillespie

Excerpt from Amazon.com

Now in its second edition, Cybercrime: Key Issues and Debates provides a valuable overview of this fast-paced and growing area of law. As technology develops and internet-enabled devices become ever more prevalent, new opportunities exist for that technology to be exploited by criminals. One result of this is that cybercrime is increasingly recognised as a distinct branch of criminal law.

Cybercrime

The book offers readers a thematic and critical overview of cybercrime, introducing the key principles and clearly showing the connections between topics as well as highlighting areas subject to debate. Written with an emphasis on the law in the UK but considering in detail the Council of Europe’s important Convention on Cybercrime, this text also covers the jurisdictional aspects of cybercrime in international law. Themes discussed include crimes against computers, property, offensive content, and offences against the person, and, new to this edition, cybercrime investigation.

Clear, concise and critical, this book is designed for students studying cybercrime for the first time, enabling them to get to grips with an area of rapid change.


November 24, 2019  6:02 PM

Cyberpsychology: Individuals, Society and Digital Technologies @Amazon

Jaideep Khanduja
Cyber, Cybercrime

Cyberpsychology: The Study of Individuals, Society and Digital Technologies (BPS Textbooks in Psychology) by Monica T. Whitty

Editorial Reviews from Amazon.com
From the Back Cover

An important new textbook for an exciting area of contemporary psychological study and research…

Cyberpsychology

The field of cyberpsychology examines the psychology of interactions between individuals, societies and digital technologies. This engaging and accessible textbook offers a complete introduction to the subject. The authors outline key theories, provide critical assessments, identify areas in need of further research, and discuss ways to use digital technologies as a research tool. They also include a wealth of real life examples, activities and discussion questions for students at undergraduate and graduate levels.

Cyberpsychology provides up-to-date coverage of a wide range of topics relating to online behaviour, and considers the potential impact of these interactions offline:

online identity
online dating and relationships
pornography
cyberbullying
children’s use of the Internet
online games and gambling
deception
online crime

About the Author

Monica T. Whitty is Professor of Human Factors in Cyber Security in WMG at the University of Warwick, UK. Her research focus is on cybersecurity, cybercrime and online behaviour. She is a co-author or co-editor of several books, and has published widely on cybersecurity, mass-marketing fraud, insider threat, cyberstalking, online identity, cyber-relationships, cyberethics, online surveillance and taboos in video games.

Garry Young is Senior Lecturer in Psychology at Nottingham Trent University, UK. His research and teaching focus on the ethics of enacting real-life taboos within virtual environments, the phenomenology of delusions, and embodied cognition. He has published widely on ethics in video games.


November 24, 2019  5:54 PM

Managing the Risk of Fraud and Misconduct @Amazon

Jaideep Khanduja
Fraud

Managing the Risk of Fraud and Misconduct: Meeting the Challenges of a Global, Regulated and Digital Environment by Richard H. Girgenti and Timothy P. Hedley

Excerpt from Amazon.com

Praise for Managing the Risk of Fraud and Misconduct: Meeting the Challenge of a Global, Regulated, and Digital Environment

“This book belongs on any desk where fraud and misconduct threaten. It is bristling with the kind of detail this field truly needs. Written by leading pros at the top of their game, its soup-to-nuts advice matches solutions to problems. Read it once to gain broad insight; come back again and again to manage particular risks.”
Thomas Donaldson, Professor of Legal Studies and Business Ethics, Wharton School of the University of Pennsylvania

“A valuable road map for corporate fraud fighters in an age when shareholders, regulators, and whistleblowers are making their work more vital than ever.”
Neil Weinberg, Executive Editor, Forbes Media LLC; coauthor, Stolen Without a Gun

“Auditors, managers, and directors may argue over who is responsible for minimizing fraud and misconduct. The fact is, we all share this responsibility. As a current and comprehensive guide to the subject, this book should be recommended reading for every public company director.”
Kenneth Daly, President and CEO, National Association of Corporate Directors

risk of fraud

“Written for managers from C-level on down — without avoiding technical jargon. This approach, combined with a consistent, efficient, easy-to-read writing style, leads to a thorough understanding of the subject without compromising its technical accuracy. I strongly recommend this book.”
– Randall LaSalle, Ph.D., CPA, CFE, John Jay College of Criminal Justice Department of Economics

” Managing the Risk of Fraud and Misconduct delves in great depth into the issues and provides sage advice. Based on my experience, the book should be required reading for every general counsel and most corporate counsel.”
– Albert Driver, Editor, The Metropolitan Corporate Counsel

“An extremely detailed book that serves as a comprehensive guideline for risk managers.”
– Business Finance

“The definitive authority on this important business issue – the fraud risk management bible” that stresses “the ever-growing importance of integrity in business.”
– Risk Management

“Both a history of recent developments in the field — and a comprehensive plan for developing a robust approach to deterring, detecting, and preventing fraud, and to assessing both a company’s vulnerabilities and the success of its anti-fraud measures.”
– Accounting Today

“This book addresses the challenges posed by changes in law, technology, and globalization in a comprehensive manner that can help the reader improve core competencies and initiate some interesting dialogue in the process.”
– Elizabeth Sullivan Armetta, CIA, CAMS, The Institute of Internal Auditors

A Comprehensive “C-Level” Guide to Preventing and Responding to Fraud and Misconduct

Maintaining and enhancing the integrity of an organization in a global, regulated, and digital environment has become an increasingly complex and difficult challenge for business leaders. Despite major legislative and regulatory reforms over the past decade, the headlines are replete with instances of corporate misdeeds. Indeed, nearly two-thirds of executives surveyed by KPMG reported that fraud and misconduct were significant risks in their industries, and a third of these executives expected fraud or misconduct to rise in their organizations within the year.

Managing the Risk of Fraud and Misconduct is an A-to-Z guidebook for business leaders who are looking for an integrated and comprehensive approach for cutting through the complexities in today’s environment. This solution-driven book provides insights from top experts who walk you through proven approaches to customize a strategy for preventing, detecting, and responding to fraud and corruption by building a culture of ethics and integrity.

Managing the Risk of Fraud and Misconduct will help business leaders to stay a step ahead of tomorrow’s demands by providing guidance on how to:

Assess your organization’s vulnerability to fraud and misconduct, and design and implement controls to prevent, detect, and respond to these occurrences
Address increased regulatory enforcement and enhanced scrutiny
Preserve and create value from corporate governance and compliance programs
Use technology and data analytics to mitigate fraud and misconduct risks
Evaluate the ongoing effectiveness of your compliance program


November 19, 2019  12:11 AM

StaffConnect Series 3 – Landmark In Employee Engagement SaaS Platform

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Employee engagement, Remote worker

The StaffConnect Series 3 launch begins a new era for its mobile employee engagement platform. The biggest gap between an organization’s remote employees and its corporate or office employees is communication, a gap that keeps remote employees from feeling like an integral part of their organization. StaffConnect is filling that gap in a very innovative manner. Organizations using StaffConnect can easily vouch for this: their remote, or in other words deskless, employees no longer feel any less a part of the organization than their desk-based counterparts. Despite being deskless, or working remotely without a desktop or laptop, these employees remain part of every activity of their organization, be it a survey or poll, training, feedback, or an important online meeting.

StaffConnect Series 3 comes with unmatched new and enhanced features. Enhanced analytics now measure engagement success, and a personalized content feature helps improve employee communications. StaffConnect is a global frontrunner in mobile employee engagement solutions for the deskless workforce, and its customers see Series 3 as a new landmark in engaging and communicating with their employees, especially deskless workers who have no access to a laptop, intranet, or email.

StaffConnect Series 3

Source: StaffConnect

Some of the key features of StaffConnect Series 3 include:

• A personalized, intuitive interface delivers a greater user experience, drawing on best practices from the top social tools.
• New communication channels let users organize and view all community and organization news based on their individual preferences.
• Enhanced analytics continuously measures content and communication performance in real time, with advanced features like sentiment analysis, continual assessment, and pulse surveys. User engagement data and preferences ultimately help determine the most appropriate content. Above all, integrated instant messaging and chat seamlessly connect individuals and teams.
• Ciara O’Keefe, vice president of customer experience at StaffConnect, says:

    “Series 3 is the most advanced version of our employee engagement platform designed from the ground up to provide an organization with the most personalized and intuitive way to engage, communicate and garner critical feedback from all employees – particularly remote, deskless employees. Our enhanced analytics give organizations important insight and measurement into the effectiveness of their engagement strategies allowing them to continually improve the employee experience, boost productivity and customer satisfaction.”

    StaffConnect Series 3 is available now. For further information, please visit: http://www.staffconnectapp.com

    For more about employee engagement and the deskless workforce, visit:
    “How Can Enterprises Overcome the Global Employee Engagement Crisis That Impacts 2.7 Billion Deskless Employees” – eBook
    “Overcoming the Employee Engagement Crisis for the Deskless Workforce” – infographic
    The Deskless Workforce video – https://www.youtube.com/watch?v=PInhpPdo5rc
    The Impact of Employee Disengagement video – https://www.youtube.com/watch?v=9MZdk-Zx3OY


    November 12, 2019  5:00 PM

How Does Cybersecurity Evolve the CISO/Security Vendor Relationship? @cynet360

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    CISO, cybersecurity

Any kind of evolution calls for realignment, re-examination, change from the existing, or discovery of the new. Why? Because change begets change. With the change in the severity of threats and vulnerabilities, cybersecurity itself needs to evolve. The same old methods and concepts hold no good in today’s environment of new threats and risks. The evolution of cybersecurity, in fact, automatically promises to remodel the CISO/security vendor relationship.

As corporate dependency on technology and information systems has grown tremendously, reaching the core of business processes, the security needs of IT infrastructure and business applications have taken a new shape. They demand a complete reassessment, fresh recognition, and a new perspective on the ongoing challenges of defending corporate infrastructure. Keeping all this in mind, Cynet is showcasing a new series of videos on the trials and tribulations of cybersecurity.

The IT department of an enterprise needs to understand the wide gap between what vendors promise and which corporate pain points remain unattended. The key focus of this video series is educating an organization’s technology experts on how vendor solutions fall short of actual deliverables in a crisis, despite carefully drawn service level agreements. A new approach is therefore very important, one that combines technology with a team of security experts to deliver a highly effective defensive strategy. The CISO, at the top of the ladder when it comes to cybersecurity in an organization, must be equipped with powerful tools and solutions to defend the organization against breaches, because ultimately he or she will be held responsible for failed efforts. There is always something more to do beyond what is already being done.

Despite deploying the best solutions, these top IT security experts cannot afford to sit idle after deployment. They must continuously strive to enhance their IT security defenses through actions like educating the workforce, upskilling security teams, and selecting protective products that insulate the enterprise against the most notorious attacks. It is genuinely difficult for CISOs and their teams to vet IT security solutions against the threats and vulnerabilities they tackle daily, since every solution has its own sphere of coverage across environments, applications, and risks. It is thus important to get them out of that vicious circle. To that end, Cynet has launched a new video series (https://www.cynet.com/ciso-vs-security-vendor/?utm_source=thn) addressing the concerns of CISOs when managing cybersecurity vendors.

Many CISOs find themselves helpless against the high-pressure sales tactics of cybersecurity vendors, who arrive with over-hyped claims of protection, extremely complex operations, and a lack of automation. Each of these areas is an actual pain point for IT professionals working in the security industry. Through these videos, Cynet asks CISOs to re-evaluate their approach to cyber defense.

    Dori Harpaz, VP, Marketing for Cynet says, “Our newest generation technology challenges the common misperception that cybersecurity solutions are ineffective or too complicated to leverage the benefits. Cynet’s radical approach simplifies and converges cyber defense so organizations can quickly and easily handle cyber-attacks and remain focused on what they do best – their business.”

    Cybersecurity

    Source: A clip from one of the Cynet videos on cybersecurity

Enterprises need a more holistic cybersecurity solution than what is currently available in the market. Cynet has come out with an Automated Breach Protection Platform to safeguard the complete IT environment of an organization through a simple yet powerful integrated control dashboard, backed by a top Security Operations Center (SOC) team. The company has built a unique combination of the most advanced technology and security support infrastructure: Cynet 360 software, empowered with Sensor Fusion technology, collects all activity signals and analyzes them together in real time to conclude the true context of each activity, based on which it enforces precise breach-prevention actions.

Cynet 360’s Sensor Fusion technology starts with a sensor array that monitors file, process, memory, network traffic, and user account activity, looking for exceptional behavior or patterns, first sensor by sensor and then by fusing the results. Analyzing the fused findings lets the platform deliver automated, precise actions that provide more complete protection than comparable solutions or platforms. As an outcome, enterprises get precise monitoring and control, attack prevention and detection, and immediate response orchestration for the highest order of breach prevention.
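
To make the fusion idea concrete, here is a minimal Python sketch of how per-sensor anomaly scores could be combined into a single decision. The sensors match those named above, but the weights, thresholds, and scores are hypothetical illustrations of the general technique, not Cynet’s actual implementation.

# Illustrative sketch of multi-sensor fusion for breach detection.
# Weights and thresholds are hypothetical, not Cynet's.
SENSOR_WEIGHTS = {"file": 0.2, "process": 0.3, "memory": 0.15,
                  "network": 0.25, "user_account": 0.1}
SINGLE_SENSOR_ALARM = 0.9   # a lone sensor must be very sure
FUSED_ALARM = 0.6           # corroborating sensors can alarm earlier

def fused_score(scores):
    """Weighted average of per-sensor anomaly scores in [0, 1]."""
    return sum(w * scores.get(s, 0.0) for s, w in SENSOR_WEIGHTS.items())

def decide(scores):
    if any(v >= SINGLE_SENSOR_ALARM for v in scores.values()):
        return "block"                      # one sensor alone is conclusive
    if fused_score(scores) >= FUSED_ALARM:
        return "block"                      # weaker signals corroborate
    return "allow"

# No sensor reaches 0.9 by itself, but together they cross 0.6.
activity = {"file": 0.5, "process": 0.7, "memory": 0.6,
            "network": 0.8, "user_account": 0.4}
print(decide(activity), round(fused_score(activity), 2))  # block 0.64

The point of fusing is visible in the example: no single sensor is conclusive on its own, yet the combined evidence justifies blocking the activity.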

The Cynet 360 technology platform is supported by the CyOps 24/7 Cyber SWAT Team at no additional cost. This gives customers the crucial cybersecurity staff needed to keep up with a fast-moving threat landscape. The CyOps team includes security analysts, researchers, and experts capable of providing incident response, forensics, threat hunting, and malware analysis.


    November 10, 2019  9:49 PM

Why an Internal/In-House Proxy Operation Is No Longer Viable

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Proxy

There is a misconception among enterprises that operate their own server network in the cloud. They believe that having access to global IP distribution helps them a lot. It doesn’t. I am writing this so that the CEOs, CTOs, and CIOs of such enterprises understand why. Allow me to explain and clarify why you should select an external IP proxy network operator rather than operate your own proxy hub.

Why do many enterprises follow the rising trend of outsourcing? Here are a few basic reasons. First of all, it saves costs and creates an environment for faster business growth. It also helps improve overall results for business finances and business reputation. On the technology front, it creates an entirely new paradigm within an enterprise.

As technology changes at a very fast rate, competition among peer businesses is becoming tougher by the day. In days gone by, it was not as tough for a business to survive, sustain, and grow. Back then, when organizations needed access points or static IPs to openly access the web from various locations across the globe, the easiest solution was to lease cloud servers from major cloud providers like AWS and Azure and lease a pool of IPs from their respective Internet service providers (ISPs). Similarly, when mobile IPs were needed, building a lab of devices and SIM cards was the solution. In a nutshell, it was once common to build your own in-house proxy network.

    in-house proxy

Source: luminati.io

Recent research from Luminati Networks (the largest IP proxy network operator) indicates that this was largely a waste of time, money, and energy that achieved few useful results.

Today, organizations realize that managing such an infrastructure is a costly affair in terms of both capital expenditure and operating costs. Beyond the money, the infrastructure consumes a great deal of their staff’s time. Outsourcing is easier and saves significantly on both fronts.

Overall, there are four key reasons to select an external IP proxy network operator over an in-house one:

#1 Cost: I remember when, in my last organization, we needed a large pool of static IPs and our ISP was unable to meet the requirement because of a shortage of IPv4 addresses. At the same time, the static IPs we did have in our pool carried a humongous recurring cost per annum. Today’s scenario is no different. IPv4 addresses are very costly, as much as $20 per IP. It is hardly practical for a small business that wants to create an in-house proxy network to shell out $15k on a first-time IP purchase and then keep spending at a recurring rate of $10k per month merely to cover the operating costs of the engineers and servers needed. The only feasible solution here is to outsource at a much lower cost.

Leasing a smaller number of IPs carries an even higher price tag per IP, reaching as much as four times the rate of a larger pool.

Another major hiccup is that leases typically run on an annual basis. What if you have a shorter-term requirement, or a dynamic requirement where the number of IPs you need keeps changing? What would you do then?
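
A back-of-envelope model in Python puts these figures side by side. The $20-per-IP and $10k-per-month numbers come from this article; the outsourced monthly fee is a hypothetical placeholder for whatever a provider would actually quote.

# Back-of-envelope cost model for an in-house proxy network,
# using the illustrative figures from this article.
IP_PRICE_USD = 20.0          # one-time purchase price per IPv4 address
MONTHLY_OPEX_USD = 10_000.0  # engineers + servers to run the network

def in_house_cost(num_ips: int, months: int) -> float:
    """Total cost of buying IPs once and operating them for `months`."""
    return num_ips * IP_PRICE_USD + months * MONTHLY_OPEX_USD

def outsourced_cost(monthly_fee: float, months: int) -> float:
    """Total cost of an external proxy provider at a flat monthly fee."""
    return monthly_fee * months

ips, months = 750, 12                          # ~$15k initial purchase, one year
print(f"In-house:   ${in_house_cost(ips, months):,.0f}")
# Hypothetical outsourced fee, for comparison only; real quotes vary.
print(f"Outsourced: ${outsourced_cost(3_000, months):,.0f}")

Even with a generous assumed provider fee, the recurring operating cost dominates the in-house option within the first year.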

#2 Resources: Buying is only the first hurdle. Even if you cross it by convincing your head of finance and getting approvals from senior management, the recurring cost of managing an in-house global proxy network is a killer in terms of time and resources. You will need to source servers from data centers in all required locations. Expertise will be required to set up those servers, update configurations, and install all the software types your enterprise needs for its operations. Routing through a third-party upstream provider to manage IPs and servers is another large-scale task. The story doesn’t end there: each IP’s geolocation must be continuously updated or fixed in the various databases that your target domains reference.

All this requires daily monitoring of your in-house proxy network. Above all, you need to refresh IPs regularly by replacing IP subnets and setting them up again with the appropriate routing, geolocation, database entries, and so on.

#3 Diversity: Most commonly, the smallest practical subnet is a ‘/24’ of 254 usable IPs. This means any small or medium in-house proxy network will be struggling with one of the following situations (see the sketch after this list):

    a. An unnecessary but obvious increase in the overall proxy expense.

b. A larger proxy network than was initially planned, purely to obtain diversity.

c. You may have a right-sized network but little or no diversity. Low diversity creates a larger risk of getting blocked: when one of your subnets is blocked, a big chunk of your proxy network becomes unusable, whereas in a highly diversified network the same block would affect only a small portion of the pool.
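
Here is a small Python sketch of point (c): with /24 subnets of 254 usable IPs each, the share of your pool that a single subnet block wipes out depends entirely on how many distinct subnets you have.

# Illustrative sketch: why subnet diversity matters.
# A pool built from few subnets loses a big fraction of its
# capacity when a single subnet gets blocked by a target site.

def fraction_lost_per_block(total_ips: int, subnet_size: int = 254) -> float:
    """Fraction of the pool lost when one whole subnet is blocked."""
    subnets = max(1, total_ips // subnet_size)
    return 1.0 / subnets

for pool in (254, 1016, 25400):
    pct = fraction_lost_per_block(pool) * 100
    print(f"{pool:>6} IPs in /24 subnets -> one block removes {pct:.1f}% of the pool")

A 254-IP pool loses everything to one block; a 25,400-IP pool spread over a hundred subnets loses only 1%. Large external operators get that diversity for free.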

#4 Flexibility: You have to keep up with rapidly changing market dynamics to keep your business fit for survival. Yet with an in-house network, requirements such as changing infrastructure, replacing blocked IPs, or changing and testing IPs can never be met in a short period; they take weeks or months. No change is simple, and every change demands expert resources and a large amount of time.

Looking at all the factors above, it is clear that outsourcing your web-access points rather than running an internal proxy network makes a lot of sense for business continuity and risk mitigation, as well as operationally and commercially.

This article is based on recent research conducted by Luminati Networks, the world’s largest proxy operator, dedicated to enabling open-source data collection for businesses. Luminati provides global businesses, companies, and brands with a transparent view of the internet, no matter where in the world they are based.


    October 31, 2019  10:17 PM

    Data Science for Business: A Must Read Book @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data Science

    Data Science for Business: What You Need to Know about Data Mining and Data-Analytic Thinking

    Excerpt from Amazon.com

    Written by renowned data science experts Foster Provost and Tom Fawcett, Data Science for Business introduces the fundamental principles of data science and walks you through the “data-analytic thinking” necessary for extracting useful knowledge and business value from the data you collect. This guide also helps you understand the many data-mining techniques in use today.

    Data Science

    Source: Amazon.com

    Based on an MBA course Provost has taught at New York University over the past ten years, Data Science for Business provides examples of real-world business problems to illustrate these principles. You’ll not only learn how to improve communication between business stakeholders and data scientists but also how to participate intelligently in your company’s data science projects. You’ll also discover how to think data-analytically, and fully appreciate how data science methods can support business decision-making.

    Understand how data science fits in your organization—and how you can use it for competitive advantage
    Treat data as a business asset that requires careful investment if you’re to gain real value
    Approach business problems data-analytically, using the data-mining process to gather good data in the most appropriate way
    Learn general concepts for actually extracting knowledge from data
    Apply data science principles when interviewing data science job candidates

    Editorial Reviews
    Review
    “A must-read resource for anyone who is serious about embracing the opportunity of big data.”
    — Craig Vaughan
    Global Vice President at SAP


    “This book goes beyond data analytics 101. It’s the essential guide for those of us (all of us?) whose businesses are built on the ubiquity of data opportunities and the new mandate for data-driven decision-making.”
    –Tom Phillips
    CEO of Media6Degrees and Former Head of Google Search and Analytics

    “Data is the foundation of new waves of productivity growth, innovation, and richer customer insight. Only recently viewed broadly as a source of competitive advantage, dealing well with data is rapidly becoming table stakes to stay in the game. The authors’ deep applied experience makes this a must-read–a window into your competitor’s strategy.”
    — Alan Murray
    Serial Entrepreneur; Partner at Coriolis Ventures


    October 30, 2019  1:13 PM

    PASS Summit 2019 In Seattle, Washington November 5-8 @DH2i

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Clustering, Clustering/High availability, DH2i, High Availability, SDP, Software defined networks, Tunneling

If, by any chance, you are attending PASS Summit 2019, taking place November 5-8 in Seattle, Washington, don’t forget to visit DH2i at Booth 118, where DH2i officials will showcase how you can ensure always-secure and always-on IT and business infrastructure across your entire enterprise. There are live demos, exhibitions, conferences, and some swag to take away. Live demos by DH2i include DxOdyssey, DxEnterprise, Secure Network Micro-Tunneling, Multi-Platform Smart Availability, and DxAG, its availability group clustering software. There is plenty to learn at PASS Summit 2019. In fact, this is a golden chance to discuss your enterprise technology architecture and get key insights into how to optimize and improve its current design. DH2i is one of the top providers of multi-platform Software-Defined Perimeter (SDP) and Smart Availability solutions, available on Windows as well as Linux.

    PASS Summit 2019

    Source: DH2i

At PASS Summit 2019 you can get deep insights into DH2i’s software products DxOdyssey and DxEnterprise, which empower customers to run their entire IT infrastructure on a simple mission: ‘Always Secure and Always On’. DH2i is an exhibitor and sponsor at PASS Summit 2019. The summit, as you might already know, is the world’s largest and most comprehensive technical training conference for Microsoft SQL Server and data professionals. Live demonstrations will run throughout the event, covering DH2i’s industry-leading data security, high availability (HA), and disaster recovery (DR) software solutions. These include DxOdyssey, which provides secure network micro-tunneling. You can learn how to create lightweight, scalable, highly available, and discreet “secure-by-app” connectivity between your on-premises and/or cloud environments running on Windows and Linux. Interestingly, you can build these connections without a VPN or direct link.


This software builds an entirely safe technology environment in your organization without any compromise on quality, security, or performance. If you have any queries, you can easily get answers at DH2i Booth 118 at PASS Summit 2019. DxConnect secures remote access to DxOdyssey tunnels: you can deploy a Software-Defined Perimeter (SDP) that secures network connectivity between your main sites of operation and remote users working from anywhere in the world. DxEnterprise provides multi-platform Smart Availability: you can manage multiple workloads at the instance level and as Docker containers, with instance mobility from any host to any host, anywhere, through a simple application or container stop and restart. DxAG, the availability group clustering software, builds highly available SQL Server Availability Groups across Windows and Linux nodes without WSFC or Pacemaker limitations.

    You can claim $100 off your PASS Summit 2019 registration here: https://dh2i.com/webinars/


    October 30, 2019  10:28 AM

    SwiftStack 7 A New Paradigm In Data Storage and Management @SwiftStack

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data Management, Data storage, Petabyte, SwiftStack

Any enterprise or organization working towards becoming more data-driven should take a close look at SwiftStack 7. Founded just eight years ago by some of the frontrunners in cloud computing, SwiftStack is by now an acknowledged leader in data storage and management solutions. It has already helped many enterprises and organizations achieve unmatched results in these areas, and companies across industries credit SwiftStack with the best solution for managing petabytes of unstructured data from edge to core to cloud. Whether the project involves deep learning, analytics, or scientific research, or leverages large asset repositories, SwiftStack is a compelling solution. SwiftStack 7 comes with unique features that set it apart from the competition, providing ultra-scale performance and capacity for data storage and management from edge to core to cloud.

The SwiftStack 7 platform is built for intelligent data, delivering petabytes to any AI framework, GPU compute complex, or deep learning system. It is a remarkable enhancement of the data platform for data utilization, performance, and services at ultra-scale. The new platform ingests and processes 4K/8K video effortlessly, along with other compute-intensive workloads. Most enterprises and businesses across the globe are unable to tackle emerging data-intensive applications, which need a modern storage architecture and the ability to feed thousands of GPUs working in parallel, all from a single global namespace spanning edge, core, and cloud data clusters. SwiftStack 7 is built to tackle these emerging data-intensive applications flawlessly. Its new features include ultra-scale performance, data immutability, workflow integration, and distributed flash-based caching, all with standards-based APIs.


    George Crump, Chief Steward at Storage Switzerland says, “Data scientists are constrained by inherited infrastructure, particularly in performance, scalability, cost, metadata enrichment, workflow integration, and portability to accommodate data at the edge, core, and cloud. Large enterprise customers need to architect a data pipeline and framework to deliver business outcomes and intelligence, and SwiftStack’s software is a strategic component supporting these modern applications.”

    SwiftStack 7

    Source: SwiftStack

    Joe Arnold, SwiftStack founder, and chief product officer says, “Right now, data is changing the world, applications can exist anywhere from the edge to the core data center to the cloud, and data management and control have been decoupled from core infrastructure. Our customers are pushing the boundaries in demanding environments, such as deep learning, and SwiftStack 7 is the foundation for delivering performance, capacity, and services at scale.”

This video will help you gain more insights:

    This would also make an interesting read: Anatomy of SwiftStack 1space


    October 28, 2019  7:13 PM

    IP Proxy Network To Strengthen Your Business @luminati_io

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Internet protocol, Open source, Proxy

In July 2019, Frost & Sullivan published a report, ‘Global IP Proxy Networks Market’. The crux of the report: IP proxy networks are going to be a major determinant in the existence and growth of any online business. The global IP proxy network market, currently standing at $76 million, is set to grow at a CAGR of almost 17% to touch $260 million by 2025. The use of IP proxy networks is becoming essential for most businesses. Online retail companies, for instance, need to study their competitors’ pricing for the commodities they launch or sell, so they can analyze, compare prices, and then position their items at a better price to gain an edge. This can happen only if their programs can capture this customized information the way a real user in that market would see it.

    IP Proxy Network

    Source: Luminati.io

Basically, with the help of an IP proxy network, the system builds an instance that behaves like a user in a particular geographic location. Companies not using IP proxy networks run a very high risk of collecting inaccurate data that will keep them lagging behind, gaining them no business or customers. Luminati is the world leader in this technology. The company launched in 2014 and, in a short span, has become the world’s largest proxy network operator, with the sole aim of providing open-source data collection. The technology it developed can route internet traffic via different touchpoints on a global network, enabling brands to position their products as the best value option.
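
As a minimal sketch of that simulation, the Python snippet below fetches a competitor’s product page through a geo-located proxy gateway using the well-known requests library. The gateway URL, credentials, and target URL are hypothetical placeholders, not Luminati’s actual endpoints or API.

import requests

# Hypothetical geo-targeted proxy gateway; substitute your provider's
# actual gateway address, credentials, and targeting syntax.
PROXY = "http://user:password@proxy.example.com:22225"

def fetch_competitor_price_page(url: str) -> str:
    """Fetch a page as if browsing from the proxy's exit location."""
    resp = requests.get(
        url,
        proxies={"http": PROXY, "https": PROXY},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.text  # hand off to your price-extraction parser

html = fetch_competitor_price_page("https://shop.example.com/item/123")
print(len(html), "bytes fetched via proxy")

Because the request exits from an IP in the target market, the returned page shows the prices and offers a local shopper would actually see.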


Luminati’s first-of-its-kind enterprise IP proxy network empowers customers to collect the most realistic and accurate competitive intelligence, helping them manage their businesses effectively. The technology is helping a large number of businesses, companies, enterprises, brands, and product aggregators gain an open and transparent view of the internet, irrespective of geographic location, device, or service provider. No other technology achieves such real-time, accurate results. With a staff of 120, Luminati manages more than 10,000 customers across the globe. Enterprise proxy networks are increasingly important for creating network transparency. Use cases include data collection, price comparison, fraud protection, brand protection, application performance, ad verification, talent sourcing, account management, and cybersecurity, to name a few.

The power of an IP proxy network is huge. It is up to each business how to explore it, and how far to harness it to run operations efficiently.


    October 27, 2019  8:49 PM

    Micron Technology Launches Deep Learning Solutions @MicronTech

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Artificial intelligence, Deep learning, Machine learning, Micron, Semiconductor

Micron Technology Inc. has clearly emerged as a global market leader in memory and storage solutions and is the fourth-largest semiconductor company in the world. Its recent acquisition of FWDNXT, a software and hardware startup focused on artificial intelligence technology, gives it a clear-cut edge in the AI domain. With this significant acquisition, Micron gains a comprehensive AI development platform: FWDNXT’s technologies enable Micron to create key building blocks for innovative memory for AI workloads. Soon after the acquisition, Micron introduced a unique and powerful high-performance combination of hardware and software tools for deep learning applications. It is the newly formed combination of advanced Micron memory with FWDNXT’s AI technology that makes it possible to explore deep learning solutions, making data analytics more meaningful, especially in IoT and edge computing. This is a major breakthrough.

    Deep Learning

    Source: Micron Technologies Inc.

Global trends clearly show that companies worldwide are developing, or intend to develop, more complex AI and machine learning systems. That, in turn, increases the demand for and importance of the hardware used to train and run those models. Micron brings memory and compute together with its Deep Learning Accelerator (DLA) technology, and the resulting efficient, high-performance hardware and software solutions based on deep learning and neural networks have become a real advantage for businesses. Micron’s DLA technology is powered by the AI inference engine developed by FWDNXT, gearing Micron up to launch tools to observe, assess, and develop innovation that brings memory and computing together. Overall, it helps Micron enhance performance and lower power consumption. Micron’s DLA technology gives enterprises an easy-to-use, software-programmable platform that works with a wide range of machine learning frameworks and neural networks.


    Micron Executive Vice President and Chief Business Officer Sumit Sadana said, “FWDNXT is an architecture designed to create fast-time-to-market edge AI solutions through an extremely easy to use software framework with broad modeling support and flexibility. FWDNXT’s five generations of machine learning inference engine development and neural network algorithms, combined with Micron’s deep memory expertise, unlock new power and performance capabilities to enable innovation for the most complex and demanding edge applications.”


    October 24, 2019  2:48 PM

    PayGo @SIOSTech Ensures High Availability of SQL Server in AWS Cloud

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Failover Cluster Management, Failover Clustering, High Availability, SIOS, SQL Server

SIOS Technology Corp. has quickly emerged and established itself among the big fish in the global market for high availability and disaster recovery solutions. SIOS solutions are among the top for a simple reason: they keep critical Windows and Linux applications available and eliminate data loss across heterogeneous enterprise environments, whether virtual, cloud, physical, or hybrid cloud. Likewise, SIOS clustering software has become essential for enterprises whose applications need a high degree of resiliency and near-100% uptime, with no compromise on performance or data, protecting the business ecosystem from local failures and regional outages, planned or unplanned. PayGo, the customer featured in the latest announcement from SIOS Technology Corp., ensures high availability of SQL Server in the AWS cloud with SIOS DataKeeper.

SIOS Technology Corp., established in 1999 and headquartered in San Mateo, California, has offices worldwide. SIOS, SIOS Technology, SIOS DataKeeper, SIOS Protection Suite, Clusters Your Way, and associated logos are registered trademarks or trademarks of SIOS Technology Corp. Customers prefer SIOS for dependable operations, high performance, and ease of use, which has made SIOS an industry pioneer in providing IT resilience through intelligent application availability, mitigating a large number of associated risks. PayGo uses SIOS DataKeeper on Amazon Web Services (AWS), running on Elastic Compute Cloud (EC2) virtual servers with solid-state drive (SSD)-only storage. The result is a rapid, automatic failover environment that ensures high availability (HA) for the enterprise’s key business applications on SQL Server. PayGo is an integrated utility payment solution provider that handles the largest energy company prepay ecosystem in the United States.


PayGo currently runs four production environments in the Amazon cloud, with a fifth due to come online anytime. The environment comprises SQL Server 2017 Standard Edition running on Windows Server 2012 R2, with a migration to Windows Server 2019 planned as soon as testing is complete. Chad Gates, Senior Director of Infrastructure and Security, PayGo, says, “Our backend SQL Servers hold terabytes of data that must be available 24×7. As a Windows shop, we prefer to use Windows Server Failover Clustering (WSFC) for data protection and continuous operation in case of any failures. But WSFC requires some form of shared storage, like a storage area network (SAN), and that isn’t natively available in AWS.”

    Chad concludes, “SIOS DataKeeper Cluster Edition overcame the problem caused by the lack of shared storage. Its use of a mirrored drive looks like shared storage to the WSFC. It was exactly what we wanted.” Interestingly, SIOS DataKeeper also met PayGo’s other three criteria better than any other solution. On the results achieved, Chad says, “We have been using SIOS DataKeeper for several years now, and it has proven to be the most rock-solid piece of software we have.”

    PayGo

    Source: SIOS

    Frank Jablonski, VP of global marketing, SIOS Technology says, “Whether you need to protect applications on a physical server, a private cloud, a public cloud or a hybrid cloud, you need to meet the same SLAs for application availability regardless of location. Applications running in clouds also need to be protected against the inevitable cloud outage through the use of availability zones and regions with automated intelligent failover. PayGo is using SIOS to provide a fast, easy way to deploy applications in a high availability environment in the AWS cloud while continuing to use Windows Server Failover Clustering.”


    October 23, 2019  4:38 PM

    PathWave Test 2020 Software Suite @Keysight For Rapid Product Development

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    5G, 5G and IoT, automotive, IIoT, product development

Keysight Technologies is among the global technology pioneers. It empowers enterprises, governments, and service providers to innovate to connect and secure the world. The solutions provided by Keysight enhance network optimization and let customers launch electronic products faster and at lower cost, thanks to Keysight’s design simulation, prototype validation, manufacturing test, and, as stated, optimization of networks and cloud environments. Any enterprise engaged in such activities is well aware of the power of Keysight solutions. Key customer verticals comprise communications, aerospace and defense, energy, automotive, general electronics, and semiconductors, with customers spread across all geographies around the world. In fiscal year 2018, Keysight revenues were around $4B. To accelerate the time-to-market of 5G, IoT, and automotive electronics, Keysight Technologies has launched the PathWave Test 2020 software suite to enable rapid product development.

    PathWave Test 2020 software suite

Source: Keysight Technologies

The PathWave Test 2020 software suite is built on the Keysight PathWave software platform. The suite promises an integrated experience that accelerates time-to-market for digital and wireless platforms and products, which will certainly help electronics manufacturers in a big way. 5G, IoT, and automotive engineers and developers can leverage the PathWave Test 2020 software suite to streamline test data processing and analysis. This further speeds product introductions, helping secure a competitive edge in local or global markets as the case may be. The software enables data sharing and management among multiple platform software tools, including test automation, signal creation, advanced measurement, and signal generation. All this leads to a very useful data analytics ecosystem.


The integrated PathWave Test 2020 software suite lets teams develop and deploy application-tailored solutions that substantially enhance electronic test workflows and product introductions. Jay Alexander, Chief Technology Officer, Keysight Technologies, says,

    “The digital transformation happening today in engineering enterprises relies on accelerating time-to-market using best in class software and hardware. Keysight’s PathWave Test 2020 software suite reflects our commitment to creating powerful software solutions that help our customers streamline their workflows.”


    October 21, 2019  10:26 PM

    Edge Computing: Are We Back To Square One? #EdgeComputing

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Edge computing

Around four decades ago we didn’t have networking. There were standalone personal computers, one each in HR, purchase, finance, and so on. There were no laptops, no mobility. If finance needed some data from the PC in HR, it had to move via floppy disks. Individual computers, individual computing and processing, individual reports. Then came networking and the centralized server for data and computing. Machines started talking to each other. Floppies were still needed, but more for moving data to a computer outside the network. With this, organizations started relying more on computers, and along came business applications, CRM, ERP, and so on. The need for data security and network security arose. Hubs changed to switches, switches to intelligent switches, routers arrived to connect different networks, floppies gave way to CDs, CDs to DVDs, and so on.

    Edge Computing

    Photo credit: jurvetson on Visualhunt / CC BY

The whole game had changed by then. The Internet was a revolution, and so was the cloud. For the current generation, mainframes, magnetic drums, magnetic tapes, and data punch cards are all history; none of them have seen these. They are now into artificial intelligence, the Internet of Things, analytics, Big Data, virtual reality, machine learning, and more. But with the sudden need for edge computing, aren’t we rolling back? It is edge data and edge computing that now provide the most relevant information to businesses. ERPs and CRMs are taking a back seat. They are not hot anymore; they are simply there by default. Business needs have changed. What about the security and safety of edge devices? Computing, processing, encryption: how are these handled at the edge? What about data moving from the edge to data servers? What about the risks of so many things happening on edge devices?


Aren’t we getting more exposed, especially with heterogeneous devices, mobility, BYOD, and the like? What about disaster recovery plans for the edge when edge devices and edge computing are becoming the core lifeline of businesses? Is our focus shifting to wasteful activities, or are we moving in the right direction? Is knowing the coordinates of a mobile device more important than its data? What about the change in the whole paradigm of coding and testing as edge computing becomes prime? Are we landing in more troubled waters or heading to higher maturity levels? If the latter, we are making machines intelligent enough to perform proactive, predictive, and prescriptive analysis. And if all the intelligent work is going to be performed by machines, what will humans do? Is edge computing an ultimate solution, or just a stopgap?


    October 8, 2019  5:19 PM

    Multi-Cloud Intelligent Data Protection Solution @aparavisoftware

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data protection, data retention, File Backup, Multi-cloud, Ransomware

Aparavi delivers hybrid and multi-cloud file backup and long-term retention solutions. Its new solution is designed to address the varied and fast-growing unstructured data loads that organizations are trying to tackle today. Organizations across the globe are not only fighting this issue but also craving the best possible solution for backing up files from central storage devices to cloud endpoints, and that is what Aparavi has delivered. The new solution also includes data awareness for intelligence and insight, with efficient global security, search, and access, ensuring files are protected and available. It is, in essence, a multi-cloud intelligent data protection solution. As we all know, backup has a data-dump problem, and enterprises have struggled to find a solution to it. Aparavi helps solve this issue in a smooth and proven way.

The new multi-cloud intelligent data protection solution from Aparavi is built around a brilliant feature, Aparavi File Protect & Insight℠, which acts as a second line of defense against ransomware. It performs file-by-file data protection and archive for servers, endpoints, and storage devices, ensuring data classification, content-level search, and hybrid cloud retention and versioning. Data awareness covers data classification, metadata aggregation, and policy-driven workflows. Global security manages role-based permissions, encryption both in flight and at rest, and file versioning. Data search and access takes care of anywhere/anytime file access, seamless cloud integration, and full-content search. Until now, these capabilities existed only in bits and pieces across multi-vendor, multi-product setups; no single product performed all of them.
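
To illustrate what policy-driven classification means in practice, here is a small Python sketch that walks a file tree and tags each file with a retention policy by extension and age. The policies and the root path are hypothetical; this shows the general idea, not Aparavi’s actual engine.

import os
import time

# Hypothetical retention policies, keyed by file extension.
POLICIES = {
    ".pdf": "retain-7-years",
    ".docx": "retain-7-years",
    ".log": "retain-90-days",
}
DEFAULT_POLICY = "retain-1-year"

def classify(root: str):
    """Walk a tree and assign each file a retention policy tag."""
    now = time.time()
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            ext = os.path.splitext(name)[1].lower()
            age_days = (now - os.path.getmtime(path)) / 86400
            yield path, POLICIES.get(ext, DEFAULT_POLICY), round(age_days)

for path, policy, age in classify("/data/files"):   # hypothetical root
    print(f"{policy:>15}  {age:>5}d  {path}")

A real engine would classify on content as well as metadata, but even this toy version shows how policy tags, not folder locations, drive what gets archived and for how long.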


That is what takes the Aparavi multi-cloud intelligent data protection solution to the next level, creating a new paradigm of data protection and availability. The benefits are phenomenal: a single view of all data and device status, near-instant data discovery, automated ransomware alerts and defense, multi- and cross-cloud support, and rapid granular recovery. Let’s be very clear about what Aparavi File Protect & Insight does: in a nutshell, it protects data, organizes data, and makes data usable as and when required. Recall the recent Arizona Beverages ransomware attack, in which, even after six months, an estimated 40% of the company’s servers were still operating on old, out-of-date data. What happens to a business under such circumstances is well understood. That is why organizations across the globe need a rugged, proven solution such as Aparavi’s.

    Intelligent Data Protection

    Source: Aparavi

Coming back to the Arizona Beverages ransomware attack to understand the importance of the Aparavi solution: Arizona Beverages has more than 1,000 employees, and more than 200 servers and computers were affected in a targeted attack, with millions of dollars in sales lost per day. So what matters more: closing your eyes and waiting for doomsday, or getting the best solution available with a small investment, one that ensures the organization’s finances and reputation are never compromised later? In this case, the Arizona Beverages network was hacked and encrypted. A second line of Aparavi protection would have provided a mountable archive to restore data quickly, either file by file or the entire contents of a protected location.


The Aparavi multi-cloud intelligent data protection solution ensures files under management can be restored based on need and left in the archive if the data is not required for the resumption of daily operations. It provides a secondary, immutable copy of data from servers and endpoints. Finally, it adds intelligent classification so that at-risk data can be managed more securely rather than stored on potentially targeted local storage. Aparavi FPI certified clouds include Amazon Web Services (AWS), IBM Cloud, Google Cloud Platform, Scality, Wasabi, Oracle Cloud, Cloudian, Caringo, Backblaze, and Azure. Aparavi FPI can solve problems like ransomware recovery, endpoint and ROBO protection, data retention and archive, compliance and governance, and storage optimization. Can you name another single solution on the market with the same capabilities?

    Jonathan Schwam, Principal Architect, Core82 Inc. says, “The business driver for selecting Aparavi was to absolutely, positively ensure that we had immutable data for the time required.”


    October 8, 2019  1:13 PM

    Micron Global Development Centre Launch in Hyderabad @microntech

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Memory, Micron, NAND, SSD, Storage

The grand opening of the Micron Global Development Centre in Hyderabad by Micron Technology Inc. clearly shows the company’s confidence in the local talent pool. Present were prominent top officials, including Dee Mooney, Executive Director, Micron Foundation; Sanjay Mehrotra, President and Chief Executive Officer, Micron Technology Inc.; April Arnzen, Senior Vice President of Human Resources, Micron Technology Inc.; and Jeff VerHeul, Senior Vice President, Nonvolatile Engineering, Micron Technology Inc. Their presence was more than enough to indicate Micron’s seriousness about such a large-scale launch in India, and in Hyderabad at that, a fast-emerging technology hub. This is Micron’s second footprint in India; it already has an operations site in Bengaluru, launched in April 2019. Expansion at such a fast pace in India is a real credit to both Micron and India. There was a high level of enthusiasm and energy all around the huge floor.

The Micron Global Development Centre in Hyderabad aims to create a talent pool in diversified fields to achieve faster breakthroughs in the latest emerging technologies like artificial intelligence, machine learning, the Internet of Things (IoT), and emerging memory. Interestingly, Micron already has more than 40,000 patents to its credit. With around 700 employees currently at the newly launched centre, the aim is to grow that strength to 2,000 in the next year or so. Micron is a well-established world leader, with manufacturing plants meeting global standards in six countries, including the United States, Japan, Malaysia, China, and Taiwan. In India the focus is mainly on research and development, with concrete plans to play a key role in developing the technologies behind breakthroughs in areas like machine learning and artificial intelligence.


    Guests of honor at the launch of Micron Global Development Centre included KT Rama Rao, Minister of IT, Industries and Municipal Administration & Urban Development, Government of Telangana; Amitabh Kant, CEO, National Institute of Transforming India (NITI) Aayog; and S K Joshi, Chief Secretary, Government of Telangana. On the launch, Sanjay Mehrotra said,

    “We’re delighted to launch our Global Development Centre in Hyderabad and expand our team of engineers, researchers, developers, and IT specialists. Leveraging global talent, like our new Hyderabad team members, helps us drive innovation and stay at the forefront of emerging memory technologies.”

With the help of a highly educated and talented workforce recruited through India’s premium technology institutes, Micron aims to set very high standards in serving its operations across the globe. The hunt for the best talent is always a top priority for Micron.


The skill set at the Micron Global Development Centre Hyderabad spans information technology functions and engineering, with expertise in designing and developing high-quality, cutting-edge memory and storage solutions. To get the best talent, Micron has tied up with some of the best institutes in India and has set up ultra-advanced labs in some of them to groom talent in the right direction. Ultimately, there has to be a great team of engineers, developers, and researchers innovating to handle the complexity of the organization’s global operations network, which includes procurement, supply chain, manufacturing, packaging, test & assembly, quality, and information technology. Micron already has a number of global centers of excellence with innovative achievements in design and product engineering for next-generation memory and high-value storage solutions, areas in which it is already a world leader.

A podcast of the launch can be accessed here.


    October 5, 2019  10:31 PM

    Analytics on Data Lake Storage Goes 70x Faster With Dremio 4 @Dremio

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Analytics, Big Data, Data lake

What matters most in analytics is speed of execution: gaining data insights in near real time so you can make quick, accurate decisions and excel in business. Otherwise, all those investments in data, storage, software, and hardware fetch no real value. So the key component is speed. How do you achieve it? You need to watch the breakthroughs happening across the globe, and one of the most significant in this regard is Dremio’s Data Lake Engine, Dremio 4.0. It enables breakthrough speed for analytics on data lake storage. Mind you, this is no simple achievement; it is something that could change the whole performance paradigm of data lake engines. It is a new landmark that makes Dremio stand apart from other vendors.

Dremio 4.0 significantly enhances performance and provides a self-service semantic layer for data in ADLS, S3, and other data sources. Dremio is one of the pioneering data lake engine companies, and the recent announcement of its data lake engines for AWS, Azure, and hybrid cloud adds a new chapter to its achievements. Dremio 4.0, an open-source platform, includes advanced columnar caching, predictive pipelining, and an altogether new smart execution engine kernel that delivers up to a 70x increase in performance. That is phenomenal. What it brings is flexibility and control for data architects and a self-service model for data consumers. Dremio empowers companies to operationalize data lake storage like ADLS and S3; data becomes easier to consume while delivering the interactive performance users strive for. The Dremio 4.0 data lake engine provides ANSI SQL capabilities, including complex joins, large aggregations, and sub-selects.
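
As a minimal sketch of the kind of ANSI SQL such an engine accepts, the query below combines a join, an aggregation, and a sub-select, submitted here over a generic ODBC connection. The DSN, table, and column names are hypothetical assumptions for illustration, not a specific Dremio deployment.

import pyodbc  # assumes an ODBC driver and a DSN named "Dremio" are configured

QUERY = """
SELECT c.region,
       COUNT(*)      AS orders,
       SUM(o.amount) AS revenue
FROM   orders o
JOIN   customers c ON c.id = o.customer_id
WHERE  o.amount > (SELECT AVG(amount) FROM orders)   -- sub-select
GROUP  BY c.region
ORDER  BY revenue DESC
"""

with pyodbc.connect("DSN=Dremio", autocommit=True) as conn:
    for region, orders, revenue in conn.execute(QUERY):
        print(region, orders, revenue)

The point of a semantic layer with full ANSI SQL is exactly this: the same query works whether the underlying files sit in S3, ADLS, or elsewhere.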


    Ivan Alvarez, IT vice president, big data and analytics, NCR Corporation says, “We process hundreds of thousands of transactions on a daily basis and produce insights based on those transactions; this type of capability requires sophisticated and scalable data platforms. Dremio is working with NCR to solve the integration between traditional enterprise data warehouse and scalable distributed compute platforms for big data repositories. This integration allows NCR to also cross-pollinate data engineering knowledge among platforms and most importantly to deliver faster data insights to our internal and external customers.”

    Mike Leone, senior analyst, Enterprise Strategy Group says, “Organizations recognize the value of being able to quickly leverage data and analytics services to further their data-driven initiatives. But it’s more important than ever to start with a strong data foundation, especially one that can simplify the usage of a data lake to enable organizations to maximize data availability, accessibility, and insights. Dremio is addressing this need by providing a self-sufficient way for organizations and personnel to do what they want with the data that matters, no matter where that data is, how big it is, how quickly it changes, or what structure it’s in.”

What makes Dremio special are features like Columnar Cloud Cache (C3), column-aware predictive pipelining, Gandiva GA, single sign-on with Azure AD, and advanced AWS security. Tomer Shiran, co-founder and CEO, Dremio says, “Dremio’s Data Lake Engine makes queries on data lake storage extremely fast so that companies no longer have to move data into proprietary data warehouses or create cubes or extracts to get value from that data. We’re excited to announce new technologies – like our Columnar Cloud Cache (C3) and Predictive Pipelining – that work alongside Apache Arrow and the Dremio-developed Gandiva kernel to deliver big increases in performance.”
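
As a toy illustration of the columnar-caching idea (the general pattern, not Dremio’s C3 implementation), here is a small LRU cache in Python keyed by (file, column, chunk), so hot column chunks are served locally instead of being re-read from S3 or ADLS:

from collections import OrderedDict

class ColumnChunkCache:
    """Toy LRU cache for column chunks read from object storage."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._data: OrderedDict = OrderedDict()

    def get(self, file_id: str, column: str, chunk: int):
        key = (file_id, column, chunk)
        if key in self._data:
            self._data.move_to_end(key)   # mark as recently used
            return self._data[key]        # cache hit: skip object storage
        return None                       # cache miss: caller reads S3/ADLS

    def put(self, file_id: str, column: str, chunk: int, payload: bytes):
        self._data[(file_id, column, chunk)] = payload
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict least recently used

cache = ColumnChunkCache(capacity=10_000)
cache.put("sales.parquet", "amount", 0, b"...chunk bytes...")
assert cache.get("sales.parquet", "amount", 0) is not None   # hit

Caching per column rather than per file is what makes this columnar: a query touching only two columns never pulls the rest of the file into the cache.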


The latest release of Dremio can be downloaded from here. To get more insights, you can visit Dremio’s official blog.


    October 3, 2019  11:32 AM

    How to Improve and Lower the Cost of Distributed Enterprise Edge Data Management and Security

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Distributed applications, Distributed computing, Distributed storage, Distributed system, Edge computing, edge devices, edge security

    The distributed enterprise model continues to grow exponentially, driven both by strategic planning and by default. That is, more and more organizations consist of a centralized headquarters datacenter with in-house IT expertise, together with remote, smaller-footprint sites where users and technology are located but which oftentimes do not have the same level of onsite IT know-how, if any at all.

    This distributed enterprise model has various versions – ranging from a corporate office having subsidiary locations, such as a large retail brand with numerous retail stores distributed globally in malls and other shopping centers; to a large oil and gas or fast food conglomerate having numerous franchisee locations; to financial institutions with distributed ATM kiosks. Likewise, government agencies also rely upon a compute model in which there is a centralized command location, with intelligent mobile devices and internet of things (IoT) solutions in the field to help achieve their missions.

    What all of these distributed enterprises have in common is that, whether they know it or not, their success has become unavoidably tied to their ability to successfully deploy, manage, and protect an edge computing infrastructure. Naturally, for the most part, at the HQ level where IT expertise is typically based, this fact is well known. However, at the remote locations where IT is not their core competency – nor should it be – this fact may be less recognized.

    I spoke recently with Cybera’s Andrew Lev, CEO; Bethany Allee, Executive Vice President of Marketing; and Paul Melton, Senior Vice President and General Manager, Petroleum for their perspective on this topic. Cybera delivers a secure software-defined wide area network (SD-WAN) edge for remote sites and IoT via the cloud. While Cybera stated they have over 90,000 customers in 23 countries, for the purposes of our discussion much of the focus was on its oil and gas customers (gas stations and convenience stores) in North America. This industry is a prime example of edge computing opportunities and challenges being faced.

    Lev explained that EMV Chip Card Compliance is on the minds of most in this industry (anyone who takes payment via credit card should be concerned). Gas stations are among those with an even greater challenge, as they must ensure compliance inside the stores and outside at the pumps, where payment is also accepted. While compliance will help eliminate credit card fraud and protect owners from liability, preparing for it requires additional technology and services across applications and the network, which can be very complicated and expensive to properly deploy, manage, secure, and protect. It is estimated that over 700,000 pumps in North America alone need to go through this transformation.

    Melton added that, in addition to payments, other transactions are now taking place at the pump. Fuel pumps have become IoT devices that do a great deal more than just dispense fuel and monitor fuel levels. Loyalty and reward programs, media, and lottery, as well as other products and services, are all enabled at the pump. And while the bandwidth requirements are not significant, billions of transactions can take place on a monthly basis, at a high frequency, and they may include personal, financial, purchase, or other private information. Making it more complicated, these transactions may need to be processed locally by the gas station/convenience store, transmitted back to a corporate office, and/or sent to an outside third-party vendor in a different part of the country or world. Undoubtedly, even a moment of downtime or a breach in security can have disastrous consequences.

    Melton explained that Cybera technology helps enable these transactions – data transmissions – to take place in a highly reliable and secure manner by incorporating edge and cloud systems. And while more and more customers are using broadband, not all broadband is created equal. Cybera delivers a certified appliance that ensures a failover network over wireless (LTE service), which maintains uptime even if the customer's service is interrupted. The Cybera appliance has network segmentation and security built in. Cybera enables the network flow that provides partners/customers with better visibility and control to deliver the products and services intended. In addition, Cybera's cloud-based SD-WAN edge appliance has orchestration built into the system. It is able to orchestrate all of the applications and ensure they are performing well together. This is important for privacy and security. Cybera segments out each application and creates a virtual application network for each, fully segmented away so that individual application traffic never crosses. It is also critical for performance prioritization. For instance, payments must be processed immediately, whereas a carwash transaction can wait. Cybera leverages its in-depth industry experience while letting the individual gas station/convenience store operator set the priorities – critical over non-critical, those that have tolerance for delay versus those that have zero tolerance. Cybera can do all of this via one box with embedded security backed by 24×7 support, delivering a mature multi-tenant environment (versus the five or six boxes with wires everywhere that were likely there previously). I found these points very interesting. And I believe these capabilities are quite unique to Cybera.
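
    To illustrate the prioritization idea in the abstract – this is a conceptual sketch, not Cybera's implementation, and every application name and priority value is hypothetical – the snippet below uses a priority queue so that payment traffic is always dispatched before lower-priority transactions such as a carwash code.

```python
import heapq
import itertools

# Lower number = higher priority; payments always jump the queue.
PRIORITY = {"payment": 0, "loyalty": 1, "lottery": 2, "carwash": 3}
_counter = itertools.count()  # tie-breaker keeps FIFO order within a priority
_queue = []

def enqueue(app, payload):
    heapq.heappush(_queue, (PRIORITY[app], next(_counter), app, payload))

def dispatch_all():
    while _queue:
        _, _, app, payload = heapq.heappop(_queue)
        print(f"sending {app} transaction: {payload}")

enqueue("carwash", {"code": "WASH-42"})
enqueue("payment", {"card": "****1234", "amount": 38.50})
enqueue("loyalty", {"member": "A-9"})
dispatch_all()  # payment first, then loyalty, then carwash
```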

    Allee also discussed the important fact that one of the biggest challenges Cybera, as well as other vendors, faces is the seamless integration of the numerous technologies, products, and services in the fuel industry ecosystem. However, given Cybera's years in the industry, it has the benefit of knowing, and having forged relationships with, the majority. And for those they do not already know, the operator/customer makes introductions. Collaboration is key between what can be hundreds of various technologies, applications, and services. These other vendors recognize they are not in the networking business. Via these relationships, the conflict that might otherwise easily arise is eliminated. And customers can accelerate the safe and seamless deployment of new technologies and digital transformation.

    Lev concluded by explaining that Cybera was delivering SD-WAN before the industry even had a name for it. There are about 100 companies that call themselves SD-WAN vendors. However, almost all of the others are focused on the large enterprise (and at a much higher price point), which has very different requirements and challenges than the highly distributed customers Cybera serves. Whether in oil and gas, retail, quick-service restaurants (QSR), healthcare, financial services, or kiosks, for the distributed enterprise – brick and mortar, with smaller remote locations at the edge – deploying, managing, and protecting the technology and applications that deliver a better customer experience, increased revenues, and enhanced profit margins has historically been highly complex. With Cybera, it appears it doesn't have to be.

    Cybera’s website can be found at https://www.cybera.com/ should you wish to learn more.


    September 30, 2019  10:52 PM

    Building a Columnar Database on RAMCloud @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Database Management Systems, Shared storage

    Building a Columnar Database on RAMCloud: Database Design for the Low-Latency Enabled Data Center (In-Memory Data Management Research) by Christian Tinnefeld

    Excerpt as on Amazon.com

    This book examines the field of parallel database management systems and illustrates the great variety of solutions based on a shared-storage or a shared-nothing architecture. Constantly dropping memory prices and the desire to operate with low-latency responses on large sets of data paved the way for main memory-based parallel database management systems. However, this area is currently dominated by the shared-nothing approach in order to preserve the in-memory performance advantage by processing data locally on each server. The main argument this book makes is that such a unilateral development will cease due to the combination of the following three trends: a) Today’s network technology features remote direct memory access (RDMA) and narrows the performance gap between accessing main memory on a server and of a remote server to and even below a single order of magnitude. b) Modern storage systems scale gracefully, are elastic and provide high-availability. c) A modern storage system such as Stanford’s RAM Cloud even keeps all data resident in the main memory. Exploiting these characteristics in the context of a main memory-based parallel database management system is desirable. The book demonstrates that the advent of RDMA-enabled network technology makes the creation of a parallel main memory DBMS based on a shared-storage approach feasible.

    About the Author
    Dr. Christian Tinnefeld received his B.Sc. and M.Sc. degrees from the Hasso Plattner Institute at the University of Potsdam, Germany, where he also pursued his doctoral studies. His main research interests are In-Memory Databases and Cloud Computing. In the former area, he has been collaborating with SAP for six years and contributed to initial concepts of the SAP HANA database. In the latter area, Christian has been collaborating with the RAM Cloud Project at Stanford University, California, the USA for three years.


    September 30, 2019  10:33 PM

    Forsa For Developers With Optane by @FormulusBlack @packethost

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    DRAM, Intel

    Forsa is a memory-based storage and virtualization system created by Formulus Black. The solution is powered by the latest in memory technology, Intel's Optane DC persistent memory. All this runs on Packet's cloud platform. That means three giants in their respective fields – Intel, Formulus Black, and Packet – have jointly created a complete solution for developers to test, validate, and optimize their data-intensive and real-time application workloads on Forsa, running with best-in-class memory on the cloud. This collaboration to deliver a persistent-memory-powered infrastructure solution to application developers across the globe is something developers have perhaps been craving for a long time: a proven system on which to test, validate, and optimize data-intensive and real-time workloads. This memory-based storage and virtualization solution is going to change the whole paradigm.

    Forsa

    Today's developers and IT leaders are surrounded by data-intensive and latency-sensitive applications. Until now, they didn't have a strong platform like Forsa to match their talent and business demand. Business applications today need to access limitless data. At the same time, while data volumes are increasing exponentially, businesses cannot spare a moment without full access to their applications and data. Results are required at jet speed. That is how the need arose, and venture-backed startup Formulus Black and Intel decided to collaborate to develop high-performance infrastructure solutions powered by persistent memory. Forsa is server-based software designed by Formulus Black that delivers Intel Optane DC Persistent Memory with improved performance, availability, usability, and capacity. This is the first environment of its kind designed for developers.

    Intel Optane DC persistent memory has the power to enhance the performance of data-intensive applications relative to NAND-based SSDs. How it does that is quite interesting: it shortens the traditional data path between memory and peripheral storage. That's not all. Forsa goes several steps further. It boosts the performance of Optane DC persistent memory through algorithms that optimize I/O between CPU and memory. In a nutshell, it does some fantastic jobs like decreasing CPU usage, delivering more TPS/IOPS, and lowering latency under maximum load conditions.
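
    To picture what a shortened data path means in practice, here is a conceptual stand-in – my sketch, not Formulus Black's API: a memory-mapped region is read and written byte-addressably, with no read()/write() syscalls on the hot path, and the bytes survive the process, loosely analogous to how persistent memory removes the block-storage round trip. The file path is a placeholder.

```python
import mmap
import os

PATH = "/tmp/pmem_demo.bin"   # a real PMEM device would be exposed by the OS,
SIZE = 4096                   # e.g. as a DAX-mounted file; this is a stand-in

# Create a fixed-size backing region.
with open(PATH, "wb") as f:
    f.write(b"\x00" * SIZE)

fd = os.open(PATH, os.O_RDWR)
region = mmap.mmap(fd, SIZE)

region[0:13] = b"hello, optane"   # store-like write, byte-addressable
region.flush()                    # ask the OS to persist the mapped bytes

print(region[0:13].decode())      # load-like read straight from the mapping
region.close()
os.close(fd)
```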

    Jing Xie, Chief Operating Officer at Formulus Black says, “We believe Intel Optane is the leading persistent memory technology in the market today and we are excited to provide software that enhances its usability, performance, and TCO for private and hybrid cloud platforms like Packet. By combining our respective capabilities, we are making it easier for developers and IT leaders to adopt persistent memory solutions and supercharge the performance of databases, ERP systems, and custom-built data-intensive applications.”

    Optane DC persistent memory is tremendously powerful in comparison to traditional DDR4 DRAM. It has the capability of offering higher system capacity at a lower total cost per gigabyte. It also supports data persistence. What that means is that data can be stored in memory in a very cost-effective manner and then utilized for faster analytics processing and query performance as and when required. This enables it to deliver results in a real-time environment so that the business gets the best out of the technology.

    Alper Ilkbahar, vice president of Non-volatile Memory and Storage Solutions at Intel says, “As Intel Optane DC Persistent Memory scales in the market, the industry is embracing the technology as truly transformational. The evolving software ecosystem using Optane DC persistent memory, such as Formulus Black’s Forsa, will only accelerate the pace of innovation for developers and end-users.”

    Jacob Smith, Chief Marketing Officer and co-founder of Packet says, “Due to unmatched efficiency, the number of workloads that benefit from high-performance memory-based storage is expanding rapidly. The combination of Intel Optane and Formulus Black’s Forsa is exactly what many of our most demanding users are looking for.”

    For a free trial of Forsa on Packet, please sign up at https://www.packet.com/about/formulus-black/


    September 24, 2019  11:14 PM

    Software-Defined Interconnect Platform Luxon @Stateless_Inc

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Interconnect

    Think of a software-defined interconnect platform that has infinite interconnection service chains. What that means is that new interconnects are easily created using any available combination of security services, routing, data encryption, and so on. What if it had increased connectivity possibilities? That means that, with the help of additional connection protocols, it becomes easy to integrate into existing networks. What if there were new switch technology that opened more configuration possibilities? With the help of cutting-edge switching software, software control extends to replace customized hardware. Then there is multi-tenant support. Once you deploy that software-defined interconnect platform in an enterprise or a service provider, end users are empowered to manage their own interconnects, scaling up to any extent. Now, come out of that thinking mode and get into experience mode. You can experience all of this with Luxon by Stateless.
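
    To make "composable service chains" concrete in software terms – a conceptual sketch only, not Stateless's API, and every function here is hypothetical – the example below treats each network function as a step, and a chain as any ordered combination of steps:

```python
def firewall(packet):
    if packet.get("port") == 23:   # e.g. drop telnet
        return None
    return packet

def encrypt(packet):
    packet["payload"] = packet["payload"][::-1]   # toy placeholder "encryption"
    return packet

def route(packet):
    packet["next_hop"] = "10.0.0.1"
    return packet

def run_chain(chain, packet):
    for step in chain:
        packet = step(packet)
        if packet is None:         # dropped somewhere in the chain
            return None
    return packet

# New interconnects are just new combinations of the same building blocks.
secure_chain = [firewall, encrypt, route]
plain_chain = [firewall, route]

print(run_chain(secure_chain, {"port": 443, "payload": "hello"}))
print(run_chain(plain_chain, {"port": 23, "payload": "telnet"}))   # -> None
```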

    With its Software-Defined Interconnect platform, Stateless reinvents connectivity. It creates a new landmark in connectivity. This groundbreaking technology delivers simple yet powerful, composable, scalable connectivity for any cloud-based application environment. The future of connectivity is here. It shifts the whole paradigm of connectivity to software. The whole game is changed. The era of buying new hardware for every scale-up, every new application, and every new network is on the verge of extinction with the launch of the Software-Defined Interconnect platform Luxon by Stateless. Network connectivity is reinvented in a new avatar. Luxon, now generally available, is the industry's first software-defined interconnect (SD-IX) platform. Luxon's patent-pending technology makes interconnecting workloads simple, composable, and scalable with its expanded capabilities. That is a revolution. And it is available.

    Software-Defined Interconnect Platform

    Enterprises are going to save a huge amount on hardware that is no longer required because of the software-defined interconnect platform. Device-centric methods of constructing infrastructure are no longer required and can easily be discarded. This is possible with Luxon's expanded platform, which empowers businesses and colocation providers to create any number of interconnections cost-effectively, simply, and quickly. With this new software-defined interconnect approach, customers can streamline their network operations without any capital expenditure.

    Source: Stateless

    Craig Matsumoto, Senior Analyst, 451 Research says, “As IT distributes outward to the cloud, businesses are becoming even more dependent on connectivity. What used to be the enterprise network now reaches out to multiple destinations, serving applications that have differing requirements. Stateless is simplifying connectivity by providing a kind of flexibility and control that isn’t necessarily found in conventional physical and virtual devices.”

    Murad Kablan, CEO and co-founder, Stateless says, “The accelerated movement of applications to the cloud is creating a new era of connectivity. To maintain a competitive advantage, businesses require fast, easy connections to the cloud, support for numerous interconnects and the ability to deploy a diverse array of new applications quickly. Today it often requires intricate engineering and new hardware and takes months for a business to deploy a single cloud application. Luxon simplifies a traditionally complex, multi-step process into a single step that takes only minutes.”

    To learn how Luxon works, watch this video:


    September 23, 2019  11:24 PM

    Keysight World India 2019: An Insightful Event @keysight

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    5G, Testing

    This is the concluding post of the 2-post series covering Keysight World India 2019. The first post can be accessed here. The event was held in two major technology cities of India – Bangalore and New Delhi.

    What Keysight is doing is phenomenal indeed. The organization has expertise in providing an excellent environment to industries and industry leaders, and in inspiring, enabling, and accelerating the realization of their innovations. They are, in fact, enablers in the real sense. It's like catalyzing the whole electronic and semiconductor ecosystem in a very synchronized and orchestrated manner. This is the only company in the world that manages the electronic and semiconductor value chain while addressing test and measurement challenges and enriching it with real-world solutions at such a large scale. It helps industries make their dreams come true by providing them with the means to realize those dreams. The global trends have changed completely when it comes to success. The whole mechanism has become very dynamic, demanding, and challenging, asking for an agile and integrated approach to innovation.

    Keysight World India

    When there is a demand to deliver something really big, it needs a perfect combination of technology, partners, collaboration, inspiration, and solutions from around the world. That's the core strength of Keysight Technologies, and that is the driving idea of Keysight World India. Keysight Technologies is a world leader in electronic measurement, offering a complete stack of test and design solutions to manage the entire lifecycle. Think of any area – simulation, R&D, design verification and pre-conformance, conformance, manufacturing, deployment, or service assurance – and Keysight strives to be the first choice.

    Sudhir Tangri, VP & Country General Manager India at Keysight Technologies says,

    “Since customers play an integral role in the success of Keysight, Keysight World helped in bringing our customers together and served as a platform for them to network and exchange ideas. At Keysight, we believe that these seminars add value to our customers and their businesses.”

    Sandeep Kapoor, Director Marketing (Europe, MEA and India), Keysight Technologies says,

    “The program was designed to provide experiences through real-world product demonstrations, technical presentations, and networking among industry experts. Keysight World was a great platform to learn from industry leaders and technical experts about the latest industry directions and technologies. We saw a lot of interest from our customers and industry experts in Keysight World. We had a strong panel of speakers who shed light on innovative solutions and topics that are key for our stakeholders.”

    The key focus areas of Keysight World India are 5G Technologies, Aerospace and Defense, Automotive & Energy, High-Speed Digital & Data Center Infrastructure, and Network & Security Test. There's a complete paradigm shift in all these industry segments, with high demand to innovate and get to market faster. Topics covered in the sessions included keeping up with network data speeds (400G and beyond), test considerations for next-generation WiFi, and automation, to name a few. To listen to the global experts who spoke at Keysight World India, you can switch to the podcast here. These experts include:
    Sandeep Kapoor, Director Marketing (Europe, MEA, and India)
    Donna Majcen, Vice President, Global Field Marketing
    Angel
    Vince Nguyen, General Manager – Aerospace & Defense Business
    Michael Raser, Business Development Director, Automotive & Energy Solutions
    Brigham (Brig) Assay, Director of Strategic Planning, Internal Infrastructure Solutions
    Eric Taylor, Vice President, Managed Services Division


    September 23, 2019  11:08 PM

    Keysight World 2019: A Golden Chance To Meet The Leaders @keysight

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Testing

    Technology is changing very fast. To embrace it and use it to its fullest, an enterprise needs to deploy it very fast. That deployment has to be not only fast but foolproof too. We all know the pain of leakages discovered at a later stage: they can be crippling for the business, leading to financial and reputational losses. To make any technology workable and deployable for businesses and enterprises, companies like Keysight Technologies play a significant role. They ensure your technologies deploy well and work well. For instance, 5G by itself has no value unless you have the knowledge and technology to deploy it and ensure it works for your business the way you want. It was a golden chance to witness most of the business and technology leaders of Keysight Technologies, handling various verticals, in one place.

    Keysight World

    It was Keysight World 2019 in India that concluded recently, hosted by Keysight Technologies. Keysight Technologies is one of the technology companies most sought after by enterprises, service providers, and governments for one reason: to accelerate innovation to connect and secure the world. Keysight World 2019 was hosted in two cities of India, Bangalore (now Bengaluru) and Delhi (the capital city of India). It was the right occasion to listen to most of the Keysight thought leaders and learn about their technological advances, breakthrough design/test/optimization, strategies, and leading-edge solutions. Keysight Technologies provides solutions on a global scale. Most of their solutions are market firsts: unique, innovative, and trendsetting. Every solution is a landmark in itself. It seems as if they have only one competitor globally: themselves, striving for the next level of excellence beyond their existing, unbeatable levels.

    Keysight World 2019

    We will continue the discussion about Keysight World 2019 in the next post.


    September 18, 2019  4:56 PM

    TNQ InGage: A Born Leader In Augmented and Virtual Reality @tnqingage

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    AR, augmented reality, iot, mixed reality, MR, Virtual Reality

    When two masters of their respective fields join hands, it means something unique is going to happen in the industry. That is what happened when Vijay Karunakaran from InGage and Abhigyan Arun from TNQ Technologies joined hands to create a new joint venture named TNQ InGage. While Vijay Karunakaran has ample experience in augmented and virtual reality, Abhigyan Arun has vast experience in content creation on the global front. Everybody in technology knows the opportunities lying in the AR and VR business in almost every industry segment. That is why TNQ Technologies decided to invest USD 2 million in TNQ InGage. InGage, on the other hand, has extensive expertise in immersive technologies and IP. The core focus of TNQ InGage lies in developing immersive technology solutions with the help of augmented reality, virtual reality, and mixed reality – that is, AR, VR, and MR.

    Integration with IoT and analytics will bring a new color to these solutions for industries like construction, manufacturing, and healthcare. The key services include immersive training simulators using virtual reality & haptics, field service tools using augmented reality & IoT, and digital experience centers for Industry 4.0.

    TNQ InGage

    Abhigyan Arun, CEO, TNQ Technologies says, “TNQ Technologies is a 21-year-old, 2,500 people-strong company focussed on publishing technologies and services. We see the future of content consumption leveraging immersive AR and VR environments, and have made this investment in TNQ InGage as part of our long-term strategy. The extensive work InGage has done over the last 6 years under Vijay’s leadership, and the strong relationships with prestigious customers and partners make me confident that we have the right ingredients to establish ourselves as a leader in this technology and industry. TNQ Technologies will continue to focus on its existing publishing client base and will extend the AR/VR expertise of TNQ InGage to them as a new service line.”

    The global AR and VR market is expected to touch $100 billion in the next 5 years. In India, the current focus of the government is on skilling and development, adoption of Industry 4.0, and improved infrastructure supporting the latest forms of content delivery. Keeping this in mind, TNQ InGage plans to focus its investments on product R&D, with a key focus on improving human interaction with virtual objects with the help of haptics and photorealistic rendering. The ultimate goal of the newly formed joint venture is to develop state-of-the-art, world-class immersive VR products and services for the healthcare and industrial verticals, keeping the areas of training and rehabilitation as a top priority.

    Vijay Karunakaran, Founder & CEO, TNQ InGage says, “Over the last 6 years we have delivered over 500 projects, generated extensive IP, and built a great team. TNQ Technologies’ investment as a long-term strategic partnership will help us with growth and scalability. We are already investing in R&D to improve human interaction with virtual objects. Our plan is to develop products and services that will have a meaningful impact on how people train, work and live. We are also looking at expansion in the global market, including the book-publishing world, by leveraging TNQ’s existing capabilities and competencies.”


    September 18, 2019  4:21 PM

    Dr. Kurt Lauk, prominent business leader joins Tachyum @tachyum

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    "Angela Merkel", processor

    Dr. Kurt Lauk is a well-known personality on the global front. He is a prominent business leader, former chairman of the Economic Council (Wirtschaftsrat) in Germany, and former advisor to Angela Merkel. He joins Tachyum's Board of Advisors. Dr. Lauk has a tremendous record of serving on various management boards, like those of Daimler and Audi. He has been a member of the EU Parliament. He was Chairman of the Economic Council for 15 years, and in that capacity he advised Angela Merkel. He is also the founder and, since 2000, President of Globe CP GmbH, a personally owned family office. Dr. Lauk's joining Tachyum's Board of Advisors will add huge value to the company. He comes with valuable insights regarding European industries and carries a deep understanding of European manufacturers and suppliers.

    Also, Dr. Kurt Lauk is an expert in international strategy with exhaustive experience in bringing technology innovations to market. Dr. Lauk has deep-rooted experience in European business, for which the credit goes to his positions as Member of the Board of Management and Head of the Worldwide Commercial Vehicles Division of Daimler Chrysler from 1996 to 1999. Another contributor is his tenure as Deputy Chief Executive Officer (Dy CEO) and Chief Financial Officer (CFO) of Audi AG from 1989 to 1992. Dr. Lauk is currently serving as a director on the supervisory boards of Magna Inc (Toronto) and Fortemedia (Cupertino). Nomura, Asia's global investment bank, appointed him Chairman of its Germany and Austria Advisory Board in July 2018. Since 2000 he has been a Trustee of the IISS (International Institute for Strategic Studies) in London.

    Previously he held the position of the chief executive officer at Zinser Textile Machinery GmbH. His other earlier positions include Vice President/Consultant/Partner at Boston Consulting Group and Chief Financial Officer & Controller for VEBA AG which is now known as E.ON SE. Dr. Lauk holds a Graduate degree from Ludwig-Maximilians-Universität München, an MBA from Stanford Graduate School of Business, and a doctorate in international politics from the University of Kiel. His rich experience includes serving as a Member of the European Parliament from 2004 to 2009, including as a member of the Economic and Monetary Affairs Committee, and as a Deputy Member of the Foreign and Security Affairs Committee.

    Dr. Radoslav Danilak, Tachyum’s Founder and CEO says, “Tachyum is honored to have an international business and technology luminary of Dr. Lauk’s caliber on our Board of Advisors. I am personally excited to work directly with Kurt, to be able to rely on his advice and counsel, especially regarding EU business strategies and how Tachyum can help facilitate EU technological sovereignty.”

    Adrian Vycital representing Tachyum’s lead investor IPM says, “Today, Slovakia is the world’s largest per-capita car producer. I look forward to working with Tachyum and Dr. Lauk to identify Prodigy value-added insertion points within Industry 4.0 and the automotive sector. We believe Tachyum will play an essential role in creating an innovation ecosystem, which is a key factor in the transformation of the Slovak economy from manufacturing to knowledge-based.”

    Tachyum's flagship product, codenamed Prodigy, is the world's first Universal Processor. Its development is on track. With its high performance and ultra-low power consumption, Prodigy offers industry-leading performance for data center, AI, and HPC workloads. These exceptional properties gel well with 5G. That means Prodigy promises to bring a paradigm shift for telecom industry players. It will bring AI from cell-tower-mounted data centers to power intelligent IoT devices and autonomous systems. Tachyum plans a 7nm tapeout next year, with volume production beginning in 2021. It is important to understand that Tachyum's Prodigy Universal Processor Chip is the world's smallest and fastest general-purpose 64-core processor developed to date. Amazingly, it requires 10x less processor power and reduces processor cost by 3x.

    Tachyum's immediate plans include enabling a 32-Tensor-Exaflop supercomputer as well as building machines more powerful than the human brain by 2021. It is a great feat that Tachyum is set to achieve years ahead of industry expectations. Prodigy promises to reduce data center TCO (annual total cost of ownership) by 4x with the help of its disruptive processor design and a smart compiler that makes many parts of the hardware found in a typical processor redundant. All this results in fewer transistors and fewer, shorter wires, because of a smaller and simpler core, translating into much higher speed and power efficiency for the Prodigy processor.


    September 17, 2019  3:46 PM

    Customer Statements About Next-Gen Zoho One @Zoho

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Business transformation

    This is the last post in the series covering what customers have to say about Next-Gen Zoho One, along with the pricing part. The first post in the series is Next-Gen Zoho One Can Transform Enterprise Journey @zoho. The second post is Business Transformation Is The Mantra of Next-Gen Zoho One @Zoho, and the penultimate one is Key Features of The Bold New Operating-System Services @Zoho. Now, let us first see what a few of the prestigious customers of Next-Gen Zoho One have to say about it and how they transform their businesses with it.

    Customer Speaks:

    Pranesh Padmanabhan, Founder & CEO, Studio 31 says,

    “Zoho One as a concept has fulfilled the dream of Studio 31 in becoming a fully-fledged, technology-enabled wedding photography and film company. It has given us the confidence and strength that we can sustain in this highly competitive, unorganized industry by not worrying about manual administration work anymore and fearing about human errors; it’s a very sensitive industry and every error will cost a client long memory. Now that we have a SaaS product running our business, we’ve got brilliant ideas and we’ve got the platform and time to make high-level strategic decisions to grow bigger, better and truly be one of the most-organized businesses in this sector.”

    Sonia Bhadoriya, Head of Business Development, Eurokids says,

    “We switched from Salesforce to Zoho One because of the fluidity of data across apps that allowed us to connect our departments. We use Zoho CRM, Sign and Creator extensively. The analytics tool enables us to visualize big data with graphs and charts. We love it!”

    Niki Kushe, Group Head – CRM, India Infoline Finance Ltd says,

    “IIFL is a leading financial services conglomerate serving over 4 million satisfied customers around the globe. Though an established organization, we are constantly on the look for ways to build our strength and to deliver excellent service to our ever-expanding customer base. At IIFL, we have started using Zoho One, which includes their super-powered CRM, Email, Campaign Management, Survey, Social Media Management, Sales IQ, Creator, Internal Chat and HR products across various entities. Honestly, there is no better value than Zoho One can offer, especially at this low a cost! Zoho One does change the way businesses operate by offering a whole suite of apps that are not only tightly integrated with each other but also play well with third-party applications. These vast varieties of solutions are easy to configure and customize which in itself paves for an efficient cross-selling platform.”

    Pricing:

    The story would remain unfinished if I didn't mention the pricing for Zoho One. The pricing, in fact, is very simple and pocket-friendly for an organization of any industry segment and size. And there are no hidden costs. It's Rs. 1500 per employee or Rs. 3000 per user. The ROI with Zoho One is manifold and quick, as all the new features mentioned above come free of cost with Zoho One.


    September 17, 2019  3:34 PM

    Key Features of The Bold New Operating-System Services @Zoho

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Business Process, provisioning, Single sign-on, Telephony

    This is the third post in the series of Zoho One Next-Gen features. You can access the first post here – Next-Gen Zoho One Can Transform Enterprise Journey @zoho and the second post here – Business Transformation Is The Mantra of Next-Gen Zoho One @Zoho. In this post, we shall be discussing some of the key features offered by the Bold New Operating-System of the Businesses that is Zoho One.

    Now let us look at some of the key features offered by the Bold New Operating-System Services as below:

  • 1. Communication: PhoneBridge – PhoneBridge is Zoho's newest entry in the bouquet of applications in Zoho One. It is, in fact, Zoho's telephony platform that enables telephony in Zoho apps. For instance, in Zoho CRM the PhoneBridge integration permits users to make calls from Zoho apps. Not only that, it also provides contextual information on incoming calls. What that means is, if PhoneBridge is enabled in Zoho CRM, it gives users context for all incoming calls not only from Zoho CRM but also from Zoho Recruit, Zoho Mail, and 20+ other apps.

  • 2. Single Sign-On: Single Sign-On (SSO), another new service from Zoho One, enables customers to integrate any third-party application with their account. As a matter of fact, Zoho Single Sign-On currently supports more than 50 third-party applications, and the count is increasing on a regular basis. This third-party application integration can be done in two ways: per individual user or per group.

  • 3. App Management and Provisioning: As of now, Zoho One allows provisioning for all of its 45-plus apps.

  • 4. Business Workflow Management: Orchestly is the new innovative and intuitive drag-and-drop interface that helps managers and process owners define processes effortlessly, without any technical knowledge of coding. There are ample practical examples of this, like purchase approvals, content publishing, asset management, onboarding, and so on.

  • 5. Zoho Sign: Zoho Sign builds an additional level of validation for customers with the help of blockchain-based timestamping through Ethereum. Ethereum, as we all know, is a globally accepted open-source platform. So when a document is signed using Zoho Sign, an Ethereum transaction actually happens in the background. The hash of the signed document is added to the transaction notes of that background Ethereum transaction. A conceptual sketch of this flow follows below.
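
    Here is that conceptual sketch: hash the signed document and place the digest in an Ethereum transaction's data field. It illustrates the general pattern only and assumes nothing about how Zoho Sign actually structures its transactions; the file name and address are placeholders.

```python
import hashlib

def document_digest(path):
    # Stream the file so even large signed PDFs hash in constant memory.
    sha = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            sha.update(chunk)
    return sha.hexdigest()

digest = document_digest("signed_contract.pdf")   # hypothetical file

# The digest would travel in the transaction's data/notes field; anyone can
# later rehash the document and compare it with the on-chain value.
timestamp_tx = {
    "to": "0x0000000000000000000000000000000000000000",   # placeholder address
    "value": 0,
    "data": "0x" + digest,
}
print(timestamp_tx)
```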

    Next post is the concluding post in this series. Continued »


    September 17, 2019  3:16 PM

    Business Transformation Is The Mantra of Next-Gen Zoho One @Zoho

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Business transformation

    This is the second post in the series, where we are talking about Next-Gen Zoho One's new features and how it empowers its customers in a multifaceted manner, focusing on multi-directional business growth. The first post can be accessed here. We were talking about the power of the Orchestly interface. This interface is so easy to manage that even non-tech business process owners of various departments can build their own processes without any help from the technology department. Within 2 years of its launch, Zoho One has achieved substantial growth and now caters to more than 20,000 customers across the globe. The new next-gen features added to the operating system include a new application that connects every corner of business operations, a number of updates to the existing modules, and overall unmatched performance across the complete business suite.

    If we look at the statistics of Zoho One usage, around 25% of its customers use more than 25 applications, and more than 50% use 16 or more applications from the bouquet of applications on the Zoho One platform. This itself proves how badly businesses are craving an all-in-one business solution to reduce the complexity of managing a large number of vendors and the conflicts that consistently arise from integrating multi-vendor, multi-platform business applications. The speed at which Zoho One is being adopted across different business verticals is proof of a major shift in customer expectations: a rejection of complexity in favor of easy-to-use all-in-one platforms that deliver immense value.

    Business Transformation with Next-Gen Zoho One

    Rajendran Dandapani, Director of Technology, Zoho says,

    “Technology is supposed to help businesses. Instead, it has evolved into complex beast customers have to tame—from juggling apps from multiple vendors to trying to solve the multi-app integration puzzle to dealing with vendors forcing customers into expensive, lengthy contracts. The technology industry has gone too far down this path and this has to change. With Zoho One, we want to change all of that. It’s a technology platform to run your entire business with a trustworthy vendor that is easy to do business with. With Zoho One, you are not just licensing the technology. You are licensing peace of mind.”

    Don’t miss the next post in this series.


    September 17, 2019  2:29 PM

    Next-Gen Zoho One Can Transform Enterprise Journey @zoho

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Blockchain, Business applications, Business Process Automation, Single sign-on, Telephony

    Zoho truly lives up to what it calls itself: 'the operating system for businesses'. Zoho One is, in fact, a complete business solution with a sharp focus on all business levels together. With the launch of Next-Gen Zoho One, the whole ecosystem has become much stronger, with the power to deliver more to the business across all user and management levels. The operating system for businesses has been empowered with a process automation app, telephony, single sign-on, and blockchain capabilities. These next-gen features and tools enable it to achieve greater heights in customer adoption and business acceptability.

    Other business applications in the market focus on licensing money and make their systems so complex that their customers are forced to rely on them, or rather stay at their helm, despite hefty annual maintenance and licensing costs. Every new level of Zoho One, on the other hand, is getting more powerful, more flexible, and more business-friendly, adding powerful new features at no additional cost to its customers. Every new feature added becomes an integral part of Zoho One, improving its quality and making it more powerful and stable from the business point of view. This is proven by the fact that the popularity of Zoho One is increasing exponentially, and so is its customer base, in India as well as globally.

    The next generation of Zoho One is designed and built to take care of entire business operations, including sales and marketing, finance and HR, operations and business intelligence, and so on. All of this runs on a unified technology platform. The new business workflow management application, named Orchestly, empowers Zoho customers to effortlessly create, manage, and optimize their business processes with the help of an intuitive drag-and-drop interface.

    We shall continue in the next post, covering more features of Next-Gen Zoho One. Continued »


    September 16, 2019  9:26 AM

    INTELLIGENT SEARCH CAPABILITIES @cloudtenna and @nasuni

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Cloud search, Cloud storage, Nasuni, Object storage

    Technology has matured to a level of collaboration and innovation. High-end tech companies have understood that very well. A recent example in this regard is the partnership between Cloudtenna and Nasuni. As a result of this partnership, enterprises will be able to get intelligent search capabilities. In fact, this is a boon for existing Nasuni clients. Now they can instantly find their files across the multiple cloud environments they use. And all this happens with a guarantee of compliance with enterprise file access rules. Cloudtenna, one of the front runners in enterprise search technology, announced a strategic partnership with Nasuni today. This partnership results in an unmatched integration of Nasuni Cloud File Services with Cloudtenna's DirectSearch. This, in turn, empowers enterprises to augment their enterprise-wide collaboration efforts with intelligent search capabilities.

    DirectSearch is an outstanding recommendation engine, powered by machine learning and capable of finding results at lightning speed, with sub-second search queries that work on massive distributed datasets containing millions of items. For enterprises in today's data-intensive environment, file search has become a singular priority for increasing employee productivity, especially where the workforce is distributed. Enterprises have understood well that, to increase the productivity of their employees, the level of collaboration between employees based in various offices around the globe is very important. As I mentioned in one of my previous posts, recent research from IDC says an organization with 1,000 knowledge workers on board wastes around $50,000 per week. That comes to roughly $2.5 million per annum.

    All this happens due to employees' failure to locate and retrieve electronic files in time. In a few cases, it can lead to higher losses in terms of financial transactions as well as business reputation.

    Will Hornkohl, vice president of alliances at Nasuni says,

    “For enterprise users, it can be a challenge to find the exact file they are looking for across multiple clouds, not to mention on-premises servers. Bringing Cloudtenna into our growing partner community will ensure that users at the organizations we serve can always find exactly what they are looking for quickly and easily within the entirety of their global file share – all while using a single login for all file sources and while conducting intelligent searches that reflect personalized contextual insights that are modeled on each individual user’s file activity, history, teams and relationships. And of course, Nasuni and Cloudtenna both are built with safeguards that ensure complete compliance with all file access rules and protocols.”

    There are around 500 enterprises that rely on Nasuni. With this significant collaboration, Nasuni has empowered enterprises to reap the highest level of benefits of cloud object storage which includes the unlimited capacity and inherent resiliency. As a matter of fact, this also changes the definition of ‘economy of the cloud’ for them. This will definitely enhance their control on performance from network-attached storage (NAS).

    Aaron Ganek, CEO at Cloudtenna says,

    “Nasuni empowers organizations of all kinds not only to use cloud object storage in all of its flavors for primary storage of their files but also makes an unprecedented degree of global collaboration possible. Search capabilities are even more important when you’re looking at organizations like those that rely on Nasuni, which in many cases not only have massive datasets and equally goliath global file shares but also have employees who need to access a specific file they worked on with a colleague who’s literally on the other side of the world.”

    Ganek adds,

    “The Cloudtenna DirectSearch platform is uniquely designed to tackle distributed datasets, making it the ideal solution for Nasuni’s hybrid cloud file services platform. File search infrastructure faces a unique set of requirements that goes beyond the footprint of traditional search infrastructure used for log-search and site-search. It has to be smart enough to reflect accurate file permissions. It has to be smart enough to derive context to boost search results and has to do all this in a fraction of second.”

    To conclude, Nasuni is now a Cloudtenna tier-one supported data source. In fact, Cloudtenna is also certified as a third-party integration that is available to customers.
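
    To make the permission-aware, context-boosted search idea concrete, here is a small conceptual sketch – my illustration, not Cloudtenna's DirectSearch internals. Results are first filtered by the user's file permissions and then ranked using the user's activity context (here, recent collaborators); all names and data are hypothetical.

```python
FILES = [
    {"name": "q3-forecast.xlsx", "acl": {"alice", "bob"}, "owner": "bob"},
    {"name": "q3-forecast-draft.xlsx", "acl": {"bob"}, "owner": "bob"},
    {"name": "q3-notes.txt", "acl": {"alice"}, "owner": "alice"},
]

RECENT_COLLABORATORS = {"alice": {"bob"}}   # derived from activity history

def search(user, term):
    # 1) Enforce file access rules: never surface what the user can't open.
    visible = [f for f in FILES if user in f["acl"] and term in f["name"]]
    # 2) Boost results owned by people the user actually works with.
    def rank(f):
        boost = 1 if f["owner"] in RECENT_COLLABORATORS.get(user, set()) else 0
        return -boost   # higher boost sorts first
    return sorted(visible, key=rank)

for hit in search("alice", "q3"):
    print(hit["name"])
# q3-forecast.xlsx ranks first (Bob is a recent collaborator); the draft
# never appears because Alice lacks permission to it.
```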


    August 30, 2019  2:30 PM

    Predictive Analytics and Data Mining @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Amazon, Data Mining, Predictive Analytics

    Predictive Analytics and Data Mining: Concepts and Practice with RapidMiner by Vijay Kotu and Bala Deshpande

    Book Excerpt as on Amazon.com:

    Put Predictive Analytics into Action. Learn the basics of Predictive Analysis and Data Mining through an easy-to-understand conceptual framework, and immediately practice the concepts learned using the open-source RapidMiner tool. Whether you are brand new to Data Mining or working on your tenth project, this book will show you how to analyze data, uncover hidden patterns and relationships to aid important decisions and predictions. Data Mining has become an essential tool for any enterprise that collects, stores and processes data as part of its operations. This book is ideal for business users, data analysts, business analysts, business intelligence and data warehousing professionals and for anyone who wants to learn Data Mining. You’ll be able to: 1. Gain the necessary knowledge of different data mining techniques, so that you can select the right technique for a given data problem and create a general-purpose analytics process. 2. Get up and running fast with more than two dozen commonly used powerful algorithms for predictive analytics using practical use cases. 3. Implement a simple step-by-step process for predicting an outcome or discovering hidden relationships from the data using RapidMiner, an open-source GUI-based data mining tool.

    Predictive analytics and Data Mining techniques covered: Exploratory Data Analysis, Visualization, Decision trees, Rule induction, k-Nearest Neighbors, Naïve Bayesian, Artificial Neural Networks, Support Vector machines, Ensemble models, Bagging, Boosting, Random Forests, Linear regression, Logistic regression, Association analysis using Apriori and FP Growth, K-Means clustering, Density-based clustering, Self Organizing Maps, Text Mining, Time series forecasting, Anomaly detection and Feature selection. Implementation files can be downloaded from the book companion site at www.LearnPredictiveAnalytics.com

    Demystifies data mining concepts with easy to understand language
    Shows how to get up and running fast with 20 commonly used powerful techniques for predictive analysis
    Explains the process of using open source RapidMiner tools
    Discusses a simple 5 step process for implementing algorithms that can be used for performing predictive analytics
    Includes practical use cases and examples

    “If learning-by-doing is your mantra — as well it should be for predictive analytics — this book will jumpstart your practice. Covering a broad, foundational collection of techniques, authors Kotu and Deshpande deliver crystal-clear explanations of the analytical methods that empower organizations to learn from data. After each concept, screenshots make the ‘how to’ immediately concrete, revealing the steps needed to set things up and go; you’re guided through real hands-on execution.”

    –Eric Siegel, Ph.D., founder of Predictive Analytics World and author of Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die


    August 30, 2019  2:19 PM

    A Guide to Delivering Business Results with Big Data Fast @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Big Data

    Actionable Intelligence: A Guide to Delivering Business Results with Big Data Fast!
    by Keith B. Carter and Donald Farmer

    Excerpt as on Amazon.com

    Building an analysis ecosystem for a smarter approach to intelligence
    Keith Carter’s Actionable Intelligence: A Guide to Delivering Business Results with Big Data Fast! is the comprehensive guide to achieving the dream that business intelligence practitioners have been chasing since the concept itself came into being. Written by an IT visionary with extensive global supply chain experience and insight, this book describes what happens when team members have accurate, reliable, usable, and timely information at their fingertips. With a focus on leveraging big data, the book provides expert guidance on developing an analytical ecosystem to effectively manage and use internal and external information to deliver business results.

    This book is written by an author who’s been in the trenches for people who are in the trenches. It’s for practitioners in the real world, who know delivering results is easier said than done – fraught with failure, and difficult politics. A landscape where reason and passion are needed to make a real difference.

    This book lays out the appropriate way to establish a culture of fact-based decision making, innovation, forward-looking measurements, and appropriate high-speed governance. Readers will enable their organization to:

    Answer strategic questions faster
    Reduce data acquisition time and increase analysis time to improve outcomes
    Shift the focus to positive results rather than past failures
    Expand opportunities by more effectively and thoughtfully leveraging information
    Big data makes big promises, but it cannot deliver without the right recipe of people, processes and technology in place. It’s about choosing the right people, giving them the right tools, and taking a thoughtful—rather than formulaic–approach. Actionable Intelligence provides expert guidance toward envisioning, budgeting, implementing, and delivering real benefits.

    “Actionable Intelligence has the critical insights business leaders need to leverage Big Data to win in the emerging digital marketplace! Well done, Keith!”

    —Ed Hunter, Vice President, Product Supply-Asia, Procter & Gamble Europe SA – Singapore


    August 30, 2019  2:15 PM

    Executing Data Quality Projects : Book on @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data quality

    Executing Data Quality Projects: Ten Steps to Quality Data and Trusted Information (TM) 1st Edition by Danette McGilvray

    Excerpt as on Amazon.com

    Executing Data Quality Projects presents a systematic, proven approach to improving and creating data and information quality within the enterprise.

    Recent studies show that data quality problems are costing businesses billions of dollars each year, with poor data linked to waste and inefficiency, damaged credibility among customers and suppliers, and an organizational inability to make sound decisions.

    This book describes a Ten-Step approach that combines a conceptual framework for understanding information quality with the tools, techniques, and instructions for improving and creating information quality. It includes numerous templates, detailed examples, and practical advice for executing every step of the approach. It allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, and best practices.

    The author’s trademarked approach, in which she has trained Fortune 500 clients and hundreds of workshop attendees, applies to all types of data and all types of organizations.

    Includes numerous templates, detailed examples, and practical advice for executing every step of The Ten Steps approach.
    Allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, and best practices.
    A companion Web site includes links to numerous data quality resources, including many of the planning and information-gathering templates featured in the text, quick summaries of key ideas from The Ten Step methodology, and other tools and information that is available online.

    This book is a gem. Tested, validated and polished over a distinguished career as a practitioner and consultant, Danette’s Ten Steps methodology shines as a unique and much-needed contribution to the information quality discipline. This practical and insightful book will quickly become the reference of choice for all those leading or participating in information quality improvement projects. Experienced project managers will use it to update and deepen their knowledge, new ones will use it as a roadmap to quickly become effective. Managers in organizations that have embraced generic improvement methodologies such as six sigma, lean or have developed internal ones would be wise to hand this book to their Black Belts and other improvement leaders.

    – C. Lwanga Yonke, Information Quality Practitioner.


    August 30, 2019  12:26 PM

    An Excellent Application and Infrastructure Monitoring Tool @Site24x7

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Application monitoring, application performance monitoring, IT Infrastructure Monitoring, Server monitoring

I feel Site24x7 will soon become the top-layer application and infrastructure monitoring tool for all CIOs and CTOs, irrespective of the support mechanisms they have in the layers below. Site24x7 has a lot of features that are unmatched by existing global-standard monitoring tools. Firstly, none of them has all the features it offers. Secondly, it beats all the others in one field or another. That is why I call it the most suitable and promising top-layer application and infrastructure monitoring tool. Let us learn a little more about Site24x7. It is very lightweight in comparison to its peers in the market, and its agent is non-intrusive. That means the Site24x7 monitoring agent runs almost silently, with minimal dependencies, and consumes very little memory on a system.

Data in the Site24x7 application and infrastructure monitoring tool comes through log files, APIs, and simple commands. The overall proposition is quite cost-effective: with it, you can monitor and manage your infrastructure and servers on a wide range of parameters, from basic availability to server cluster troubleshooting. The pricing has nothing to do with the physical configuration of your server. It is not based on the number of cores, which is how much of the market inflates its costing model, nor does it consider the RAM size or the number of hours your server needs to run. It empowers you to monitor and control your complete server infrastructure holistically without worrying about the financial implications. That makes it a first choice for enterprises of any size and volume.
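
To give a feel for how simple command-level data collection can be, here is a minimal sketch of my own; it assumes nothing about Site24x7's internals and simply uses the open-source psutil library to sample the kind of basic host metrics any lightweight agent ships upstream.

```python
# Illustrative only: a toy collector showing the kind of low-footprint
# metrics a monitoring agent gathers. This is NOT Site24x7's agent code.
import json
import time

import psutil


def collect_metrics() -> dict:
    """Sample basic availability/health metrics for one host."""
    return {
        "timestamp": int(time.time()),
        "cpu_percent": psutil.cpu_percent(interval=1),  # 1-second CPU sample
        "memory_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
    }


if __name__ == "__main__":
    # A real agent would ship this to a SaaS endpoint; here we just print it.
    print(json.dumps(collect_metrics(), indent=2))
```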

Tell me, how many other tools in the market are this powerful, this flexible, and this economical? Site24x7 is a modern application and infrastructure monitoring tool delivered as a SaaS service. It already serves thousands of businesses and millions of users across the globe. With its new launches of Site24x7 Signals, AI-based intelligent dashboards, and security features, this state-of-the-art application and infrastructure monitoring tool has set a new landmark in the field.


    August 29, 2019  9:42 AM

    How GOFRUGAL Is The Best ERP for Trade and Supply Chain @gofrugaltech

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    ERP, Supply chain, trade

In the previous post, I covered the multifaceted achievements of Kumar Vembu. In this post, we shall learn more about GOFRUGAL ERP. GOFRUGAL ERP is a complete, comprehensive, and collaborative core business application that takes care of the entire trade and supply chain ecosystem. It delivers 100% transaction automation, collaborating with suppliers, customers, partners, and other stakeholders in the ecosystem. GOFRUGAL ERP has evolved through an in-depth understanding of the minutest customer needs and pain points. It is available in different flavors: on-premise, in the cloud, or on mobile. Before buying it, a customer can take a free trial to play around with the solution and experience its unbeatable simplicity. Its key features and benefits include handling customers' omnichannel businesses, deep customizability, an ML-driven supply chain, and autopilot operations with AI-driven decisions. In addition, it integrates easily into complex existing IT environments wherever backward or forward integration is required.

GOFRUGAL customers include businesses of various sizes: small or independent stores, large stores, regional or local chains, and national chains. During the last 18 years of its evolution, it has catered to more than 40 business formats, including supermarkets, pharmacies, salons, apparel stores, restaurants, bakeries, FMCG, and pharma. Let's see what is latest in GOFRUGAL. Recently GOFRUGAL launched GoSecure, a real-time backup-as-a-service for traders. More than 90% of retail and distribution businesses don't have a data backup solution in place, which means they run a high risk of losing their data at any time. Most of these businesses are not too serious about the safety of their data. GoSecure's real-time backup as a service automatically backs up every transaction that takes place on the system to a cloud-based service in a secured manner.
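
GOFRUGAL has not published GoSecure's internals, but the per-transaction backup idea is easy to sketch. In the hypothetical snippet below, the endpoint URL and payload shape are my assumptions; the point is only that each committed transaction is pushed to cloud storage immediately rather than in a nightly batch.

```python
# Hypothetical sketch of per-transaction cloud backup, GoSecure-style.
# BACKUP_URL and the payload shape are illustrative assumptions, not
# GOFRUGAL's published API.
import json
import urllib.request

BACKUP_URL = "https://backup.example.com/api/transactions"  # hypothetical


def backup_transaction(txn: dict) -> None:
    """Push one committed POS/ERP transaction to the cloud immediately."""
    req = urllib.request.Request(
        BACKUP_URL,
        data=json.dumps(txn).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # urlopen raises on HTTP errors; a real agent would retry or queue locally.
    with urllib.request.urlopen(req, timeout=5):
        pass


# Wired into the transaction-commit hook, e.g.:
# backup_transaction({"id": "T1001", "amount": 499.0, "store": "S42"})
```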

That means in case of any kind of contingency, the user can restore the data without any technical support, requiring just an OTP.


    August 29, 2019  9:30 AM

    Kumar Vembu and His Entrepreneurial Journey @gofrugaltech

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Cloud ERP, ERP, retail

Kumar Vembu is a veteran entrepreneur with deep expertise in technology. Kumar is the CEO and founder of GOFRUGAL. He wears multiple hats: entrepreneur, investor, farmer, yoga expert, and a lot more. He started his career with Qualcomm in San Diego. In 1995 he moved back to India to start his entrepreneurial journey. As a matter of fact, before starting his own venture, he worked with large telecom and networking organizations like IITM, HCL, HP, and TeNet. While working, he was always striving to launch a world-class software product company in India, and that is when he co-founded Zoho Corp with his brother Sridhar Vembu in 1995, and later another company, Delium, in 2016. In between these two, he launched GOFRUGAL in 2004 with a mission to make ERP easy and usable by businesses of any size.

As of today, GOFRUGAL's multiple solutions are running successfully in more than 30,000 brands across more than 40 business formats, with the sole purpose of digitally transforming business operations to deliver a better customer experience and higher sales, thereby acquiring more customers. Let us learn more about GOFRUGAL. GOFRUGAL is a digital-first company. It was founded in 2004 with the singular goal of empowering retail, restaurant, and distribution businesses with appropriate ERP tools that help them achieve greater profitability and efficiency. The key focus of all GOFRUGAL applications stays on customer experience and mobility, because these two are the new drivers of growth for any business of any capacity. That is how GOFRUGAL empowers businesses to go digital, stay ahead of their competition, and sustain enough agility to grow and succeed continuously.

We shall continue with GOFRUGAL in the concluding post. There are many more interesting facts about this wonderful venture initiated by Kumar Vembu.


    August 19, 2019  10:58 PM

    Care For Data Accuracy? Go For Golden-Record-As-A-Service @Naveego

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data, Master data management, MDM

With the launch of its next-generation data accuracy platform, Naveego has marked another landmark in the field. This NextGen data accuracy platform, with self-service MDM and advanced security features, delivers a golden record across all enterprise data systems. In fact, it marks the launch of a powerful Golden-Record-as-a-Service offering that eliminates the need for costly IT resources, resulting in 5x faster deployment and a humongous 80% cost saving over legacy solutions. In a short span of time, Naveego has emerged as a leader in cloud-first distributed data accuracy solutions, and with this release it has set itself well above its counterparts in the market. The platform comes bundled with self-service Master Data Management (MDM) and Golden-Record-as-a-Service (GRaaS).

It has become a boon for non-technical business users, letting them manage technology in a much easier and more accurate manner. This powerful offering empowers non-technical executives in an organization to acquire the data they may require for advanced analytics without any intervention from the IT department or any professional services. GRaaS has many more powerful benefits that help a business rise to new heights. For instance, it ensures that there exists a single version of data for all business verticals in an organization, so they can all bank on the same data for their respective analytics and reporting. Interestingly, it results in an 80% reduction in cost, and implementation goes five times faster than with legacy solutions.

    Data Accuracy – How Critical?

This next-generation platform includes an advanced patent-pending security mechanism that merges records and checks consistency without decrypting the data, and without the platform ever requiring access to the encryption key. The best thing is that it does not require any customization or infrastructure change, which results in a low total cost of ownership (TCO). Moreover, as it is a complete solution, it eliminates the requirement for highly skilled individuals to implement and maintain the system, which is another major benefit to an enterprise. As we all know, data in any organization is expanding exponentially. This is because of the adoption of the latest technologies like artificial intelligence (AI), machine learning (ML), and the internet of things (IoT), along with heterogeneous devices including mobile devices, autonomous vehicles, and a large number of other sources in the ecosystem that sit outside traditional data centers.
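
Naveego has not disclosed how this patent-pending mechanism works, but the general idea of matching records without decryption can be illustrated. Everything in the sketch below is my assumption: each source publishes a keyed hash (HMAC) of a sensitive field, so two systems can check whether values agree without ever exchanging or decrypting the plaintext.

```python
# Illustrative sketch only: matching records without decrypting them.
# This is NOT Naveego's patent-pending mechanism; it assumes each source
# shares a keyed hash (HMAC) of sensitive fields instead of plaintext.
import hashlib
import hmac

SHARED_MATCH_KEY = b"rotate-me"  # hypothetical key used only for matching


def match_token(value: str) -> str:
    """Deterministic token for a field value; reveals equality, not content."""
    normalized = value.strip().lower().encode("utf-8")
    return hmac.new(SHARED_MATCH_KEY, normalized, hashlib.sha256).hexdigest()


# Two systems can now check consistency of an email field without
# exchanging or decrypting the actual address:
token_a = match_token("jane.doe@example.com")   # from the CRM
token_b = match_token("Jane.Doe@example.com")   # from the billing system
print(token_a == token_b)  # True: same customer, no plaintext shared
```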

All of this has created an essential requirement for data cleansing, which is becoming quite cumbersome for enterprises as well as highly expensive. According to Gartner, maintaining bad data costs organizations $15 million per year on average. In addition, there are other heavy costs that have become a big headache for enterprises, including the high price of legacy systems and the customization of existing systems. Bad data imposes a burden of $3.1 trillion every year on the US economy. For an individual organization this might look irrelevant, but that is not the case: a company with, for instance, 50,000 incorrect records will incur a cost of about $5 million per year to maintain them, at approximately $100 per incorrect record (https://hbr.org/2017/09/only-3-of-companies-data-meets-basic-quality-standards).
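
The per-record arithmetic behind that estimate is straightforward:

```python
# Worked example of the cost estimate cited above.
incorrect_records = 50_000
cost_per_record = 100          # USD per incorrect record per year
annual_cost = incorrect_records * cost_per_record
print(f"${annual_cost:,} per year")   # $5,000,000 per year
```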

In the absence of a proper mechanism, this cost of maintaining incorrect data will keep rising every year, given the speed at which data is increasing. Another burning issue for organizations is the scrubbing and prepping of data, for which they have to hire or outsource high-wage data scientists. Reports indicate that 80% of these data scientists' time goes into collecting and cleansing inaccurate digital data; without this cleansing, an organization cannot use the data for analysis. Naveego terms this 'data janitor work', which does not match the skills data scientists are hired for, yet it eats up most of their time when they should be focusing on the highly skilled job of data analysis.

This creates a vicious circle that organizations cannot escape, and they will succumb to it sooner or later unless they adopt a powerful system like Naveego's next-generation Data Accuracy platform, with self-service MDM and advanced security features, to ensure a golden record across all enterprise data systems. Now let us understand how Naveego explains the emergence and importance of the golden record. The complete data accuracy platform that Naveego provides supports hybrid and multi-cloud environments, providing a distributed data accuracy solution. It proactively manages, identifies, and eliminates customer data accuracy problems across all enterprise data sources, resulting in a single golden record and ensuring data consistency across the enterprise. In turn, it prevents data lakes from turning into data swamps.
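
To make the golden-record idea concrete, here is a minimal sketch of my own, not Naveego's algorithm: records for the same customer arrive from several systems, and a simple survivorship rule (source trust first, recency second) picks the winning value for each field.

```python
# Minimal golden-record sketch (not Naveego's algorithm): merge per-field
# values from multiple sources using source trust, then recency, as the
# survivorship rule. Source names and trust ranks are hypothetical.
from typing import Any

SOURCE_TRUST = {"crm": 3, "billing": 2, "web_form": 1}


def golden_record(records: list[dict[str, Any]]) -> dict[str, Any]:
    golden: dict[str, Any] = {}
    fields = {f for r in records for f in r if f not in ("source", "updated")}
    for field in fields:
        # Keep only records that actually carry a value for this field.
        candidates = [r for r in records if r.get(field) not in (None, "")]
        best = max(candidates, key=lambda r: (SOURCE_TRUST[r["source"]], r["updated"]))
        golden[field] = best[field]
    return golden


records = [
    {"source": "web_form", "updated": "2019-01-10", "email": "j@old.com", "phone": ""},
    {"source": "crm",      "updated": "2019-06-01", "email": "jane@corp.com", "phone": None},
    {"source": "billing",  "updated": "2019-08-01", "email": "", "phone": "+1-555-0100"},
]
print(golden_record(records))
# -> email 'jane@corp.com' (most trusted source), phone '+1-555-0100'
```

Real MDM survivorship rules are richer, with per-field trust, validation, and lineage, but the overall shape is the same.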

The solution talks to Kubernetes, Apache Kafka, and Apache Spark technologies, ensuring rapid deployment, distributed processing, and flawless integration with data. This data may reside anywhere, in the cloud or on-premise/off-premise; the platform supports all kinds of hybrid and multi-cloud environments. Naveego ensures data accuracy at any volume, with real-time streaming from multiple data sources in any environment, irrespective of schema or structure. The key features of Naveego's Next Generation Data Accuracy Platform include Self Service, Golden-Record-as-a-Service, Golden Record C, Automated Profiling of Data Sources at the edge (machine learning), Automated Profiling of any Data Source including IoT, Automatic Data Quality Checks driven by Machine Learning, and so on.
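
As a rough illustration of what an automated data quality check over a record stream might look like, here is a sketch of my own, not Naveego's implementation; any streaming source (a Kafka consumer, for instance) could stand in for the in-memory stream used here.

```python
# Sketch of an automated data-quality gate on a record stream. Not
# Naveego's code: the rules and routing are illustrative assumptions.
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")


def check(record: dict) -> list[str]:
    """Return a list of rule violations for one record."""
    problems = []
    if not record.get("customer_id"):
        problems.append("missing customer_id")
    if record.get("email") and not EMAIL_RE.match(record["email"]):
        problems.append("malformed email")
    return problems


# Stand-in for a real streaming source such as a Kafka consumer.
stream = iter([
    {"customer_id": "C1", "email": "ok@example.com"},
    {"customer_id": "",   "email": "broken-at-example"},
])

for record in stream:
    problems = check(record)
    if problems:
        print("quarantine:", record, problems)   # route to a review queue
    else:
        print("accept:", record)                 # forward to the clean store
```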

    Michael Ger, General Manager, Automotive and Manufacturing Solutions, Cloudera says, “Companies across all industries are reimagining themselves within a digitally transformed future. Central to that future is leveraging a data tsunami resulting from newly connected consumers, products and processes. Within this context, data quality has taken on critical new importance. The Naveego data accuracy platform is critical for enabling traditional approaches to business intelligence as well as modern-day big data analytics. The reason for this is clear – actionable insights start with clean data, and that’s exactly what the Naveego platform delivers.”

    Katie Horvath, CEO, Naveego says, “The ability to achieve golden record data has typically been available only by hiring a systems integrator or other specialist, at a high cost and TCO to the enterprise. The next generation of our Data Accuracy Platform is truly a game-changer, empowering business users to access trusted data across all data types for analytics purposes, entirely on their own with an easy to use, flow-oriented user interface – and at a significantly lower cost. This is sure to disrupt pricey legacy solutions that require vast amounts of professional resources and on average five times longer to deploy.”

